2025-08-14T21:20:37.3871873Z Current runner version: '2.328.0'
2025-08-14T21:20:37.3877816Z Runner name: 'i-0115c72a6ef255e70'
2025-08-14T21:20:37.3878843Z Runner group name: 'default'
2025-08-14T21:20:37.3879634Z Machine name: 'ip-10-0-39-154'
2025-08-14T21:20:37.3882113Z ##[group]GITHUB_TOKEN Permissions
2025-08-14T21:20:37.3884371Z Contents: read
2025-08-14T21:20:37.3885039Z Metadata: read
2025-08-14T21:20:37.3885516Z ##[endgroup]
2025-08-14T21:20:37.3887830Z Secret source: Actions
2025-08-14T21:20:37.3888723Z Prepare workflow directory
2025-08-14T21:20:37.4325087Z Prepare all required actions
2025-08-14T21:20:37.4360050Z Getting action download info
2025-08-14T21:20:37.6859206Z Download action repository 'pytorch/test-infra@main' (SHA:83f58f391e939c10dcb8cb6d745e4cefa3b98a84)
2025-08-14T21:20:39.1632675Z Download action repository 'pytorch/pytorch@main' (SHA:3be70dc30e893b552fc0f23ca06cd8f7949b6d08)
2025-08-14T21:20:54.1983356Z Download action repository 'actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065' (SHA:a26af69be951a213d495a4c3e4e4022e16d87065)
2025-08-14T21:20:54.5615092Z Download action repository 'aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722' (SHA:ececac1a45f3b08a01d2dd070d28d111c5fe6722)
2025-08-14T21:20:54.8051601Z Download action repository 'aws-actions/amazon-ecr-login@062b18b96a7aff071d4dc91bc00c4c1a7945b076' (SHA:062b18b96a7aff071d4dc91bc00c4c1a7945b076)
2025-08-14T21:20:54.9813281Z Download action repository 'seemethere/upload-artifact-s3@baba72d0712b404f646cebe0730933554ebce96a' (SHA:baba72d0712b404f646cebe0730933554ebce96a)
2025-08-14T21:20:55.2749269Z Getting action download info
2025-08-14T21:20:55.3996849Z Download action repository 'actions/checkout@v4' (SHA:08eba0b27e820071cde6df949e0beb9ba4906955)
2025-08-14T21:20:55.6934021Z Getting action download info
2025-08-14T21:20:55.7956691Z Download action repository 'nick-fields/retry@v3.0.0' (SHA:7152eba30c6575329ac0576536151aca5a72780e)
2025-08-14T21:20:55.9792012Z Getting action download info
2025-08-14T21:20:56.0813166Z Download action repository 'nick-fields/retry@3e91a01664abd3c5cd539100d10d33b9c5b68482' (SHA:3e91a01664abd3c5cd539100d10d33b9c5b68482)
2025-08-14T21:20:56.2769121Z Getting action download info
2025-08-14T21:20:56.4117843Z Uses: pytorch/pytorch/.github/workflows/_linux-test.yml@refs/heads/main (1fc683cf17c8c673044538d10266c00f92987be2)
2025-08-14T21:20:56.4121200Z ##[group] Inputs
2025-08-14T21:20:56.4121521Z build-environment: linux-jammy-py3.9-gcc11-build
2025-08-14T21:20:56.4126328Z test-matrix: {"include": [{"config": "cpu_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}]}
2025-08-14T21:20:56.4131733Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe
2025-08-14T21:20:56.4132435Z sync-tag:
2025-08-14T21:20:56.4133259Z timeout-minutes: 240
2025-08-14T21:20:56.4133485Z use-gha:
2025-08-14T21:20:56.4133682Z dashboard-tag:
2025-08-14T21:20:56.4133906Z s3-bucket: gha-artifacts
2025-08-14T21:20:56.4134145Z aws-role-to-assume:
2025-08-14T21:20:56.4134610Z disable-monitor: false
2025-08-14T21:20:56.4134879Z monitor-log-interval: 5
2025-08-14T21:20:56.4135160Z monitor-data-collect-interval: 1
2025-08-14T21:20:56.4135438Z ##[endgroup]
2025-08-14T21:20:56.4135901Z Complete job name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)
2025-08-14T21:20:56.4674780Z A job started hook has been configured by the self-hosted runner administrator
2025-08-14T21:20:56.4760369Z ##[group]Run '/home/ec2-user/runner-scripts/before_job.sh'
2025-08-14T21:20:56.4768835Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-08-14T21:20:56.4769414Z ##[endgroup]
2025-08-14T21:20:57.5286405Z Runner Type: linux.8xlarge.amx
2025-08-14T21:20:57.5287008Z Instance Type: m7i-flex.8xlarge
2025-08-14T21:20:57.5287505Z AMI Name: unknown
2025-08-14T21:20:57.5322705Z AMI ID: ami-05ffe3c48a9991133
2025-08-14T21:21:02.2803546Z ##[group]Run pytorch/test-infra/.github/actions/setup-ssh@main
2025-08-14T21:21:02.2803960Z with:
2025-08-14T21:21:02.2804759Z github-secret: ***
2025-08-14T21:21:02.2805326Z instructions: All testing is done inside the container, to start an interactive session run: docker exec -it $(docker container ps --format '{{.ID}}') bash
2025-08-14T21:21:02.2805935Z activate-with-label: false
2025-08-14T21:21:02.2806216Z label: with-ssh
2025-08-14T21:21:02.2806487Z remove-existing-keys: true
2025-08-14T21:21:02.2806781Z fail-silently: true
2025-08-14T21:21:02.2807041Z env:
2025-08-14T21:21:02.2807294Z GIT_DEFAULT_BRANCH: main
2025-08-14T21:21:02.2807602Z ##[endgroup]
2025-08-14T21:21:02.4106493Z Please see https://github.com/pytorch/pytorch/wiki/Debugging-using-with-ssh-for-Github-Actions for more info.
2025-08-14T21:21:02.4107609Z Not on pull request and ciflow reference could not be extracted, skipping adding ssh keys
2025-08-14T21:21:02.4251322Z ##[group]Run pytorch/pytorch/.github/actions/checkout-pytorch@main
2025-08-14T21:21:02.4251773Z with:
2025-08-14T21:21:02.4252060Z no-sudo: true
2025-08-14T21:21:02.4252474Z submodules: recursive
2025-08-14T21:21:02.4252774Z fetch-depth: 0
2025-08-14T21:21:02.4253053Z env:
2025-08-14T21:21:02.4253369Z GIT_DEFAULT_BRANCH: main
2025-08-14T21:21:02.4253622Z ##[endgroup]
2025-08-14T21:21:02.4418927Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-08-14T21:21:02.4419598Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-08-14T21:21:02.4428030Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-08-14T21:21:02.4428294Z env:
2025-08-14T21:21:02.4428647Z GIT_DEFAULT_BRANCH: main
2025-08-14T21:21:02.4428847Z ##[endgroup]
2025-08-14T21:21:02.4510001Z ##[group]Run # Use all available CPUs for fetching
2025-08-14T21:21:02.4510374Z # Use all available CPUs for fetching
2025-08-14T21:21:02.4510644Z cd "${GITHUB_WORKSPACE}"
2025-08-14T21:21:02.4510937Z git config --global fetch.parallel 0
2025-08-14T21:21:02.4511213Z git config --global submodule.fetchJobs 0
2025-08-14T21:21:02.4511453Z 
2025-08-14T21:21:02.4511709Z # Clean workspace. The default checkout action should also do this, but
2025-08-14T21:21:02.4512035Z # do it here as well just in case
2025-08-14T21:21:02.4512270Z if [[ -d .git ]]; then
2025-08-14T21:21:02.4512479Z   if [ -z "${NO_SUDO}" ]; then
2025-08-14T21:21:02.4512705Z     sudo git clean -ffdx
2025-08-14T21:21:02.4512911Z   else
2025-08-14T21:21:02.4513088Z     git clean -ffdx
2025-08-14T21:21:02.4513282Z   fi
2025-08-14T21:21:02.4513460Z fi
2025-08-14T21:21:02.4518423Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-08-14T21:21:02.4518694Z env:
2025-08-14T21:21:02.4518862Z GIT_DEFAULT_BRANCH: main
2025-08-14T21:21:02.4519045Z NO_SUDO: true
2025-08-14T21:21:02.4519206Z ##[endgroup]
2025-08-14T21:21:02.4628775Z ##[group]Run actions/checkout@v4
2025-08-14T21:21:02.4629063Z with:
2025-08-14T21:21:02.4629286Z ref: 1fc683cf17c8c673044538d10266c00f92987be2
2025-08-14T21:21:02.4629550Z fetch-depth: 0
2025-08-14T21:21:02.4629766Z submodules: recursive
2025-08-14T21:21:02.4629993Z show-progress: false
2025-08-14T21:21:02.4630225Z repository: pytorch/pytorch
2025-08-14T21:21:02.4630655Z token: ***
2025-08-14T21:21:02.4630887Z ssh-strict: true
2025-08-14T21:21:02.4631099Z ssh-user: git
2025-08-14T21:21:02.4631316Z persist-credentials: true
2025-08-14T21:21:02.4631546Z clean: true
2025-08-14T21:21:02.4631763Z sparse-checkout-cone-mode: true
2025-08-14T21:21:02.4632031Z fetch-tags: false
2025-08-14T21:21:02.4632228Z lfs: false
2025-08-14T21:21:02.4632430Z set-safe-directory: true
2025-08-14T21:21:02.4632672Z env:
2025-08-14T21:21:02.4632857Z GIT_DEFAULT_BRANCH: main
2025-08-14T21:21:02.4633081Z ##[endgroup]
2025-08-14T21:21:02.5605972Z Syncing repository: pytorch/pytorch
2025-08-14T21:21:02.5607151Z ##[group]Getting Git version info
2025-08-14T21:21:02.5607496Z Working directory is '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-08-14T21:21:02.5607969Z [command]/usr/bin/git version
2025-08-14T21:21:02.5835779Z git version 2.47.1
2025-08-14T21:21:02.5857784Z ##[endgroup]
2025-08-14T21:21:02.5868068Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/b5be36c2-74c1-4787-a8df-f23c2c469ce8/.gitconfig'
2025-08-14T21:21:02.5890312Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/b5be36c2-74c1-4787-a8df-f23c2c469ce8' before making global git config changes
2025-08-14T21:21:02.5891336Z Adding repository directory to the temporary git global config as a safe directory
2025-08-14T21:21:02.5903071Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-08-14T21:21:02.5946850Z Deleting the contents of '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-08-14T21:21:02.5952460Z ##[group]Initializing the repository
2025-08-14T21:21:02.5956353Z [command]/usr/bin/git init /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-08-14T21:21:02.6009200Z hint: Using 'master' as the name for the initial branch. This default branch name
2025-08-14T21:21:02.6009765Z hint: is subject to change. To configure the initial branch name to use in all
2025-08-14T21:21:02.6010162Z hint: of your new repositories, which will suppress this warning, call:
2025-08-14T21:21:02.6010455Z hint:
2025-08-14T21:21:02.6010712Z hint: 	git config --global init.defaultBranch <name>
2025-08-14T21:21:02.6010963Z hint:
2025-08-14T21:21:02.6011201Z hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
2025-08-14T21:21:02.6011874Z hint: 'development'. The just-created branch can be renamed via this command:
2025-08-14T21:21:02.6012178Z hint:
2025-08-14T21:21:02.6012465Z hint: 	git branch -m <name>
2025-08-14T21:21:02.6027349Z Initialized empty Git repository in /home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/
2025-08-14T21:21:02.6036866Z [command]/usr/bin/git remote add origin https://github.com/pytorch/pytorch
2025-08-14T21:21:02.6073844Z ##[endgroup]
2025-08-14T21:21:02.6074409Z ##[group]Disabling automatic garbage collection
2025-08-14T21:21:02.6076612Z [command]/usr/bin/git config --local gc.auto 0
2025-08-14T21:21:02.6102961Z ##[endgroup]
2025-08-14T21:21:02.6103358Z ##[group]Setting up auth
2025-08-14T21:21:02.6108952Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2025-08-14T21:21:02.6138099Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2025-08-14T21:21:02.6500498Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader
2025-08-14T21:21:02.6527755Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :"
2025-08-14T21:21:02.6839509Z [command]/usr/bin/git config --local http.https://github.com/.extraheader AUTHORIZATION: basic ***
2025-08-14T21:21:02.6906471Z ##[endgroup]
2025-08-14T21:21:02.6906906Z ##[group]Fetching the repository
2025-08-14T21:21:02.6919985Z [command]/usr/bin/git -c protocol.version=2 fetch --prune --no-recurse-submodules origin +refs/heads/*:refs/remotes/origin/* +refs/tags/*:refs/tags/*
2025-08-14T21:21:50.0188652Z From https://github.com/pytorch/pytorch
2025-08-14T21:21:50.0189236Z * [new branch] 2.6.0.dev20241004+ -> origin/2.6.0.dev20241004+
2025-08-14T21:21:50.0189656Z * [new branch] 5addvllmbuild -> origin/5addvllmbuild
2025-08-14T21:21:50.0190097Z * [new branch] AaronWang04_addmmfusion_perftest -> origin/AaronWang04_addmmfusion_perftest 2025-08-14T21:21:50.0190562Z * [new branch] HDCharles-2.6.0-release-notes -> origin/HDCharles-2.6.0-release-notes 2025-08-14T21:21:50.0191260Z * [new branch] JackCaoG/dynamo_make_fx_non_core_aten_ops -> origin/JackCaoG/dynamo_make_fx_non_core_aten_ops 2025-08-14T21:21:50.0191723Z * [new branch] PR-AOTInductorNoneBug -> origin/PR-AOTInductorNoneBug 2025-08-14T21:21:50.0192149Z * [new branch] PR-AOTInductorNoneBugFix -> origin/PR-AOTInductorNoneBugFix 2025-08-14T21:21:50.0192582Z * [new branch] PR-FixConfigsIssue -> origin/PR-FixConfigsIssue 2025-08-14T21:21:50.0192960Z * [new branch] PR-NoneBugFix-viable -> origin/PR-NoneBugFix-viable 2025-08-14T21:21:50.0193337Z * [new branch] PR-ResetToZero -> origin/PR-ResetToZero 2025-08-14T21:21:50.0194130Z * [new branch] Update-Flash-Packaging -> origin/Update-Flash-Packaging 2025-08-14T21:21:50.0194609Z * [new branch] add-missing-args-normalization -> origin/add-missing-args-normalization 2025-08-14T21:21:50.0195076Z * [new branch] add-user-guide-structure -> origin/add-user-guide-structure 2025-08-14T21:21:50.0195429Z * [new branch] addVllmPin -> origin/addVllmPin 2025-08-14T21:21:50.0195785Z * [new branch] add_windows_testing_back -> origin/add_windows_testing_back 2025-08-14T21:21:50.0200397Z * [new branch] addbuildvllm -> origin/addbuildvllm 2025-08-14T21:21:50.0201038Z * [new branch] addmm-heuristic -> origin/addmm-heuristic 2025-08-14T21:21:50.0201547Z * [new branch] addsimde -> origin/addsimde 2025-08-14T21:21:50.0202368Z * [new branch] addvllpinnedfile -> origin/addvllpinnedfile 2025-08-14T21:21:50.0203185Z * [new branch] adi/acl_upgrade -> origin/adi/acl_upgrade 2025-08-14T21:21:50.0203556Z * [new branch] adi/skip_slow_tests -> origin/adi/skip_slow_tests 2025-08-14T21:21:50.0203958Z * [new branch] adi/test -> origin/adi/test 2025-08-14T21:21:50.0204274Z * [new branch] adi/test_bgemm -> origin/adi/test_bgemm 2025-08-14T21:21:50.0204626Z * [new branch] adi/test_fusions -> origin/adi/test_fusions 2025-08-14T21:21:50.0204979Z * [new branch] adi/test_onednn_v3.9 -> origin/adi/test_onednn_v3.9 2025-08-14T21:21:50.0205354Z * [new branch] adi/test_presve_change -> origin/adi/test_presve_change 2025-08-14T21:21:50.0205700Z * [new branch] adi/test_timm -> origin/adi/test_timm 2025-08-14T21:21:50.0206058Z * [new branch] adi/testpresve_change -> origin/adi/testpresve_change 2025-08-14T21:21:50.0206559Z * [new branch] aditew01/test/vec_bf16 -> origin/aditew01/test/vec_bf16 2025-08-14T21:21:50.0206963Z * [new branch] ah-globalfeedback-hook -> origin/ah-globalfeedback-hook 2025-08-14T21:21:50.0213014Z * [new branch] albanD-patch-1 -> origin/albanD-patch-1 2025-08-14T21:21:50.0213437Z * [new branch] alt-disable -> origin/alt-disable 2025-08-14T21:21:50.0213844Z * [new branch] angelayi/aoti_additional_files -> origin/angelayi/aoti_additional_files 2025-08-14T21:21:50.0214267Z * [new branch] angelayi/aoti_inductor_fx -> origin/angelayi/aoti_inductor_fx 2025-08-14T21:21:50.0214727Z * [new branch] angelayi/assert_tensor_metadata_device -> origin/angelayi/assert_tensor_metadata_device 2025-08-14T21:21:50.0215170Z * [new branch] angelayi/benchmark -> origin/angelayi/benchmark 2025-08-14T21:21:50.0215570Z * [new branch] angelayi/benchmark2 -> origin/angelayi/benchmark2 2025-08-14T21:21:50.0216013Z * [new branch] angelayi/change_pytree_serialization -> origin/angelayi/change_pytree_serialization 2025-08-14T21:21:50.0216436Z * [new branch] 
angelayi/cpp_loader -> origin/angelayi/cpp_loader 2025-08-14T21:21:50.0216825Z * [new branch] angelayi/custom_op_subgraph -> origin/angelayi/custom_op_subgraph 2025-08-14T21:21:50.0217223Z * [new branch] angelayi/customop -> origin/angelayi/customop 2025-08-14T21:21:50.0217572Z * [new branch] angelayi/del_lib -> origin/angelayi/del_lib 2025-08-14T21:21:50.0217909Z * [new branch] angelayi/docs -> origin/angelayi/docs 2025-08-14T21:21:50.0223966Z * [new branch] angelayi/docs2 -> origin/angelayi/docs2 2025-08-14T21:21:50.0224424Z * [new branch] angelayi/fix_pt2 -> origin/angelayi/fix_pt2 2025-08-14T21:21:50.0225042Z * [new branch] angelayi/logging.bak -> origin/angelayi/logging.bak 2025-08-14T21:21:50.0225427Z * [new branch] angelayi/logging2 -> origin/angelayi/logging2 2025-08-14T21:21:50.0225809Z * [new branch] angelayi/no_so_weight -> origin/angelayi/no_so_weight 2025-08-14T21:21:50.0226181Z * [new branch] angelayi/pytree -> origin/angelayi/pytree 2025-08-14T21:21:50.0226588Z * [new branch] angelayi/save_error -> origin/angelayi/save_error 2025-08-14T21:21:50.0227009Z * [new branch] angelayi/scan_layers -> origin/angelayi/scan_layers 2025-08-14T21:21:50.0227417Z * [new branch] angelayi/symint_input -> origin/angelayi/symint_input 2025-08-14T21:21:50.0227847Z * [new branch] angelayi/tensor_nn_module_meta -> origin/angelayi/tensor_nn_module_meta 2025-08-14T21:21:50.0228375Z * [new branch] angelayi/torch_size -> origin/angelayi/torch_size 2025-08-14T21:21:50.0228741Z * [new branch] aoti-cuda-alloc -> origin/aoti-cuda-alloc 2025-08-14T21:21:50.0231294Z * [new branch] aoti_weight_sharing -> origin/aoti_weight_sharing 2025-08-14T21:21:50.0231708Z * [new branch] arsh/symint_mm_ind_decomp -> origin/arsh/symint_mm_ind_decomp 2025-08-14T21:21:50.0232157Z * [new branch] atalman-inductor-perf-cu124 -> origin/atalman-inductor-perf-cu124 2025-08-14T21:21:50.0232647Z * [new branch] atalman-inductor-perf-cu124.1 -> origin/atalman-inductor-perf-cu124.1 2025-08-14T21:21:50.0233059Z * [new branch] atalman-patch-1 -> origin/atalman-patch-1 2025-08-14T21:21:50.0233419Z * [new branch] atalman-patch-2 -> origin/atalman-patch-2 2025-08-14T21:21:50.0233755Z * [new branch] atalman-patch-3 -> origin/atalman-patch-3 2025-08-14T21:21:50.0234102Z * [new branch] atalman-patch-6 -> origin/atalman-patch-6 2025-08-14T21:21:50.0234438Z * [new branch] atalman-patch-7 -> origin/atalman-patch-7 2025-08-14T21:21:50.0234826Z * [new branch] atalman-patch-8 -> origin/atalman-patch-8 2025-08-14T21:21:50.0235184Z * [new branch] atalman_inductor_2.3.0 -> origin/atalman_inductor_2.3.0 2025-08-14T21:21:50.0235567Z * [new branch] atalman_inductor_2.3.1 -> origin/atalman_inductor_2.3.1 2025-08-14T21:21:50.0235944Z * [new branch] atalman_inductor_2.4.0 -> origin/atalman_inductor_2.4.0 2025-08-14T21:21:50.0236315Z * [new branch] atalman_inductor_2.4.x -> origin/atalman_inductor_2.4.x 2025-08-14T21:21:50.0236779Z * [new branch] autoupdate-transformers-pin-via-pr -> origin/autoupdate-transformers-pin-via-pr 2025-08-14T21:21:50.0237236Z * [new branch] backupvllm -> origin/backupvllm 2025-08-14T21:21:50.0237908Z * [new branch] base/1.5 -> origin/base/1.5 2025-08-14T21:21:50.0238693Z * [new branch] batching_sdpa_efficient_attention -> origin/batching_sdpa_efficient_attention 2025-08-14T21:21:50.0239387Z * [new branch] benchmark-updates -> origin/benchmark-updates 2025-08-14T21:21:50.0240250Z * [new branch] benchmarking-script -> origin/benchmarking-script 2025-08-14T21:21:50.0242883Z * [new branch] benjaminglass1/mark-large-tensor-tests-serial 
-> origin/benjaminglass1/mark-large-tensor-tests-serial 2025-08-14T21:21:50.0243485Z * [new branch] bertmaher/pinbump26 -> origin/bertmaher/pinbump26 2025-08-14T21:21:50.0243914Z * [new branch] bertrand/cutlass -> origin/bertrand/cutlass 2025-08-14T21:21:50.0244361Z * [new branch] bf/cg-log -> origin/bf/cg-log 2025-08-14T21:21:50.0245109Z * [new branch] bf/cg-remove-check -> origin/bf/cg-remove-check 2025-08-14T21:21:50.0245952Z * [new branch] bf/cg-skip-1-kernel -> origin/bf/cg-skip-1-kernel 2025-08-14T21:21:50.0246508Z * [new branch] bf/cudagraph -> origin/bf/cudagraph 2025-08-14T21:21:50.0247322Z * [new branch] bf/cudagraph-disable-input-mutation -> origin/bf/cudagraph-disable-input-mutation 2025-08-14T21:21:50.0249981Z * [new branch] bf/cudagraph-enable-input-mutation-support-benchmark -> origin/bf/cudagraph-enable-input-mutation-support-benchmark 2025-08-14T21:21:50.0250730Z * [new branch] bf/cudagraph-partition -> origin/bf/cudagraph-partition 2025-08-14T21:21:50.0251301Z * [new branch] bf/default-recompile-reason -> origin/bf/default-recompile-reason 2025-08-14T21:21:50.0251880Z * [new branch] bf/donated-buffer-bench -> origin/bf/donated-buffer-bench 2025-08-14T21:21:50.0252354Z * [new branch] bf/improve-kernel-bench -> origin/bf/improve-kernel-bench 2025-08-14T21:21:50.0252746Z * [new branch] bf/kernel-benchmark -> origin/bf/kernel-benchmark 2025-08-14T21:21:50.0253141Z * [new branch] bf/partition-doc -> origin/bf/partition-doc 2025-08-14T21:21:50.0256931Z * [new branch] bf/partition-move-cpu -> origin/bf/partition-move-cpu 2025-08-14T21:21:50.0257398Z * [new branch] bf/partition-turn-on -> origin/bf/partition-turn-on 2025-08-14T21:21:50.0257852Z * [new branch] bf/remove-check-55b0c39d -> origin/bf/remove-check-55b0c39d 2025-08-14T21:21:50.0258341Z * [new branch] bf/skip-asserts -> origin/bf/skip-asserts 2025-08-14T21:21:50.0258682Z * [new branch] bf16adamw -> origin/bf16adamw 2025-08-14T21:21:50.0259053Z * [new branch] bisect_perf_hf_T5_3acc6eac492 -> origin/bisect_perf_hf_T5_3acc6eac492 2025-08-14T21:21:50.0259540Z * [new branch] bisect_perf_hf_T5_3fcf66f61fb -> origin/bisect_perf_hf_T5_3fcf66f61fb 2025-08-14T21:21:50.0259979Z * [new branch] bisect_perf_hf_T5_4009d154129 -> origin/bisect_perf_hf_T5_4009d154129 2025-08-14T21:21:50.0263231Z * [new branch] bisect_perf_hf_T5_40d0740e73d -> origin/bisect_perf_hf_T5_40d0740e73d 2025-08-14T21:21:50.0263725Z * [new branch] bisect_perf_hf_T5_5268754e -> origin/bisect_perf_hf_T5_5268754e 2025-08-14T21:21:50.0264130Z * [new branch] bisect_perf_hf_T5_7d89a8d385c -> origin/bisect_perf_hf_T5_7d89a8d385c 2025-08-14T21:21:50.0264537Z * [new branch] bisect_perf_hf_T5_b7a25c1ee7c -> origin/bisect_perf_hf_T5_b7a25c1ee7c 2025-08-14T21:21:50.0264930Z * [new branch] bisect_perf_hf_T5_c25b201583f -> origin/bisect_perf_hf_T5_c25b201583f 2025-08-14T21:21:50.0265316Z * [new branch] bisect_perf_hf_T5_c93e57efac0 -> origin/bisect_perf_hf_T5_c93e57efac0 2025-08-14T21:21:50.0265722Z * [new branch] bisect_perf_hf_T5_ca9813ea149 -> origin/bisect_perf_hf_T5_ca9813ea149 2025-08-14T21:21:50.0266098Z * [new branch] bisect_perf_hf_T5_d65f194a -> origin/bisect_perf_hf_T5_d65f194a 2025-08-14T21:21:50.0266502Z * [new branch] bisect_perf_hf_T5_da94ab0b -> origin/bisect_perf_hf_T5_da94ab0b 2025-08-14T21:21:50.0266904Z * [new branch] bisect_perf_hf_T5_da94ab0b_new -> origin/bisect_perf_hf_T5_da94ab0b_new 2025-08-14T21:21:50.0267300Z * [new branch] bisect_perf_hf_T5_db4e8a1d8a8 -> origin/bisect_perf_hf_T5_db4e8a1d8a8 2025-08-14T21:21:50.0267695Z * [new branch] 
bisect_perf_hf_T5_e0d97e936a2 -> origin/bisect_perf_hf_T5_e0d97e936a2 2025-08-14T21:21:50.0268513Z * [new branch] bisect_perf_hf_T5_f23621ec563 -> origin/bisect_perf_hf_T5_f23621ec563 2025-08-14T21:21:50.0269417Z * [new branch] bowbao/bench_updates_stage -> origin/bowbao/bench_updates_stage 2025-08-14T21:21:50.0269899Z * [new branch] bowbao/dort_rewriter -> origin/bowbao/dort_rewriter 2025-08-14T21:21:50.0270751Z * [new branch] bowbao/wip_prs -> origin/bowbao/wip_prs 2025-08-14T21:21:50.0271415Z * [new branch] bowenbao/partial_min_max_reduce -> origin/bowenbao/partial_min_max_reduce 2025-08-14T21:21:50.0272372Z * [new branch] brister/always_wrapper_ir -> origin/brister/always_wrapper_ir 2025-08-14T21:21:50.0277799Z * [new branch] brister/flatten_contig -> origin/brister/flatten_contig 2025-08-14T21:21:50.0278259Z * [new branch] brister/test_block_ptr_same -> origin/brister/test_block_ptr_same 2025-08-14T21:21:50.0278733Z * [new branch] brister/tiled_reduction_no_numel_check -> origin/brister/tiled_reduction_no_numel_check 2025-08-14T21:21:50.0279137Z * [new branch] c57382a49 -> origin/c57382a49 2025-08-14T21:21:50.0279461Z * [new branch] ca_0431d47eaa -> origin/ca_0431d47eaa 2025-08-14T21:21:50.0280000Z * [new branch] ca_fix_0431d47eaa -> origin/ca_fix_0431d47eaa 2025-08-14T21:21:50.0280650Z * [new branch] camyll/revert-94bc900da97ad7f3c35b3b819bb53b23c74b581a-for-release-2.8 -> origin/camyll/revert-94bc900da97ad7f3c35b3b819bb53b23c74b581a-for-release-2.8 2025-08-14T21:21:50.0281407Z * [new branch] camyll/test_precommit_hooks_lintrunner -> origin/camyll/test_precommit_hooks_lintrunner 2025-08-14T21:21:50.0283068Z * [new branch] camyllh/cherrypick-151547-for-release28 -> origin/camyllh/cherrypick-151547-for-release28 2025-08-14T21:21:50.0283553Z * [new branch] camyllh/test_setup_hooks_push -> origin/camyllh/test_setup_hooks_push 2025-08-14T21:21:50.0284055Z * [new branch] cherry-pick-149654-by-pytorch_bot_bot_ -> origin/cherry-pick-149654-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0284558Z * [new branch] cherry-pick-151939-by-pytorch_bot_bot_ -> origin/cherry-pick-151939-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0285087Z * [new branch] cherry-pick-154174-by-pytorch_bot_bot_ -> origin/cherry-pick-154174-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0285586Z * [new branch] cherry-pick-155896-by-pytorch_bot_bot_ -> origin/cherry-pick-155896-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0286081Z * [new branch] cherry-pick-156260-by-pytorch_bot_bot_ -> origin/cherry-pick-156260-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0286592Z * [new branch] cherry-pick-156719-by-pytorch_bot_bot_ -> origin/cherry-pick-156719-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0287114Z * [new branch] cherry-pick-156876-by-pytorch_bot_bot_ -> origin/cherry-pick-156876-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0287659Z * [new branch] cherry-pick-156888-by-pytorch_bot_bot_ -> origin/cherry-pick-156888-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0288162Z * [new branch] cherry-pick-157014-by-pytorch_bot_bot_ -> origin/cherry-pick-157014-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0288765Z * [new branch] cherry-pick-157179-by-pytorch_bot_bot_ -> origin/cherry-pick-157179-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0289302Z * [new branch] cherry-pick-157453-by-pytorch_bot_bot_ -> origin/cherry-pick-157453-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0293183Z * [new branch] cherry-pick-157513-by-pytorch_bot_bot_ -> origin/cherry-pick-157513-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0293762Z * [new branch] cherry-pick-157558-by-pytorch_bot_bot_ -> 
origin/cherry-pick-157558-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0297465Z * [new branch] cherry-pick-157598-by-pytorch_bot_bot_ -> origin/cherry-pick-157598-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0300585Z * [new branch] cherry-pick-157600-by-pytorch_bot_bot_ -> origin/cherry-pick-157600-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0301395Z * [new branch] cherry-pick-157630-by-pytorch_bot_bot_ -> origin/cherry-pick-157630-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0305751Z * [new branch] cherry-pick-157695-by-pytorch_bot_bot_ -> origin/cherry-pick-157695-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0309455Z * [new branch] cherry-pick-157732-by-pytorch_bot_bot_ -> origin/cherry-pick-157732-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0313580Z * [new branch] cherry-pick-157733-by-pytorch_bot_bot_ -> origin/cherry-pick-157733-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0314215Z * [new branch] cherry-pick-157985-by-pytorch_bot_bot_ -> origin/cherry-pick-157985-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0314718Z * [new branch] cherry-pick-157993-by-pytorch_bot_bot_ -> origin/cherry-pick-157993-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0315204Z * [new branch] cherry-pick-158064-by-pytorch_bot_bot_ -> origin/cherry-pick-158064-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0315698Z * [new branch] cherry-pick-158152-by-pytorch_bot_bot_ -> origin/cherry-pick-158152-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0316476Z * [new branch] cherry-pick-158295-by-pytorch_bot_bot_ -> origin/cherry-pick-158295-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0316969Z * [new branch] cherry-pick-158301-by-pytorch_bot_bot_ -> origin/cherry-pick-158301-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0317452Z * [new branch] cherry-pick-158537-by-pytorch_bot_bot_ -> origin/cherry-pick-158537-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0317942Z * [new branch] cherry-pick-158572-by-pytorch_bot_bot_ -> origin/cherry-pick-158572-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0318385Z * [new branch] cherry-pick-158595 -> origin/cherry-pick-158595 2025-08-14T21:21:50.0318817Z * [new branch] cherry-pick-159181-by-pytorch_bot_bot_ -> origin/cherry-pick-159181-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0319299Z * [new branch] cherry-pick-159969-by-pytorch_bot_bot_ -> origin/cherry-pick-159969-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0319796Z * [new branch] cherry-pick-160586-by-pytorch_bot_bot_ -> origin/cherry-pick-160586-by-pytorch_bot_bot_ 2025-08-14T21:21:50.0320383Z * [new branch] cherry-pick-PR-158746 -> origin/cherry-pick-PR-158746 2025-08-14T21:21:50.0321012Z * [new branch] cherrypick-e4e2701429c17078c3c475382a8b1fa4c8a8cefc -> origin/cherrypick-e4e2701429c17078c3c475382a8b1fa4c8a8cefc 2025-08-14T21:21:50.0321564Z * [new branch] chilli/flex_vllm -> origin/chilli/flex_vllm 2025-08-14T21:21:50.0321943Z * [new branch] ckluk2-compileThread-1 -> origin/ckluk2-compileThread-1 2025-08-14T21:21:50.0322335Z * [new branch] ckluk2-compileThread-2 -> origin/ckluk2-compileThread-2 2025-08-14T21:21:50.0322724Z * [new branch] ckluk2-compileThread-64 -> origin/ckluk2-compileThread-64 2025-08-14T21:21:50.0323083Z * [new branch] ckluk2-test-1 -> origin/ckluk2-test-1 2025-08-14T21:21:50.0323423Z * [new branch] cleantest1 -> origin/cleantest1 2025-08-14T21:21:50.0323751Z * [new branch] codex-testing -> origin/codex-testing 2025-08-14T21:21:50.0324269Z * [new branch] codex/create-test-for-tensor-memory-leak-in-cudagraph -> origin/codex/create-test-for-tensor-memory-leak-in-cudagraph 2025-08-14T21:21:50.0324861Z * [new branch] codex/fix-issue-121219-in-pytorch -> origin/codex/fix-issue-121219-in-pytorch 
2025-08-14T21:21:50.0325304Z * [new branch] codex/fix-issue-160415-in-pytorch -> origin/codex/fix-issue-160415-in-pytorch 2025-08-14T21:21:50.0325834Z * [new branch] codex/fix-noqengine-quantized-engine-support -> origin/codex/fix-noqengine-quantized-engine-support 2025-08-14T21:21:50.0326403Z * [new branch] codex/fix-pin_memory-error-handling -> origin/codex/fix-pin_memory-error-handling 2025-08-14T21:21:50.0326879Z * [new branch] codex/propose-fix-for-issue-160332 -> origin/codex/propose-fix-for-issue-160332 2025-08-14T21:21:50.0327519Z * [new branch] codex/refactor-lintrunner-config-to-use-uv-run -> origin/codex/refactor-lintrunner-config-to-use-uv-run 2025-08-14T21:21:50.0328132Z * [new branch] codex/verify-torch-output-and-log-results -> origin/codex/verify-torch-output-and-log-results 2025-08-14T21:21:50.0329050Z * [new branch] compile_fsdp2_disable_stream_and_event -> origin/compile_fsdp2_disable_stream_and_event 2025-08-14T21:21:50.0329504Z * [new branch] comply-with-setuptools -> origin/comply-with-setuptools 2025-08-14T21:21:50.0329866Z * [new branch] context_test -> origin/context_test 2025-08-14T21:21:50.0330212Z * [new branch] copilot/fix-157446 -> origin/copilot/fix-157446 2025-08-14T21:21:50.0330566Z * [new branch] copilot/fix-159257 -> origin/copilot/fix-159257 2025-08-14T21:21:50.0330941Z * [new branch] copy_graph -> origin/copy_graph 2025-08-14T21:21:50.0331285Z * [new branch] cpio/fix_new_ami_tests -> origin/cpio/fix_new_ami_tests 2025-08-14T21:21:50.0331632Z * [new branch] csl/3_proc_sm -> origin/csl/3_proc_sm 2025-08-14T21:21:50.0331995Z * [new branch] csl/add_file_merge_conflict_csv -> origin/csl/add_file_merge_conflict_csv 2025-08-14T21:21:50.0332393Z * [new branch] csl/always_produce_xml -> origin/csl/always_produce_xml 2025-08-14T21:21:50.0332771Z * [new branch] csl/build_test_more_procs -> origin/csl/build_test_more_procs 2025-08-14T21:21:50.0333139Z * [new branch] csl/build_test_more_procs2 -> origin/csl/build_test_more_procs2 2025-08-14T21:21:50.0333518Z * [new branch] csl/disable_flaky_cpp_test -> origin/csl/disable_flaky_cpp_test 2025-08-14T21:21:50.0333886Z * [new branch] csl/disable_periodic_test -> origin/csl/disable_periodic_test 2025-08-14T21:21:50.0334263Z * [new branch] csl/executorch_docker_fail -> origin/csl/executorch_docker_fail 2025-08-14T21:21:50.0334623Z * [new branch] csl/fix_check_alerts -> origin/csl/fix_check_alerts 2025-08-14T21:21:50.0334938Z * [new branch] csl/katex -> origin/csl/katex 2025-08-14T21:21:50.0335256Z * [new branch] csl/larger_runner -> origin/csl/larger_runner 2025-08-14T21:21:50.0335702Z * [new branch] csl/lintrunner_changed_files_removed -> origin/csl/lintrunner_changed_files_removed 2025-08-14T21:21:50.0336208Z * [new branch] csl/lintrunner_changed_files_removed_test -> origin/csl/lintrunner_changed_files_removed_test 2025-08-14T21:21:50.0336665Z * [new branch] csl/lintrunner_stuff -> origin/csl/lintrunner_stuff 2025-08-14T21:21:50.0337014Z * [new branch] csl/mps_sharding -> origin/csl/mps_sharding 2025-08-14T21:21:50.0337351Z * [new branch] csl/multistage_docker -> origin/csl/multistage_docker 2025-08-14T21:21:50.0337697Z * [new branch] csl/no_keep_goin_rocm -> origin/csl/no_keep_goin_rocm 2025-08-14T21:21:50.0338034Z * [new branch] csl/not_600_timeout -> origin/csl/not_600_timeout 2025-08-14T21:21:50.0338408Z * [new branch] csl/remove_unused_docker_images -> origin/csl/remove_unused_docker_images 2025-08-14T21:21:50.0338786Z * [new branch] csl/revert_open -> origin/csl/revert_open 2025-08-14T21:21:50.0339201Z * [new 
branch] csl/rocm_upload_artifacts_while_running -> origin/csl/rocm_upload_artifacts_while_running 2025-08-14T21:21:50.0339618Z * [new branch] csl/skip_build -> origin/csl/skip_build 2025-08-14T21:21:50.0339954Z * [new branch] csl/td_dynamo -> origin/csl/td_dynamo 2025-08-14T21:21:50.0340305Z * [new branch] csl/test_cuda_build_large_runner -> origin/csl/test_cuda_build_large_runner 2025-08-14T21:21:50.0340724Z * [new branch] csl/unused_docker -> origin/csl/unused_docker 2025-08-14T21:21:50.0344289Z * [new branch] csl/win_sccache -> origin/csl/win_sccache 2025-08-14T21:21:50.0344690Z * [new branch] cublasltrelax2 -> origin/cublasltrelax2 2025-08-14T21:21:50.0345017Z * [new branch] cublasrelax2 -> origin/cublasrelax2 2025-08-14T21:21:50.0345365Z * [new branch] cudnnsdparefactor -> origin/cudnnsdparefactor 2025-08-14T21:21:50.0345732Z * [new branch] custom_lowering_dict -> origin/custom_lowering_dict 2025-08-14T21:21:50.0346074Z * [new branch] czhuge_muon_dev -> origin/czhuge_muon_dev 2025-08-14T21:21:50.0348865Z * [new branch] d4l3k/delete_hook -> origin/d4l3k/delete_hook 2025-08-14T21:21:50.0349461Z * [new branch] d4l3k/dist_queue -> origin/d4l3k/dist_queue 2025-08-14T21:21:50.0349828Z * [new branch] d4l3k/wait_stream -> origin/d4l3k/wait_stream 2025-08-14T21:21:50.0350214Z * [new branch] dcp-safetensor-test-fix -> origin/dcp-safetensor-test-fix 2025-08-14T21:21:50.0350571Z * [new branch] dcp_zoc -> origin/dcp_zoc 2025-08-14T21:21:50.0350904Z * [new branch] delete-quant-docs -> origin/delete-quant-docs 2025-08-14T21:21:50.0351387Z * [new branch] dependabot/pip/dot-ci/docker/protobuf-5.29.5 -> origin/dependabot/pip/dot-ci/docker/protobuf-5.29.5 2025-08-14T21:21:50.0355254Z * [new branch] desertfire/test_cpp_wrapper -> origin/desertfire/test_cpp_wrapper 2025-08-14T21:21:50.0358594Z * [new branch] desertfire/triton-cpu-for-aarch64 -> origin/desertfire/triton-cpu-for-aarch64 2025-08-14T21:21:50.0359062Z * [new branch] dev/joona/MPSNDArrayAdd -> origin/dev/joona/MPSNDArrayAdd 2025-08-14T21:21:50.0359478Z * [new branch] dev/joona/Unranked -> origin/dev/joona/Unranked 2025-08-14T21:21:50.0359815Z * [new branch] dev/joona/cat -> origin/dev/joona/cat 2025-08-14T21:21:50.0360175Z * [new branch] dev/joona/cat_remove_graph -> origin/dev/joona/cat_remove_graph 2025-08-14T21:21:50.0360547Z * [new branch] dev/joona/embeddingbag -> origin/dev/joona/embeddingbag 2025-08-14T21:21:50.0360932Z * [new branch] dev/joona/getTensorsString -> origin/dev/joona/getTensorsString 2025-08-14T21:21:50.0364506Z * [new branch] dev/joona/maxpool2dwithindices_errmsg -> origin/dev/joona/maxpool2dwithindices_errmsg 2025-08-14T21:21:50.0365063Z * [new branch] dev/joona/mps_linear_macos14 -> origin/dev/joona/mps_linear_macos14 2025-08-14T21:21:50.0365480Z * [new branch] dev/joona/sdpa -> origin/dev/joona/sdpa 2025-08-14T21:21:50.0365954Z * [new branch] dev/joona/synchronize_benchmark -> origin/dev/joona/synchronize_benchmark 2025-08-14T21:21:50.0366376Z * [new branch] dev/joona/topk_newapi -> origin/dev/joona/topk_newapi 2025-08-14T21:21:50.0366745Z * [new branch] dev/joona/type_inf -> origin/dev/joona/type_inf 2025-08-14T21:21:50.0367118Z * [new branch] dev/joona/upsize3d -> origin/dev/joona/upsize3d 2025-08-14T21:21:50.0367449Z * [new branch] disable -> origin/disable 2025-08-14T21:21:50.0367896Z * [new branch] divyanshk-log-api-usage-datapipes-1 -> origin/divyanshk-log-api-usage-datapipes-1 2025-08-14T21:21:50.0368332Z * [new branch] e2e-baseline -> origin/e2e-baseline 2025-08-14T21:21:50.0368902Z * [new branch] 
embg/test_inductor_ci_128B -> origin/embg/test_inductor_ci_128B 2025-08-14T21:21:50.0369331Z * [new branch] embg/test_inductor_ci_base -> origin/embg/test_inductor_ci_base 2025-08-14T21:21:50.0373204Z * [new branch] embg/test_inductor_ci_control -> origin/embg/test_inductor_ci_control 2025-08-14T21:21:50.0373650Z * [new branch] embg/triton_l2_prefetch_128B -> origin/embg/triton_l2_prefetch_128B 2025-08-14T21:21:50.0374048Z * [new branch] embg/triton_l2_prefetch_256B -> origin/embg/triton_l2_prefetch_256B 2025-08-14T21:21:50.0374440Z * [new branch] enable-b200-benchmark -> origin/enable-b200-benchmark 2025-08-14T21:21:50.0374802Z * [new branch] eqy-patch-1 -> origin/eqy-patch-1 2025-08-14T21:21:50.0375115Z * [new branch] eqy-patch-10 -> origin/eqy-patch-10 2025-08-14T21:21:50.0377285Z * [new branch] eqy-patch-2 -> origin/eqy-patch-2 2025-08-14T21:21:50.0377648Z * [new branch] example-convert-torch.nn -> origin/example-convert-torch.nn 2025-08-14T21:21:50.0378116Z * [new branch] exclamaforte/amd-ma -> origin/exclamaforte/amd-ma 2025-08-14T21:21:50.0378541Z * [new branch] exclamaforte/bump-transformer-version -> origin/exclamaforte/bump-transformer-version 2025-08-14T21:21:50.0379046Z * [new branch] exclamaforte/combo-kernels-perf-run -> origin/exclamaforte/combo-kernels-perf-run 2025-08-14T21:21:50.0379549Z * [new branch] exclamaforte/debug-autotuner-profile -> origin/exclamaforte/debug-autotuner-profile 2025-08-14T21:21:50.0382961Z * [new branch] exclamaforte/do_bench_refactor -> origin/exclamaforte/do_bench_refactor 2025-08-14T21:21:50.0383413Z * [new branch] exclamaforte/enable-mem-dep-fusion -> origin/exclamaforte/enable-mem-dep-fusion 2025-08-14T21:21:50.0383915Z * [new branch] exclamaforte/fix-exhaustive-autotuning -> origin/exclamaforte/fix-exhaustive-autotuning 2025-08-14T21:21:50.0384430Z * [new branch] exclamaforte/fix-trace-parsing-fx-svg -> origin/exclamaforte/fix-trace-parsing-fx-svg 2025-08-14T21:21:50.0388804Z * [new branch] exclamaforte/force-pointwise-cat-perf-run -> origin/exclamaforte/force-pointwise-cat-perf-run 2025-08-14T21:21:50.0394082Z * [new branch] exclamaforte/fusion-data -> origin/exclamaforte/fusion-data 2025-08-14T21:21:50.0394763Z * [new branch] exclamaforte/gemm-benchmark-run -> origin/exclamaforte/gemm-benchmark-run 2025-08-14T21:21:50.0395225Z * [new branch] exclamaforte/gemm-export-model -> origin/exclamaforte/gemm-export-model 2025-08-14T21:21:50.0395706Z * [new branch] exclamaforte/gemm-model -> origin/exclamaforte/gemm-model 2025-08-14T21:21:50.0396226Z * [new branch] exclamaforte/gemm-model-all-data-collection -> origin/exclamaforte/gemm-model-all-data-collection 2025-08-14T21:21:50.0396724Z * [new branch] exclamaforte/gemm-to-amd -> origin/exclamaforte/gemm-to-amd 2025-08-14T21:21:50.0397126Z * [new branch] exclamaforte/just-gemm-model -> origin/exclamaforte/just-gemm-model 2025-08-14T21:21:50.0397651Z * [new branch] exclamaforte/just-gemm-model-no-refactor -> origin/exclamaforte/just-gemm-model-no-refactor 2025-08-14T21:21:50.0398137Z * [new branch] exclamaforte/memory-counter -> origin/exclamaforte/memory-counter 2025-08-14T21:21:50.0398574Z * [new branch] exclamaforte/scheduler-refactor -> origin/exclamaforte/scheduler-refactor 2025-08-14T21:21:50.0399013Z * [new branch] exclamaforte/test_cpp_wrapper_mode -> origin/exclamaforte/test_cpp_wrapper_mode 2025-08-14T21:21:50.0399485Z * [new branch] exclamaforte/update-autotune-configs -> origin/exclamaforte/update-autotune-configs 2025-08-14T21:21:50.0400006Z * [new branch] 
exclamaforte/update-autotune-configs-2 -> origin/exclamaforte/update-autotune-configs-2 2025-08-14T21:21:50.0400540Z * [new branch] exclamaforte/update-pandas-numpy-ci -> origin/exclamaforte/update-pandas-numpy-ci 2025-08-14T21:21:50.0401166Z * [new branch] exclamforte/gemm-model-final -> origin/exclamforte/gemm-model-final 2025-08-14T21:21:50.0401529Z * [new branch] exec -> origin/exec 2025-08-14T21:21:50.0401874Z * [new branch] experimental-mosaic -> origin/experimental-mosaic 2025-08-14T21:21:50.0402294Z * [new branch] export-D58091437 -> origin/export-D58091437 2025-08-14T21:21:50.0402773Z * [new branch] export-D61047529 -> origin/export-D61047529 2025-08-14T21:21:50.0403117Z * [new branch] export-D68846308 -> origin/export-D68846308 2025-08-14T21:21:50.0403459Z * [new branch] export-D70112642 -> origin/export-D70112642 2025-08-14T21:21:50.0403776Z * [new branch] export-D71412006 -> origin/export-D71412006 2025-08-14T21:21:50.0404103Z * [new branch] export-D72483950 -> origin/export-D72483950 2025-08-14T21:21:50.0404571Z * [new branch] export-D73042989 -> origin/export-D73042989 2025-08-14T21:21:50.0404898Z * [new branch] export-D73287751 -> origin/export-D73287751 2025-08-14T21:21:50.0405221Z * [new branch] export-D75183591 -> origin/export-D75183591 2025-08-14T21:21:50.0405557Z * [new branch] export-D75605373 -> origin/export-D75605373 2025-08-14T21:21:50.0405878Z * [new branch] export-D75617432 -> origin/export-D75617432 2025-08-14T21:21:50.0406203Z * [new branch] export-D75659965 -> origin/export-D75659965 2025-08-14T21:21:50.0406521Z * [new branch] export-D76080931 -> origin/export-D76080931 2025-08-14T21:21:50.0406850Z * [new branch] export-D76463347 -> origin/export-D76463347 2025-08-14T21:21:50.0407181Z * [new branch] export-D76797250 -> origin/export-D76797250 2025-08-14T21:21:50.0407511Z * [new branch] export-D76885271 -> origin/export-D76885271 2025-08-14T21:21:50.0407834Z * [new branch] export-D76885620 -> origin/export-D76885620 2025-08-14T21:21:50.0408155Z * [new branch] export-D76936623 -> origin/export-D76936623 2025-08-14T21:21:50.0408477Z * [new branch] export-D76958268 -> origin/export-D76958268 2025-08-14T21:21:50.0409036Z * [new branch] export-D78047846 -> origin/export-D78047846 2025-08-14T21:21:50.0409359Z * [new branch] export-D78308105 -> origin/export-D78308105 2025-08-14T21:21:50.0409863Z * [new branch] export-D78363609 -> origin/export-D78363609 2025-08-14T21:21:50.0410205Z * [new branch] export-D78375400 -> origin/export-D78375400 2025-08-14T21:21:50.0414909Z * [new branch] export-D78431075 -> origin/export-D78431075 2025-08-14T21:21:50.0423041Z * [new branch] export-D78431305 -> origin/export-D78431305 2025-08-14T21:21:50.0423673Z * [new branch] export-D78458745 -> origin/export-D78458745 2025-08-14T21:21:50.0424164Z * [new branch] export-D78524147 -> origin/export-D78524147 2025-08-14T21:21:50.0424644Z * [new branch] export-D78580107 -> origin/export-D78580107 2025-08-14T21:21:50.0425116Z * [new branch] export-D78588406 -> origin/export-D78588406 2025-08-14T21:21:50.0425580Z * [new branch] export-D78691422 -> origin/export-D78691422 2025-08-14T21:21:50.0426071Z * [new branch] export-D78758466 -> origin/export-D78758466 2025-08-14T21:21:50.0426610Z * [new branch] export-D78822171 -> origin/export-D78822171 2025-08-14T21:21:50.0427118Z * [new branch] export-D78822351 -> origin/export-D78822351 2025-08-14T21:21:50.0427777Z * [new branch] export-D78822507 -> origin/export-D78822507 2025-08-14T21:21:50.0428457Z * [new branch] export-D78826994 -> 
origin/export-D78826994 2025-08-14T21:21:50.0428996Z * [new branch] export-D78894142 -> origin/export-D78894142 2025-08-14T21:21:50.0429793Z * [new branch] export-D78894324 -> origin/export-D78894324 2025-08-14T21:21:50.0430347Z * [new branch] export-D78907485 -> origin/export-D78907485 2025-08-14T21:21:50.0430851Z * [new branch] export-D78929245 -> origin/export-D78929245 2025-08-14T21:21:50.0431400Z * [new branch] export-D78934925 -> origin/export-D78934925 2025-08-14T21:21:50.0431942Z * [new branch] export-D78953203 -> origin/export-D78953203 2025-08-14T21:21:50.0432493Z * [new branch] export-D78953229 -> origin/export-D78953229 2025-08-14T21:21:50.0433106Z * [new branch] export-D78957093 -> origin/export-D78957093 2025-08-14T21:21:50.0433695Z * [new branch] export-D78957389 -> origin/export-D78957389 2025-08-14T21:21:50.0434218Z * [new branch] export-D78957974 -> origin/export-D78957974 2025-08-14T21:21:50.0436797Z * [new branch] export-D78979812 -> origin/export-D78979812 2025-08-14T21:21:50.0441642Z * [new branch] export-D78996107 -> origin/export-D78996107 2025-08-14T21:21:50.0443694Z * [new branch] export-D79026433 -> origin/export-D79026433 2025-08-14T21:21:50.0444293Z * [new branch] export-D79230339 -> origin/export-D79230339 2025-08-14T21:21:50.0444889Z * [new branch] export-D79319835 -> origin/export-D79319835 2025-08-14T21:21:50.0445422Z * [new branch] export-D79328456 -> origin/export-D79328456 2025-08-14T21:21:50.0445998Z * [new branch] export-D79534608 -> origin/export-D79534608 2025-08-14T21:21:50.0446554Z * [new branch] export-D79647167 -> origin/export-D79647167 2025-08-14T21:21:50.0447105Z * [new branch] export-D79751098 -> origin/export-D79751098 2025-08-14T21:21:50.0447666Z * [new branch] export-D79785974 -> origin/export-D79785974 2025-08-14T21:21:50.0448510Z * [new branch] export-D80025417 -> origin/export-D80025417 2025-08-14T21:21:50.0453543Z * [new branch] export-D80120333 -> origin/export-D80120333 2025-08-14T21:21:50.0454123Z * [new branch] export-D80214882 -> origin/export-D80214882 2025-08-14T21:21:50.0454769Z * [new branch] exported-model-train-idempotent -> origin/exported-model-train-idempotent 2025-08-14T21:21:50.0455509Z * [new branch] ezyang/wip-aot-descriptors -> origin/ezyang/wip-aot-descriptors 2025-08-14T21:21:50.0456124Z * [new branch] fa_u8_brgemm -> origin/fa_u8_brgemm 2025-08-14T21:21:50.0458705Z * [new branch] fastmath_baseline -> origin/fastmath_baseline 2025-08-14T21:21:50.0459283Z * [new branch] fbcode/warm -> origin/fbcode/warm 2025-08-14T21:21:50.0459770Z * [new branch] fca -> origin/fca 2025-08-14T21:21:50.0466092Z * [new branch] fca2_ca5984c -> origin/fca2_ca5984c 2025-08-14T21:21:50.0466486Z * [new branch] fca5 -> origin/fca5 2025-08-14T21:21:50.0466890Z * [new branch] feature/function-numa-binding -> origin/feature/function-numa-binding 2025-08-14T21:21:50.0467348Z * [new branch] fengyuan/external-proj -> origin/fengyuan/external-proj 2025-08-14T21:21:50.0467835Z * [new branch] fengyuan/out-of-tree-xpu-ops-improve-test -> origin/fengyuan/out-of-tree-xpu-ops-improve-test 2025-08-14T21:21:50.0468638Z * [new branch] fengyuan/out-of-tree-xpu-ops-remove-dtype -> origin/fengyuan/out-of-tree-xpu-ops-remove-dtype 2025-08-14T21:21:50.0469104Z * [new branch] fengyuan/test-xpu -> origin/fengyuan/test-xpu 2025-08-14T21:21:50.0469462Z * [new branch] ffast_math_baseline -> origin/ffast_math_baseline 2025-08-14T21:21:50.0470967Z * [new branch] ffast_math_target -> origin/ffast_math_target 2025-08-14T21:21:50.0471330Z * [new branch] 
findhao/base_commit -> origin/findhao/base_commit 2025-08-14T21:21:50.0471684Z * [new branch] findhao/base_commit1 -> origin/findhao/base_commit1 2025-08-14T21:21:50.0472080Z * [new branch] findhao/fix-indirect-access -> origin/findhao/fix-indirect-access 2025-08-14T21:21:50.0472467Z * [new branch] findhao/multistream2 -> origin/findhao/multistream2 2025-08-14T21:21:50.0472883Z * [new branch] findhao/multistream5 -> origin/findhao/multistream5 2025-08-14T21:21:50.0473255Z * [new branch] findhao/multistream6 -> origin/findhao/multistream6 2025-08-14T21:21:50.0476438Z * [new branch] findhao/operatorbench3 -> origin/findhao/operatorbench3 2025-08-14T21:21:50.0476917Z * [new branch] findhao/operatorbench5 -> origin/findhao/operatorbench5 2025-08-14T21:21:50.0477305Z * [new branch] findhao/tritonparse -> origin/findhao/tritonparse 2025-08-14T21:21:50.0477681Z * [new branch] fix -> origin/fix 2025-08-14T21:21:50.0478050Z * [new branch] fix-ck-gemm-template-format -> origin/fix-ck-gemm-template-format 2025-08-14T21:21:50.0480911Z * [new branch] fix-config-ignore -> origin/fix-config-ignore 2025-08-14T21:21:50.0481256Z * [new branch] fix-dict-guard -> origin/fix-dict-guard 2025-08-14T21:21:50.0481630Z * [new branch] fix-distributed-warning -> origin/fix-distributed-warning 2025-08-14T21:21:50.0482042Z * [new branch] fix-inductor-periodic-0528 -> origin/fix-inductor-periodic-0528 2025-08-14T21:21:50.0482469Z * [new branch] fix-rlease-feature-template -> origin/fix-rlease-feature-template 2025-08-14T21:21:50.0482830Z * [new branch] fix_153389 -> origin/fix_153389 2025-08-14T21:21:50.0483143Z * [new branch] fixes-triage -> origin/fixes-triage 2025-08-14T21:21:50.0483507Z * [new branch] flash_decoding_cpu -> origin/flash_decoding_cpu 2025-08-14T21:21:50.0483852Z * [new branch] flex-flash -> origin/flex-flash 2025-08-14T21:21:50.0484175Z * [new branch] flex-lowering -> origin/flex-lowering 2025-08-14T21:21:50.0485980Z * [new branch] flex-warning -> origin/flex-warning 2025-08-14T21:21:50.0486374Z * [new branch] flex_attention_functorch_grad -> origin/flex_attention_functorch_grad 2025-08-14T21:21:50.0487780Z * [new branch] flex_flash -> origin/flex_flash 2025-08-14T21:21:50.0489508Z * [new branch] fmassa/fix_memeff_sharding_rule -> origin/fmassa/fix_memeff_sharding_rule 2025-08-14T21:21:50.0489969Z * [new branch] fmassa/try_fix_ac_tag_propagation -> origin/fmassa/try_fix_ac_tag_propagation 2025-08-14T21:21:50.0493183Z * [new branch] fsdp2_trace_rules -> origin/fsdp2_trace_rules 2025-08-14T21:21:50.0493751Z * [new branch] fsdpv2_3d -> origin/fsdpv2_3d 2025-08-14T21:21:50.0494207Z * [new branch] fsdpv2_3d_m1 -> origin/fsdpv2_3d_m1 2025-08-14T21:21:50.0494943Z * [new branch] fx_cpp -> origin/fx_cpp 2025-08-14T21:21:50.0495357Z * [new branch] fy/fix-win -> origin/fy/fix-win 2025-08-14T21:21:50.0503329Z * [new branch] gh/AlnisM/1/base -> origin/gh/AlnisM/1/base 2025-08-14T21:21:50.0508627Z * [new branch] gh/AlnisM/1/head -> origin/gh/AlnisM/1/head 2025-08-14T21:21:50.0510331Z * [new branch] gh/CaoE/2/base -> origin/gh/CaoE/2/base 2025-08-14T21:21:50.0510688Z * [new branch] gh/CaoE/2/head -> origin/gh/CaoE/2/head 2025-08-14T21:21:50.0511036Z * [new branch] gh/CaoE/2/orig -> origin/gh/CaoE/2/orig 2025-08-14T21:21:50.0511409Z * [new branch] gh/ColinPeppler/72/base -> origin/gh/ColinPeppler/72/base 2025-08-14T21:21:50.0511802Z * [new branch] gh/ColinPeppler/72/head -> origin/gh/ColinPeppler/72/head 2025-08-14T21:21:50.0512174Z * [new branch] gh/ColinPeppler/72/orig -> origin/gh/ColinPeppler/72/orig 
origin/gh/ankitageorge/18/base 2025-08-14T21:21:50.1200876Z * [new branch] gh/ankitageorge/18/head -> origin/gh/ankitageorge/18/head 2025-08-14T21:21:50.1201108Z * [new branch] gh/ankitageorge/18/orig -> origin/gh/ankitageorge/18/orig 2025-08-14T21:21:50.1204401Z * [new branch] gh/ankitageorge/19/base -> origin/gh/ankitageorge/19/base 2025-08-14T21:21:50.1204604Z * [new branch] gh/ankitageorge/19/head -> origin/gh/ankitageorge/19/head 2025-08-14T21:21:50.1204777Z * [new branch] gh/ankitageorge/19/orig -> origin/gh/ankitageorge/19/orig 2025-08-14T21:21:50.1206491Z * [new branch] gh/ankitageorge/20/base -> origin/gh/ankitageorge/20/base 2025-08-14T21:21:50.1207261Z * [new branch] gh/ankitageorge/20/head -> origin/gh/ankitageorge/20/head 2025-08-14T21:21:50.1207711Z * [new branch] gh/ankitageorge/20/orig -> origin/gh/ankitageorge/20/orig 2025-08-14T21:21:50.1207879Z * [new branch] gh/ankitageorge/21/base -> origin/gh/ankitageorge/21/base 2025-08-14T21:21:50.1208037Z * [new branch] gh/ankitageorge/21/head -> origin/gh/ankitageorge/21/head 2025-08-14T21:21:50.1209079Z * [new branch] gh/ankitageorge/21/orig -> origin/gh/ankitageorge/21/orig 2025-08-14T21:21:50.1210286Z * [new branch] gh/anshul-si/1/base -> origin/gh/anshul-si/1/base 2025-08-14T21:21:50.1210695Z * [new branch] gh/anshul-si/1/head -> origin/gh/anshul-si/1/head 2025-08-14T21:21:50.1212831Z * [new branch] gh/anshul-si/10/base -> origin/gh/anshul-si/10/base 2025-08-14T21:21:50.1213318Z * [new branch] gh/anshul-si/10/head -> origin/gh/anshul-si/10/head 2025-08-14T21:21:50.1213514Z * [new branch] gh/anshul-si/10/orig -> origin/gh/anshul-si/10/orig 2025-08-14T21:21:50.1215398Z * [new branch] gh/anshul-si/11/base -> origin/gh/anshul-si/11/base 2025-08-14T21:21:50.1216524Z * [new branch] gh/anshul-si/11/head -> origin/gh/anshul-si/11/head 2025-08-14T21:21:50.1216808Z * [new branch] gh/anshul-si/11/orig -> origin/gh/anshul-si/11/orig 2025-08-14T21:21:50.1216986Z * [new branch] gh/anshul-si/12/base -> origin/gh/anshul-si/12/base 2025-08-14T21:21:50.1218304Z * [new branch] gh/anshul-si/12/head -> origin/gh/anshul-si/12/head 2025-08-14T21:21:50.1218556Z * [new branch] gh/anshul-si/12/orig -> origin/gh/anshul-si/12/orig 2025-08-14T21:21:50.1221672Z * [new branch] gh/anshul-si/13/base -> origin/gh/anshul-si/13/base 2025-08-14T21:21:50.1221862Z * [new branch] gh/anshul-si/13/head -> origin/gh/anshul-si/13/head 2025-08-14T21:21:50.1222045Z * [new branch] gh/anshul-si/13/orig -> origin/gh/anshul-si/13/orig 2025-08-14T21:21:50.1222571Z * [new branch] gh/anshul-si/14/base -> origin/gh/anshul-si/14/base 2025-08-14T21:21:50.1222727Z * [new branch] gh/anshul-si/14/head -> origin/gh/anshul-si/14/head 2025-08-14T21:21:50.1224213Z * [new branch] gh/anshul-si/14/orig -> origin/gh/anshul-si/14/orig 2025-08-14T21:21:50.1225088Z * [new branch] gh/anshul-si/15/base -> origin/gh/anshul-si/15/base 2025-08-14T21:21:50.1225508Z * [new branch] gh/anshul-si/15/head -> origin/gh/anshul-si/15/head 2025-08-14T21:21:50.1225768Z * [new branch] gh/anshul-si/15/orig -> origin/gh/anshul-si/15/orig 2025-08-14T21:21:50.1229213Z * [new branch] gh/anshul-si/16/base -> origin/gh/anshul-si/16/base 2025-08-14T21:21:50.1229493Z * [new branch] gh/anshul-si/16/head -> origin/gh/anshul-si/16/head 2025-08-14T21:21:50.1229922Z * [new branch] gh/anshul-si/16/orig -> origin/gh/anshul-si/16/orig 2025-08-14T21:21:50.1230225Z * [new branch] gh/anshul-si/17/base -> origin/gh/anshul-si/17/base 2025-08-14T21:21:50.1230407Z * [new branch] gh/anshul-si/17/head -> origin/gh/anshul-si/17/head 
2025-08-14T21:21:50.1230638Z * [new branch] gh/anshul-si/17/orig -> origin/gh/anshul-si/17/orig 2025-08-14T21:21:50.1236112Z * [new branch] gh/anshul-si/18/base -> origin/gh/anshul-si/18/base 2025-08-14T21:21:50.1236737Z * [new branch] gh/anshul-si/18/head -> origin/gh/anshul-si/18/head 2025-08-14T21:21:50.1236927Z * [new branch] gh/anshul-si/18/orig -> origin/gh/anshul-si/18/orig 2025-08-14T21:21:50.1237109Z * [new branch] gh/anshul-si/19/base -> origin/gh/anshul-si/19/base 2025-08-14T21:21:50.1237247Z * [new branch] gh/anshul-si/19/head -> origin/gh/anshul-si/19/head 2025-08-14T21:21:50.1237562Z * [new branch] gh/anshul-si/19/orig -> origin/gh/anshul-si/19/orig 2025-08-14T21:21:50.1237752Z * [new branch] gh/anshul-si/2/base -> origin/gh/anshul-si/2/base 2025-08-14T21:21:50.1238202Z * [new branch] gh/anshul-si/2/head -> origin/gh/anshul-si/2/head 2025-08-14T21:21:50.1238842Z * [new branch] gh/anshul-si/20/base -> origin/gh/anshul-si/20/base 2025-08-14T21:21:50.1239465Z * [new branch] gh/anshul-si/20/head -> origin/gh/anshul-si/20/head 2025-08-14T21:21:50.1239678Z * [new branch] gh/anshul-si/20/orig -> origin/gh/anshul-si/20/orig 2025-08-14T21:21:50.1243505Z * [new branch] gh/anshul-si/21/base -> origin/gh/anshul-si/21/base 2025-08-14T21:21:50.1243887Z * [new branch] gh/anshul-si/21/head -> origin/gh/anshul-si/21/head 2025-08-14T21:21:50.1244070Z * [new branch] gh/anshul-si/21/orig -> origin/gh/anshul-si/21/orig 2025-08-14T21:21:50.1244229Z * [new branch] gh/anshul-si/22/base -> origin/gh/anshul-si/22/base 2025-08-14T21:21:50.1244377Z * [new branch] gh/anshul-si/22/head -> origin/gh/anshul-si/22/head 2025-08-14T21:21:50.1244515Z * [new branch] gh/anshul-si/22/orig -> origin/gh/anshul-si/22/orig 2025-08-14T21:21:50.1244652Z * [new branch] gh/anshul-si/23/base -> origin/gh/anshul-si/23/base 2025-08-14T21:21:50.1244805Z * [new branch] gh/anshul-si/23/head -> origin/gh/anshul-si/23/head 2025-08-14T21:21:50.1244942Z * [new branch] gh/anshul-si/23/orig -> origin/gh/anshul-si/23/orig 2025-08-14T21:21:50.1246021Z * [new branch] gh/anshul-si/24/base -> origin/gh/anshul-si/24/base 2025-08-14T21:21:50.1246426Z * [new branch] gh/anshul-si/24/head -> origin/gh/anshul-si/24/head 2025-08-14T21:21:50.1247408Z * [new branch] gh/anshul-si/24/orig -> origin/gh/anshul-si/24/orig 2025-08-14T21:21:50.1249211Z * [new branch] gh/anshul-si/25/base -> origin/gh/anshul-si/25/base 2025-08-14T21:21:50.1249378Z * [new branch] gh/anshul-si/25/head -> origin/gh/anshul-si/25/head 2025-08-14T21:21:50.1249709Z * [new branch] gh/anshul-si/25/orig -> origin/gh/anshul-si/25/orig 2025-08-14T21:21:50.1251900Z * [new branch] gh/anshul-si/26/base -> origin/gh/anshul-si/26/base 2025-08-14T21:21:50.1252222Z * [new branch] gh/anshul-si/26/head -> origin/gh/anshul-si/26/head 2025-08-14T21:21:50.1252625Z * [new branch] gh/anshul-si/26/orig -> origin/gh/anshul-si/26/orig 2025-08-14T21:21:50.1253690Z * [new branch] gh/anshul-si/27/base -> origin/gh/anshul-si/27/base 2025-08-14T21:21:50.1255109Z * [new branch] gh/anshul-si/27/head -> origin/gh/anshul-si/27/head 2025-08-14T21:21:50.1255316Z * [new branch] gh/anshul-si/27/orig -> origin/gh/anshul-si/27/orig 2025-08-14T21:21:50.1255825Z * [new branch] gh/anshul-si/3/base -> origin/gh/anshul-si/3/base 2025-08-14T21:21:50.1256315Z * [new branch] gh/anshul-si/3/head -> origin/gh/anshul-si/3/head 2025-08-14T21:21:50.1257440Z * [new branch] gh/anshul-si/4/base -> origin/gh/anshul-si/4/base 2025-08-14T21:21:50.1257791Z * [new branch] gh/anshul-si/4/head -> origin/gh/anshul-si/4/head 
2025-08-14T21:21:50.1258958Z * [new branch] gh/anshul-si/5/base -> origin/gh/anshul-si/5/base 2025-08-14T21:21:50.1259657Z * [new branch] gh/anshul-si/5/head -> origin/gh/anshul-si/5/head 2025-08-14T21:21:50.1261050Z * [new branch] gh/anshul-si/6/base -> origin/gh/anshul-si/6/base 2025-08-14T21:21:50.1261223Z * [new branch] gh/anshul-si/6/head -> origin/gh/anshul-si/6/head 2025-08-14T21:21:50.1261904Z * [new branch] gh/anshul-si/6/orig -> origin/gh/anshul-si/6/orig 2025-08-14T21:21:50.1263769Z * [new branch] gh/anshul-si/7/base -> origin/gh/anshul-si/7/base 2025-08-14T21:21:50.1264145Z * [new branch] gh/anshul-si/7/head -> origin/gh/anshul-si/7/head 2025-08-14T21:21:50.1264418Z * [new branch] gh/anshul-si/7/orig -> origin/gh/anshul-si/7/orig 2025-08-14T21:21:50.1266472Z * [new branch] gh/anshul-si/8/base -> origin/gh/anshul-si/8/base 2025-08-14T21:21:50.1266826Z * [new branch] gh/anshul-si/8/head -> origin/gh/anshul-si/8/head 2025-08-14T21:21:50.1266997Z * [new branch] gh/anshul-si/8/orig -> origin/gh/anshul-si/8/orig 2025-08-14T21:21:50.1269574Z * [new branch] gh/anshul-si/9/base -> origin/gh/anshul-si/9/base 2025-08-14T21:21:50.1269917Z * [new branch] gh/anshul-si/9/head -> origin/gh/anshul-si/9/head 2025-08-14T21:21:50.1270199Z * [new branch] gh/anshul-si/9/orig -> origin/gh/anshul-si/9/orig 2025-08-14T21:21:50.1275128Z * [new branch] gh/aorenste/132/base -> origin/gh/aorenste/132/base 2025-08-14T21:21:50.1275472Z * [new branch] gh/aorenste/132/head -> origin/gh/aorenste/132/head 2025-08-14T21:21:50.1275687Z * [new branch] gh/aorenste/235/base -> origin/gh/aorenste/235/base 2025-08-14T21:21:50.1275939Z * [new branch] gh/aorenste/235/head -> origin/gh/aorenste/235/head 2025-08-14T21:21:50.1276112Z * [new branch] gh/aorenste/235/orig -> origin/gh/aorenste/235/orig 2025-08-14T21:21:50.1276349Z * [new branch] gh/aorenste/236/base -> origin/gh/aorenste/236/base 2025-08-14T21:21:50.1276524Z * [new branch] gh/aorenste/236/head -> origin/gh/aorenste/236/head 2025-08-14T21:21:50.1278248Z * [new branch] gh/aorenste/236/orig -> origin/gh/aorenste/236/orig 2025-08-14T21:21:50.1278618Z * [new branch] gh/aorenste/237/base -> origin/gh/aorenste/237/base 2025-08-14T21:21:50.1279105Z * [new branch] gh/aorenste/237/head -> origin/gh/aorenste/237/head 2025-08-14T21:21:50.1280485Z * [new branch] gh/aorenste/237/orig -> origin/gh/aorenste/237/orig 2025-08-14T21:21:50.1280835Z * [new branch] gh/aorenste/238/base -> origin/gh/aorenste/238/base 2025-08-14T21:21:50.1284648Z * [new branch] gh/aorenste/238/head -> origin/gh/aorenste/238/head 2025-08-14T21:21:50.1284836Z * [new branch] gh/aorenste/238/orig -> origin/gh/aorenste/238/orig 2025-08-14T21:21:50.1285000Z * [new branch] gh/bdhirsh/650/base -> origin/gh/bdhirsh/650/base 2025-08-14T21:21:50.1285151Z * [new branch] gh/bdhirsh/650/head -> origin/gh/bdhirsh/650/head 2025-08-14T21:21:50.1289516Z * [new branch] gh/bdhirsh/650/orig -> origin/gh/bdhirsh/650/orig 2025-08-14T21:21:50.1289898Z * [new branch] gh/bdhirsh/656/base -> origin/gh/bdhirsh/656/base 2025-08-14T21:21:50.1290074Z * [new branch] gh/bdhirsh/656/head -> origin/gh/bdhirsh/656/head 2025-08-14T21:21:50.1290218Z * [new branch] gh/bdhirsh/657/base -> origin/gh/bdhirsh/657/base 2025-08-14T21:21:50.1290366Z * [new branch] gh/bdhirsh/657/head -> origin/gh/bdhirsh/657/head 2025-08-14T21:21:50.1290518Z * [new branch] gh/bdhirsh/659/base -> origin/gh/bdhirsh/659/base 2025-08-14T21:21:50.1290665Z * [new branch] gh/bdhirsh/659/head -> origin/gh/bdhirsh/659/head 2025-08-14T21:21:50.1290957Z * [new branch] 
gh/bdhirsh/659/orig -> origin/gh/bdhirsh/659/orig 2025-08-14T21:21:50.1291447Z * [new branch] gh/bdhirsh/663/base -> origin/gh/bdhirsh/663/base 2025-08-14T21:21:50.1292770Z * [new branch] gh/bdhirsh/663/head -> origin/gh/bdhirsh/663/head 2025-08-14T21:21:50.1292969Z * [new branch] gh/bdhirsh/663/orig -> origin/gh/bdhirsh/663/orig 2025-08-14T21:21:50.1297022Z * [new branch] gh/bdhirsh/665/base -> origin/gh/bdhirsh/665/base 2025-08-14T21:21:50.1297205Z * [new branch] gh/bdhirsh/665/head -> origin/gh/bdhirsh/665/head 2025-08-14T21:21:50.1297352Z * [new branch] gh/bdhirsh/665/orig -> origin/gh/bdhirsh/665/orig 2025-08-14T21:21:50.1297515Z * [new branch] gh/bdhirsh/666/base -> origin/gh/bdhirsh/666/base 2025-08-14T21:21:50.1297653Z * [new branch] gh/bdhirsh/666/head -> origin/gh/bdhirsh/666/head 2025-08-14T21:21:50.1297839Z * [new branch] gh/bdhirsh/666/orig -> origin/gh/bdhirsh/666/orig 2025-08-14T21:21:50.1300229Z * [new branch] gh/benjaminglass1/79/base -> origin/gh/benjaminglass1/79/base 2025-08-14T21:21:50.1300595Z * [new branch] gh/benjaminglass1/79/head -> origin/gh/benjaminglass1/79/head 2025-08-14T21:21:50.1300836Z * [new branch] gh/benjaminglass1/79/orig -> origin/gh/benjaminglass1/79/orig 2025-08-14T21:21:50.1301155Z * [new branch] gh/benjaminglass1/86/base -> origin/gh/benjaminglass1/86/base 2025-08-14T21:21:50.1302154Z * [new branch] gh/benjaminglass1/86/head -> origin/gh/benjaminglass1/86/head 2025-08-14T21:21:50.1306654Z * [new branch] gh/benjaminglass1/86/orig -> origin/gh/benjaminglass1/86/orig 2025-08-14T21:21:50.1306866Z * [new branch] gh/benjaminglass1/89/base -> origin/gh/benjaminglass1/89/base 2025-08-14T21:21:50.1307037Z * [new branch] gh/benjaminglass1/89/head -> origin/gh/benjaminglass1/89/head 2025-08-14T21:21:50.1307204Z * [new branch] gh/benjaminglass1/89/orig -> origin/gh/benjaminglass1/89/orig 2025-08-14T21:21:50.1307400Z * [new branch] gh/benjaminglass1/91/base -> origin/gh/benjaminglass1/91/base 2025-08-14T21:21:50.1307571Z * [new branch] gh/benjaminglass1/91/head -> origin/gh/benjaminglass1/91/head 2025-08-14T21:21:50.1307916Z * [new branch] gh/benjaminglass1/91/orig -> origin/gh/benjaminglass1/91/orig 2025-08-14T21:21:50.1311151Z * [new branch] gh/benjaminglass1/93/base -> origin/gh/benjaminglass1/93/base 2025-08-14T21:21:50.1311540Z * [new branch] gh/benjaminglass1/93/head -> origin/gh/benjaminglass1/93/head 2025-08-14T21:21:50.1311938Z * [new branch] gh/benjaminglass1/93/orig -> origin/gh/benjaminglass1/93/orig 2025-08-14T21:21:50.1312120Z * [new branch] gh/benjaminglass1/94/base -> origin/gh/benjaminglass1/94/base 2025-08-14T21:21:50.1317092Z * [new branch] gh/benjaminglass1/94/head -> origin/gh/benjaminglass1/94/head 2025-08-14T21:21:50.1317300Z * [new branch] gh/benjaminglass1/94/orig -> origin/gh/benjaminglass1/94/orig 2025-08-14T21:21:50.1317682Z * [new branch] gh/benjaminglass1/95/base -> origin/gh/benjaminglass1/95/base 2025-08-14T21:21:50.1317861Z * [new branch] gh/benjaminglass1/95/head -> origin/gh/benjaminglass1/95/head 2025-08-14T21:21:50.1318018Z * [new branch] gh/benjaminglass1/95/orig -> origin/gh/benjaminglass1/95/orig 2025-08-14T21:21:50.1318175Z * [new branch] gh/benjaminglass1/96/base -> origin/gh/benjaminglass1/96/base 2025-08-14T21:21:50.1318339Z * [new branch] gh/benjaminglass1/96/head -> origin/gh/benjaminglass1/96/head 2025-08-14T21:21:50.1318496Z * [new branch] gh/benjaminglass1/96/orig -> origin/gh/benjaminglass1/96/orig 2025-08-14T21:21:50.1318653Z * [new branch] gh/benjaminglass1/97/base -> origin/gh/benjaminglass1/97/base 
2025-08-14T21:21:50.1318818Z * [new branch] gh/benjaminglass1/97/head -> origin/gh/benjaminglass1/97/head 2025-08-14T21:21:50.1319068Z * [new branch] gh/benjaminglass1/97/orig -> origin/gh/benjaminglass1/97/orig 2025-08-14T21:21:50.1319685Z * [new branch] gh/benjaminglass1/98/base -> origin/gh/benjaminglass1/98/base 2025-08-14T21:21:50.1320253Z * [new branch] gh/benjaminglass1/98/head -> origin/gh/benjaminglass1/98/head 2025-08-14T21:21:50.1321107Z * [new branch] gh/benjaminglass1/98/orig -> origin/gh/benjaminglass1/98/orig 2025-08-14T21:21:50.1323177Z * [new branch] gh/bobrenjc93/478/base -> origin/gh/bobrenjc93/478/base 2025-08-14T21:21:50.1323340Z * [new branch] gh/bobrenjc93/478/head -> origin/gh/bobrenjc93/478/head 2025-08-14T21:21:50.1323508Z * [new branch] gh/bobrenjc93/478/orig -> origin/gh/bobrenjc93/478/orig 2025-08-14T21:21:50.1325925Z * [new branch] gh/bobrenjc93/514/base -> origin/gh/bobrenjc93/514/base 2025-08-14T21:21:50.1326090Z * [new branch] gh/bobrenjc93/514/head -> origin/gh/bobrenjc93/514/head 2025-08-14T21:21:50.1326257Z * [new branch] gh/bobrenjc93/514/orig -> origin/gh/bobrenjc93/514/orig 2025-08-14T21:21:50.1327372Z * [new branch] gh/bobrenjc93/521/base -> origin/gh/bobrenjc93/521/base 2025-08-14T21:21:50.1327850Z * [new branch] gh/bobrenjc93/521/head -> origin/gh/bobrenjc93/521/head 2025-08-14T21:21:50.1328438Z * [new branch] gh/bobrenjc93/521/orig -> origin/gh/bobrenjc93/521/orig 2025-08-14T21:21:50.1333924Z * [new branch] gh/bobrenjc93/522/base -> origin/gh/bobrenjc93/522/base 2025-08-14T21:21:50.1334455Z * [new branch] gh/bobrenjc93/522/head -> origin/gh/bobrenjc93/522/head 2025-08-14T21:21:50.1334675Z * [new branch] gh/bobrenjc93/522/orig -> origin/gh/bobrenjc93/522/orig 2025-08-14T21:21:50.1334839Z * [new branch] gh/bobrenjc93/525/base -> origin/gh/bobrenjc93/525/base 2025-08-14T21:21:50.1335010Z * [new branch] gh/bobrenjc93/525/head -> origin/gh/bobrenjc93/525/head 2025-08-14T21:21:50.1335176Z * [new branch] gh/bobrenjc93/525/orig -> origin/gh/bobrenjc93/525/orig 2025-08-14T21:21:50.1335497Z * [new branch] gh/bobrenjc93/526/base -> origin/gh/bobrenjc93/526/base 2025-08-14T21:21:50.1335721Z * [new branch] gh/bobrenjc93/526/head -> origin/gh/bobrenjc93/526/head 2025-08-14T21:21:50.1336018Z * [new branch] gh/bobrenjc93/526/orig -> origin/gh/bobrenjc93/526/orig 2025-08-14T21:21:50.1338611Z * [new branch] gh/bobrenjc93/527/base -> origin/gh/bobrenjc93/527/base 2025-08-14T21:21:50.1338791Z * [new branch] gh/bobrenjc93/527/head -> origin/gh/bobrenjc93/527/head 2025-08-14T21:21:50.1338974Z * [new branch] gh/bobrenjc93/527/orig -> origin/gh/bobrenjc93/527/orig 2025-08-14T21:21:50.1339497Z * [new branch] gh/bobrenjc93/528/base -> origin/gh/bobrenjc93/528/base 2025-08-14T21:21:50.1340036Z * [new branch] gh/bobrenjc93/528/head -> origin/gh/bobrenjc93/528/head 2025-08-14T21:21:50.1340361Z * [new branch] gh/bobrenjc93/528/orig -> origin/gh/bobrenjc93/528/orig 2025-08-14T21:21:50.1346224Z * [new branch] gh/bobrenjc93/529/base -> origin/gh/bobrenjc93/529/base 2025-08-14T21:21:50.1346629Z * [new branch] gh/bobrenjc93/529/head -> origin/gh/bobrenjc93/529/head 2025-08-14T21:21:50.1346785Z * [new branch] gh/bobrenjc93/529/orig -> origin/gh/bobrenjc93/529/orig 2025-08-14T21:21:50.1346937Z * [new branch] gh/bobrenjc93/534/base -> origin/gh/bobrenjc93/534/base 2025-08-14T21:21:50.1347103Z * [new branch] gh/bobrenjc93/534/head -> origin/gh/bobrenjc93/534/head 2025-08-14T21:21:50.1347252Z * [new branch] gh/bobrenjc93/534/orig -> origin/gh/bobrenjc93/534/orig 
2025-08-14T21:21:50.1347567Z * [new branch] gh/bobrenjc93/535/base -> origin/gh/bobrenjc93/535/base 2025-08-14T21:21:50.1347721Z * [new branch] gh/bobrenjc93/535/head -> origin/gh/bobrenjc93/535/head 2025-08-14T21:21:50.1347874Z * [new branch] gh/bobrenjc93/535/orig -> origin/gh/bobrenjc93/535/orig 2025-08-14T21:21:50.1348771Z * [new branch] gh/bobrenjc93/536/base -> origin/gh/bobrenjc93/536/base 2025-08-14T21:21:50.1349408Z * [new branch] gh/bobrenjc93/536/head -> origin/gh/bobrenjc93/536/head 2025-08-14T21:21:50.1349606Z * [new branch] gh/bobrenjc93/536/orig -> origin/gh/bobrenjc93/536/orig 2025-08-14T21:21:50.1352789Z * [new branch] gh/bobrenjc93/537/base -> origin/gh/bobrenjc93/537/base 2025-08-14T21:21:50.1353395Z * [new branch] gh/bobrenjc93/537/head -> origin/gh/bobrenjc93/537/head 2025-08-14T21:21:50.1353552Z * [new branch] gh/bobrenjc93/537/orig -> origin/gh/bobrenjc93/537/orig 2025-08-14T21:21:50.1353733Z * [new branch] gh/bobrenjc93/538/base -> origin/gh/bobrenjc93/538/base 2025-08-14T21:21:50.1353906Z * [new branch] gh/bobrenjc93/538/head -> origin/gh/bobrenjc93/538/head 2025-08-14T21:21:50.1354069Z * [new branch] gh/bobrenjc93/538/orig -> origin/gh/bobrenjc93/538/orig 2025-08-14T21:21:50.1355536Z * [new branch] gh/bobrenjc93/539/base -> origin/gh/bobrenjc93/539/base 2025-08-14T21:21:50.1360445Z * [new branch] gh/bobrenjc93/539/head -> origin/gh/bobrenjc93/539/head 2025-08-14T21:21:50.1360645Z * [new branch] gh/bobrenjc93/539/orig -> origin/gh/bobrenjc93/539/orig 2025-08-14T21:21:50.1360796Z * [new branch] gh/bobrenjc93/540/base -> origin/gh/bobrenjc93/540/base 2025-08-14T21:21:50.1362049Z * [new branch] gh/bobrenjc93/540/head -> origin/gh/bobrenjc93/540/head 2025-08-14T21:21:50.1362355Z * [new branch] gh/bobrenjc93/540/orig -> origin/gh/bobrenjc93/540/orig 2025-08-14T21:21:50.1362540Z * [new branch] gh/bobrenjc93/541/base -> origin/gh/bobrenjc93/541/base 2025-08-14T21:21:50.1362718Z * [new branch] gh/bobrenjc93/541/head -> origin/gh/bobrenjc93/541/head 2025-08-14T21:21:50.1362893Z * [new branch] gh/bobrenjc93/541/orig -> origin/gh/bobrenjc93/541/orig 2025-08-14T21:21:50.1363057Z * [new branch] gh/bobrenjc93/542/base -> origin/gh/bobrenjc93/542/base 2025-08-14T21:21:50.1363367Z * [new branch] gh/bobrenjc93/542/head -> origin/gh/bobrenjc93/542/head 2025-08-14T21:21:50.1364407Z * [new branch] gh/bobrenjc93/542/orig -> origin/gh/bobrenjc93/542/orig 2025-08-14T21:21:50.1364592Z * [new branch] gh/bobrenjc93/543/base -> origin/gh/bobrenjc93/543/base 2025-08-14T21:21:50.1365183Z * [new branch] gh/bobrenjc93/543/head -> origin/gh/bobrenjc93/543/head 2025-08-14T21:21:50.1366220Z * [new branch] gh/bobrenjc93/543/orig -> origin/gh/bobrenjc93/543/orig 2025-08-14T21:21:50.1369410Z * [new branch] gh/bobrenjc93/544/base -> origin/gh/bobrenjc93/544/base 2025-08-14T21:21:50.1369612Z * [new branch] gh/bobrenjc93/544/head -> origin/gh/bobrenjc93/544/head 2025-08-14T21:21:50.1369787Z * [new branch] gh/bobrenjc93/544/orig -> origin/gh/bobrenjc93/544/orig 2025-08-14T21:21:50.1369948Z * [new branch] gh/bobrenjc93/545/base -> origin/gh/bobrenjc93/545/base 2025-08-14T21:21:50.1373774Z * [new branch] gh/bobrenjc93/545/head -> origin/gh/bobrenjc93/545/head 2025-08-14T21:21:50.1373985Z * [new branch] gh/bobrenjc93/545/orig -> origin/gh/bobrenjc93/545/orig 2025-08-14T21:21:50.1374439Z * [new branch] gh/bobrenjc93/546/base -> origin/gh/bobrenjc93/546/base 2025-08-14T21:21:50.1374634Z * [new branch] gh/bobrenjc93/546/head -> origin/gh/bobrenjc93/546/head 2025-08-14T21:21:50.1374994Z * [new branch] 
gh/bobrenjc93/546/orig -> origin/gh/bobrenjc93/546/orig 2025-08-14T21:21:50.1376401Z * [new branch] gh/bobrenjc93/547/base -> origin/gh/bobrenjc93/547/base 2025-08-14T21:21:50.1376642Z * [new branch] gh/bobrenjc93/547/head -> origin/gh/bobrenjc93/547/head 2025-08-14T21:21:50.1376843Z * [new branch] gh/bobrenjc93/547/orig -> origin/gh/bobrenjc93/547/orig 2025-08-14T21:21:50.1377000Z * [new branch] gh/bobrenjc93/548/base -> origin/gh/bobrenjc93/548/base 2025-08-14T21:21:50.1382110Z * [new branch] gh/bobrenjc93/548/head -> origin/gh/bobrenjc93/548/head 2025-08-14T21:21:50.1382326Z * [new branch] gh/bobrenjc93/548/orig -> origin/gh/bobrenjc93/548/orig 2025-08-14T21:21:50.1382490Z * [new branch] gh/bobrenjc93/549/base -> origin/gh/bobrenjc93/549/base 2025-08-14T21:21:50.1382684Z * [new branch] gh/bobrenjc93/549/head -> origin/gh/bobrenjc93/549/head 2025-08-14T21:21:50.1382893Z * [new branch] gh/bobrenjc93/549/orig -> origin/gh/bobrenjc93/549/orig 2025-08-14T21:21:50.1383080Z * [new branch] gh/briancoutinho/2/base -> origin/gh/briancoutinho/2/base 2025-08-14T21:21:50.1383253Z * [new branch] gh/briancoutinho/2/head -> origin/gh/briancoutinho/2/head 2025-08-14T21:21:50.1383401Z * [new branch] gh/c00w/23/base -> origin/gh/c00w/23/base 2025-08-14T21:21:50.1383546Z * [new branch] gh/c00w/23/head -> origin/gh/c00w/23/head 2025-08-14T21:21:50.1387196Z * [new branch] gh/c00w/38/base -> origin/gh/c00w/38/base 2025-08-14T21:21:50.1387360Z * [new branch] gh/c00w/38/head -> origin/gh/c00w/38/head 2025-08-14T21:21:50.1387511Z * [new branch] gh/c00w/38/orig -> origin/gh/c00w/38/orig 2025-08-14T21:21:50.1387653Z * [new branch] gh/c00w/48/base -> origin/gh/c00w/48/base 2025-08-14T21:21:50.1396064Z * [new branch] gh/c00w/48/head -> origin/gh/c00w/48/head 2025-08-14T21:21:50.1396231Z * [new branch] gh/c00w/48/orig -> origin/gh/c00w/48/orig 2025-08-14T21:21:50.1396362Z * [new branch] gh/c00w/50/base -> origin/gh/c00w/50/base 2025-08-14T21:21:50.1396498Z * [new branch] gh/c00w/50/head -> origin/gh/c00w/50/head 2025-08-14T21:21:50.1396625Z * [new branch] gh/c00w/50/orig -> origin/gh/c00w/50/orig 2025-08-14T21:21:50.1397490Z * [new branch] gh/c00w/51/base -> origin/gh/c00w/51/base 2025-08-14T21:21:50.1397620Z * [new branch] gh/c00w/51/head -> origin/gh/c00w/51/head 2025-08-14T21:21:50.1397743Z * [new branch] gh/c00w/51/orig -> origin/gh/c00w/51/orig 2025-08-14T21:21:50.1397881Z * [new branch] gh/c00w/52/base -> origin/gh/c00w/52/base 2025-08-14T21:21:50.1398181Z * [new branch] gh/c00w/52/head -> origin/gh/c00w/52/head 2025-08-14T21:21:50.1398308Z * [new branch] gh/c00w/52/orig -> origin/gh/c00w/52/orig 2025-08-14T21:21:50.1398443Z * [new branch] gh/c00w/53/base -> origin/gh/c00w/53/base 2025-08-14T21:21:50.1398569Z * [new branch] gh/c00w/53/head -> origin/gh/c00w/53/head 2025-08-14T21:21:50.1398701Z * [new branch] gh/c00w/53/orig -> origin/gh/c00w/53/orig 2025-08-14T21:21:50.1399703Z * [new branch] gh/c00w/54/base -> origin/gh/c00w/54/base 2025-08-14T21:21:50.1400088Z * [new branch] gh/c00w/54/head -> origin/gh/c00w/54/head 2025-08-14T21:21:50.1402519Z * [new branch] gh/c00w/54/orig -> origin/gh/c00w/54/orig 2025-08-14T21:21:50.1403027Z * [new branch] gh/chenmillie/1/base -> origin/gh/chenmillie/1/base 2025-08-14T21:21:50.1403484Z * [new branch] gh/chenmillie/1/head -> origin/gh/chenmillie/1/head 2025-08-14T21:21:50.1403937Z * [new branch] gh/chenmillie/1/orig -> origin/gh/chenmillie/1/orig 2025-08-14T21:21:50.1404759Z * [new branch] gh/clee2000/1/base -> origin/gh/clee2000/1/base 2025-08-14T21:21:50.1405969Z * [new 
branch] gh/clee2000/1/head -> origin/gh/clee2000/1/head 2025-08-14T21:21:50.1406410Z * [new branch] gh/clee2000/1/orig -> origin/gh/clee2000/1/orig 2025-08-14T21:21:50.1407780Z * [new branch] gh/coconutruben/1/base -> origin/gh/coconutruben/1/base 2025-08-14T21:21:50.1408236Z * [new branch] gh/coconutruben/1/head -> origin/gh/coconutruben/1/head 2025-08-14T21:21:50.1414285Z * [new branch] gh/coconutruben/11/base -> origin/gh/coconutruben/11/base 2025-08-14T21:21:50.1414491Z * [new branch] gh/coconutruben/11/head -> origin/gh/coconutruben/11/head 2025-08-14T21:21:50.1415496Z * [new branch] gh/coconutruben/11/orig -> origin/gh/coconutruben/11/orig 2025-08-14T21:21:50.1415658Z * [new branch] gh/coconutruben/12/base -> origin/gh/coconutruben/12/base 2025-08-14T21:21:50.1415811Z * [new branch] gh/coconutruben/12/head -> origin/gh/coconutruben/12/head 2025-08-14T21:21:50.1415970Z * [new branch] gh/coconutruben/12/orig -> origin/gh/coconutruben/12/orig 2025-08-14T21:21:50.1420823Z * [new branch] gh/coconutruben/13/base -> origin/gh/coconutruben/13/base 2025-08-14T21:21:50.1421044Z * [new branch] gh/coconutruben/13/head -> origin/gh/coconutruben/13/head 2025-08-14T21:21:50.1421202Z * [new branch] gh/coconutruben/13/orig -> origin/gh/coconutruben/13/orig 2025-08-14T21:21:50.1421356Z * [new branch] gh/coconutruben/14/base -> origin/gh/coconutruben/14/base 2025-08-14T21:21:50.1421547Z * [new branch] gh/coconutruben/14/head -> origin/gh/coconutruben/14/head 2025-08-14T21:21:50.1421701Z * [new branch] gh/coconutruben/14/orig -> origin/gh/coconutruben/14/orig 2025-08-14T21:21:50.1423969Z * [new branch] gh/coconutruben/15/base -> origin/gh/coconutruben/15/base 2025-08-14T21:21:50.1424133Z * [new branch] gh/coconutruben/15/head -> origin/gh/coconutruben/15/head 2025-08-14T21:21:50.1424520Z * [new branch] gh/coconutruben/15/orig -> origin/gh/coconutruben/15/orig 2025-08-14T21:21:50.1424693Z * [new branch] gh/coconutruben/16/base -> origin/gh/coconutruben/16/base 2025-08-14T21:21:50.1424843Z * [new branch] gh/coconutruben/16/head -> origin/gh/coconutruben/16/head 2025-08-14T21:21:50.1425007Z * [new branch] gh/coconutruben/16/orig -> origin/gh/coconutruben/16/orig 2025-08-14T21:21:50.1425185Z * [new branch] gh/coconutruben/17/base -> origin/gh/coconutruben/17/base 2025-08-14T21:21:50.1430423Z * [new branch] gh/coconutruben/17/head -> origin/gh/coconutruben/17/head 2025-08-14T21:21:50.1430638Z * [new branch] gh/coconutruben/17/orig -> origin/gh/coconutruben/17/orig 2025-08-14T21:21:50.1430793Z * [new branch] gh/coconutruben/18/base -> origin/gh/coconutruben/18/base 2025-08-14T21:21:50.1430949Z * [new branch] gh/coconutruben/18/head -> origin/gh/coconutruben/18/head 2025-08-14T21:21:50.1431163Z * [new branch] gh/coconutruben/18/orig -> origin/gh/coconutruben/18/orig 2025-08-14T21:21:50.1434719Z * [new branch] gh/coconutruben/19/base -> origin/gh/coconutruben/19/base 2025-08-14T21:21:50.1435330Z * [new branch] gh/coconutruben/19/head -> origin/gh/coconutruben/19/head 2025-08-14T21:21:50.1435519Z * [new branch] gh/coconutruben/19/orig -> origin/gh/coconutruben/19/orig 2025-08-14T21:21:50.1435885Z * [new branch] gh/coconutruben/20/base -> origin/gh/coconutruben/20/base 2025-08-14T21:21:50.1436047Z * [new branch] gh/coconutruben/20/head -> origin/gh/coconutruben/20/head 2025-08-14T21:21:50.1436197Z * [new branch] gh/coconutruben/20/orig -> origin/gh/coconutruben/20/orig 2025-08-14T21:21:50.1439445Z * [new branch] gh/coconutruben/21/base -> origin/gh/coconutruben/21/base 2025-08-14T21:21:50.1439632Z * [new branch] 
gh/coconutruben/21/head -> origin/gh/coconutruben/21/head 2025-08-14T21:21:50.1439796Z * [new branch] gh/coconutruben/21/orig -> origin/gh/coconutruben/21/orig 2025-08-14T21:21:50.1439968Z * [new branch] gh/coconutruben/22/base -> origin/gh/coconutruben/22/base 2025-08-14T21:21:50.1440122Z * [new branch] gh/coconutruben/22/head -> origin/gh/coconutruben/22/head 2025-08-14T21:21:50.1443803Z * [new branch] gh/coconutruben/22/orig -> origin/gh/coconutruben/22/orig 2025-08-14T21:21:50.1447773Z * [new branch] gh/coconutruben/23/base -> origin/gh/coconutruben/23/base 2025-08-14T21:21:50.1448025Z * [new branch] gh/coconutruben/23/head -> origin/gh/coconutruben/23/head 2025-08-14T21:21:50.1448197Z * [new branch] gh/coconutruben/23/orig -> origin/gh/coconutruben/23/orig 2025-08-14T21:21:50.1448359Z * [new branch] gh/coconutruben/24/base -> origin/gh/coconutruben/24/base 2025-08-14T21:21:50.1448525Z * [new branch] gh/coconutruben/24/head -> origin/gh/coconutruben/24/head 2025-08-14T21:21:50.1448903Z * [new branch] gh/coconutruben/24/orig -> origin/gh/coconutruben/24/orig 2025-08-14T21:21:50.1449077Z * [new branch] gh/coconutruben/25/base -> origin/gh/coconutruben/25/base 2025-08-14T21:21:50.1449279Z * [new branch] gh/coconutruben/25/head -> origin/gh/coconutruben/25/head 2025-08-14T21:21:50.1449458Z * [new branch] gh/coconutruben/25/orig -> origin/gh/coconutruben/25/orig 2025-08-14T21:21:50.1449628Z * [new branch] gh/coconutruben/26/base -> origin/gh/coconutruben/26/base 2025-08-14T21:21:50.1449787Z * [new branch] gh/coconutruben/26/head -> origin/gh/coconutruben/26/head 2025-08-14T21:21:50.1455942Z * [new branch] gh/coconutruben/26/orig -> origin/gh/coconutruben/26/orig 2025-08-14T21:21:50.1457782Z * [new branch] gh/coconutruben/27/base -> origin/gh/coconutruben/27/base 2025-08-14T21:21:50.1458075Z * [new branch] gh/coconutruben/27/head -> origin/gh/coconutruben/27/head 2025-08-14T21:21:50.1461933Z * [new branch] gh/coconutruben/27/orig -> origin/gh/coconutruben/27/orig 2025-08-14T21:21:50.1462143Z * [new branch] gh/codingwithsurya/10/base -> origin/gh/codingwithsurya/10/base 2025-08-14T21:21:50.1462321Z * [new branch] gh/codingwithsurya/10/head -> origin/gh/codingwithsurya/10/head 2025-08-14T21:21:50.1462649Z * [new branch] gh/codingwithsurya/10/orig -> origin/gh/codingwithsurya/10/orig 2025-08-14T21:21:50.1462827Z * [new branch] gh/codingwithsurya/11/base -> origin/gh/codingwithsurya/11/base 2025-08-14T21:21:50.1463006Z * [new branch] gh/codingwithsurya/11/head -> origin/gh/codingwithsurya/11/head 2025-08-14T21:21:50.1463176Z * [new branch] gh/codingwithsurya/11/orig -> origin/gh/codingwithsurya/11/orig 2025-08-14T21:21:50.1463345Z * [new branch] gh/codingwithsurya/12/base -> origin/gh/codingwithsurya/12/base 2025-08-14T21:21:50.1463514Z * [new branch] gh/codingwithsurya/12/head -> origin/gh/codingwithsurya/12/head 2025-08-14T21:21:50.1463681Z * [new branch] gh/codingwithsurya/12/orig -> origin/gh/codingwithsurya/12/orig 2025-08-14T21:21:50.1463854Z * [new branch] gh/codingwithsurya/13/base -> origin/gh/codingwithsurya/13/base 2025-08-14T21:21:50.1464225Z * [new branch] gh/codingwithsurya/13/head -> origin/gh/codingwithsurya/13/head 2025-08-14T21:21:50.1468804Z * [new branch] gh/codingwithsurya/13/orig -> origin/gh/codingwithsurya/13/orig 2025-08-14T21:21:50.1469546Z * [new branch] gh/codingwithsurya/14/base -> origin/gh/codingwithsurya/14/base 2025-08-14T21:21:50.1469732Z * [new branch] gh/codingwithsurya/14/head -> origin/gh/codingwithsurya/14/head 2025-08-14T21:21:50.1469917Z * [new branch] 
gh/codingwithsurya/14/orig -> origin/gh/codingwithsurya/14/orig 2025-08-14T21:21:50.1470500Z * [new branch] gh/codingwithsurya/15/base -> origin/gh/codingwithsurya/15/base 2025-08-14T21:21:50.1470679Z * [new branch] gh/codingwithsurya/15/head -> origin/gh/codingwithsurya/15/head 2025-08-14T21:21:50.1470844Z * [new branch] gh/codingwithsurya/15/orig -> origin/gh/codingwithsurya/15/orig 2025-08-14T21:21:50.1471031Z * [new branch] gh/codingwithsurya/16/base -> origin/gh/codingwithsurya/16/base 2025-08-14T21:21:50.1471203Z * [new branch] gh/codingwithsurya/16/head -> origin/gh/codingwithsurya/16/head 2025-08-14T21:21:50.1471374Z * [new branch] gh/codingwithsurya/16/orig -> origin/gh/codingwithsurya/16/orig 2025-08-14T21:21:50.1477129Z * [new branch] gh/codingwithsurya/17/base -> origin/gh/codingwithsurya/17/base 2025-08-14T21:21:50.1477343Z * [new branch] gh/codingwithsurya/17/head -> origin/gh/codingwithsurya/17/head 2025-08-14T21:21:50.1477522Z * [new branch] gh/codingwithsurya/17/orig -> origin/gh/codingwithsurya/17/orig 2025-08-14T21:21:50.1477688Z * [new branch] gh/codingwithsurya/18/base -> origin/gh/codingwithsurya/18/base 2025-08-14T21:21:50.1477857Z * [new branch] gh/codingwithsurya/18/head -> origin/gh/codingwithsurya/18/head 2025-08-14T21:21:50.1478051Z * [new branch] gh/codingwithsurya/18/orig -> origin/gh/codingwithsurya/18/orig 2025-08-14T21:21:50.1480255Z * [new branch] gh/codingwithsurya/19/base -> origin/gh/codingwithsurya/19/base 2025-08-14T21:21:50.1480437Z * [new branch] gh/codingwithsurya/19/head -> origin/gh/codingwithsurya/19/head 2025-08-14T21:21:50.1480600Z * [new branch] gh/codingwithsurya/19/orig -> origin/gh/codingwithsurya/19/orig 2025-08-14T21:21:50.1480767Z * [new branch] gh/codingwithsurya/20/base -> origin/gh/codingwithsurya/20/base 2025-08-14T21:21:50.1480945Z * [new branch] gh/codingwithsurya/20/head -> origin/gh/codingwithsurya/20/head 2025-08-14T21:21:50.1481112Z * [new branch] gh/codingwithsurya/20/orig -> origin/gh/codingwithsurya/20/orig 2025-08-14T21:21:50.1485699Z * [new branch] gh/codingwithsurya/21/base -> origin/gh/codingwithsurya/21/base 2025-08-14T21:21:50.1485891Z * [new branch] gh/codingwithsurya/21/head -> origin/gh/codingwithsurya/21/head 2025-08-14T21:21:50.1486312Z * [new branch] gh/codingwithsurya/21/orig -> origin/gh/codingwithsurya/21/orig 2025-08-14T21:21:50.1486514Z * [new branch] gh/codingwithsurya/8/base -> origin/gh/codingwithsurya/8/base 2025-08-14T21:21:50.1486683Z * [new branch] gh/codingwithsurya/8/head -> origin/gh/codingwithsurya/8/head 2025-08-14T21:21:50.1486853Z * [new branch] gh/codingwithsurya/8/orig -> origin/gh/codingwithsurya/8/orig 2025-08-14T21:21:50.1487026Z * [new branch] gh/codingwithsurya/9/base -> origin/gh/codingwithsurya/9/base 2025-08-14T21:21:50.1487197Z * [new branch] gh/codingwithsurya/9/head -> origin/gh/codingwithsurya/9/head 2025-08-14T21:21:50.1487376Z * [new branch] gh/codingwithsurya/9/orig -> origin/gh/codingwithsurya/9/orig 2025-08-14T21:21:50.1488774Z * [new branch] gh/colinchan15/1/base -> origin/gh/colinchan15/1/base 2025-08-14T21:21:50.1489223Z * [new branch] gh/colinchan15/1/head -> origin/gh/colinchan15/1/head 2025-08-14T21:21:50.1494085Z * [new branch] gh/colinchan15/2/base -> origin/gh/colinchan15/2/base 2025-08-14T21:21:50.1494249Z * [new branch] gh/colinchan15/2/head -> origin/gh/colinchan15/2/head 2025-08-14T21:21:50.1494401Z * [new branch] gh/colinchan15/3/base -> origin/gh/colinchan15/3/base 2025-08-14T21:21:50.1494561Z * [new branch] gh/colinchan15/3/head -> 
origin/gh/colinchan15/3/head 2025-08-14T21:21:50.1494762Z * [new branch] gh/colinchan15/4/base -> origin/gh/colinchan15/4/base 2025-08-14T21:21:50.1494914Z * [new branch] gh/colinchan15/4/head -> origin/gh/colinchan15/4/head 2025-08-14T21:21:50.1495074Z * [new branch] gh/colinchan15/5/base -> origin/gh/colinchan15/5/base 2025-08-14T21:21:50.1495226Z * [new branch] gh/colinchan15/5/head -> origin/gh/colinchan15/5/head 2025-08-14T21:21:50.1497894Z * [new branch] gh/colinchan15/6/base -> origin/gh/colinchan15/6/base 2025-08-14T21:21:50.1498080Z * [new branch] gh/colinchan15/6/head -> origin/gh/colinchan15/6/head 2025-08-14T21:21:50.1498274Z * [new branch] gh/davidberard98/351/base -> origin/gh/davidberard98/351/base 2025-08-14T21:21:50.1498451Z * [new branch] gh/davidberard98/351/head -> origin/gh/davidberard98/351/head 2025-08-14T21:21:50.1498618Z * [new branch] gh/davidberard98/351/orig -> origin/gh/davidberard98/351/orig 2025-08-14T21:21:50.1501240Z * [new branch] gh/davidberard98/353/base -> origin/gh/davidberard98/353/base 2025-08-14T21:21:50.1501409Z * [new branch] gh/davidberard98/353/head -> origin/gh/davidberard98/353/head 2025-08-14T21:21:50.1501583Z * [new branch] gh/davidberard98/353/orig -> origin/gh/davidberard98/353/orig 2025-08-14T21:21:50.1501766Z * [new branch] gh/davidberard98/356/base -> origin/gh/davidberard98/356/base 2025-08-14T21:21:50.1506305Z * [new branch] gh/davidberard98/356/head -> origin/gh/davidberard98/356/head 2025-08-14T21:21:50.1506711Z * [new branch] gh/davidberard98/356/orig -> origin/gh/davidberard98/356/orig 2025-08-14T21:21:50.1506972Z * [new branch] gh/davidberard98/382/base -> origin/gh/davidberard98/382/base 2025-08-14T21:21:50.1507173Z * [new branch] gh/davidberard98/382/head -> origin/gh/davidberard98/382/head 2025-08-14T21:21:50.1507360Z * [new branch] gh/davidberard98/382/orig -> origin/gh/davidberard98/382/orig 2025-08-14T21:21:50.1508295Z * [new branch] gh/davidberard98/386/base -> origin/gh/davidberard98/386/base 2025-08-14T21:21:50.1508531Z * [new branch] gh/davidberard98/386/head -> origin/gh/davidberard98/386/head 2025-08-14T21:21:50.1509070Z * [new branch] gh/davidberard98/386/orig -> origin/gh/davidberard98/386/orig 2025-08-14T21:21:50.1509511Z * [new branch] gh/davidberard98/389/base -> origin/gh/davidberard98/389/base 2025-08-14T21:21:50.1509721Z * [new branch] gh/davidberard98/389/head -> origin/gh/davidberard98/389/head 2025-08-14T21:21:50.1517296Z * [new branch] gh/davidberard98/389/orig -> origin/gh/davidberard98/389/orig 2025-08-14T21:21:50.1522460Z * [new branch] gh/davidberard98/390/base -> origin/gh/davidberard98/390/base 2025-08-14T21:21:50.1522660Z * [new branch] gh/davidberard98/390/head -> origin/gh/davidberard98/390/head 2025-08-14T21:21:50.1523165Z * [new branch] gh/davidberard98/390/orig -> origin/gh/davidberard98/390/orig 2025-08-14T21:21:50.1523358Z * [new branch] gh/davidberard98/391/base -> origin/gh/davidberard98/391/base 2025-08-14T21:21:50.1523529Z * [new branch] gh/davidberard98/391/head -> origin/gh/davidberard98/391/head 2025-08-14T21:21:50.1523894Z * [new branch] gh/davidberard98/391/orig -> origin/gh/davidberard98/391/orig 2025-08-14T21:21:50.1524088Z * [new branch] gh/davidberard98/392/base -> origin/gh/davidberard98/392/base 2025-08-14T21:21:50.1524248Z * [new branch] gh/davidberard98/392/head -> origin/gh/davidberard98/392/head 2025-08-14T21:21:50.1524413Z * [new branch] gh/davidberard98/392/orig -> origin/gh/davidberard98/392/orig 2025-08-14T21:21:50.1524610Z * [new branch] gh/davidberard98/393/base -> 
origin/gh/davidberard98/393/base 2025-08-14T21:21:50.1524775Z * [new branch] gh/davidberard98/393/head -> origin/gh/davidberard98/393/head 2025-08-14T21:21:50.1524941Z * [new branch] gh/davidberard98/393/orig -> origin/gh/davidberard98/393/orig 2025-08-14T21:21:50.1525116Z * [new branch] gh/davidberard98/394/base -> origin/gh/davidberard98/394/base 2025-08-14T21:21:50.1525282Z * [new branch] gh/davidberard98/394/head -> origin/gh/davidberard98/394/head 2025-08-14T21:21:50.1525455Z * [new branch] gh/davidberard98/394/orig -> origin/gh/davidberard98/394/orig 2025-08-14T21:21:50.1525611Z * [new branch] gh/davidberard98/395/base -> origin/gh/davidberard98/395/base 2025-08-14T21:21:50.1525779Z * [new branch] gh/davidberard98/395/head -> origin/gh/davidberard98/395/head 2025-08-14T21:21:50.1525962Z * [new branch] gh/davidberard98/395/orig -> origin/gh/davidberard98/395/orig 2025-08-14T21:21:50.1526128Z * [new branch] gh/davidberard98/396/base -> origin/gh/davidberard98/396/base 2025-08-14T21:21:50.1526296Z * [new branch] gh/davidberard98/396/head -> origin/gh/davidberard98/396/head 2025-08-14T21:21:50.1527414Z * [new branch] gh/davidberard98/396/orig -> origin/gh/davidberard98/396/orig 2025-08-14T21:21:50.1527878Z * [new branch] gh/davidberard98/397/base -> origin/gh/davidberard98/397/base 2025-08-14T21:21:50.1529527Z * [new branch] gh/davidberard98/397/head -> origin/gh/davidberard98/397/head 2025-08-14T21:21:50.1529709Z * [new branch] gh/davidberard98/397/orig -> origin/gh/davidberard98/397/orig 2025-08-14T21:21:50.1534387Z * [new branch] gh/davidberard98/398/base -> origin/gh/davidberard98/398/base 2025-08-14T21:21:50.1539426Z * [new branch] gh/davidberard98/398/head -> origin/gh/davidberard98/398/head 2025-08-14T21:21:50.1539641Z * [new branch] gh/davidberard98/398/orig -> origin/gh/davidberard98/398/orig 2025-08-14T21:21:50.1539808Z * [new branch] gh/desertfire/570/base -> origin/gh/desertfire/570/base 2025-08-14T21:21:50.1539965Z * [new branch] gh/desertfire/570/head -> origin/gh/desertfire/570/head 2025-08-14T21:21:50.1540128Z * [new branch] gh/desertfire/570/orig -> origin/gh/desertfire/570/orig 2025-08-14T21:21:50.1540299Z * [new branch] gh/desertfire/572/base -> origin/gh/desertfire/572/base 2025-08-14T21:21:50.1540598Z * [new branch] gh/desertfire/572/head -> origin/gh/desertfire/572/head 2025-08-14T21:21:50.1540756Z * [new branch] gh/desertfire/572/orig -> origin/gh/desertfire/572/orig 2025-08-14T21:21:50.1540909Z * [new branch] gh/desertfire/589/base -> origin/gh/desertfire/589/base 2025-08-14T21:21:50.1541064Z * [new branch] gh/desertfire/589/head -> origin/gh/desertfire/589/head 2025-08-14T21:21:50.1541211Z * [new branch] gh/desertfire/589/orig -> origin/gh/desertfire/589/orig 2025-08-14T21:21:50.1545959Z * [new branch] gh/desertfire/590/base -> origin/gh/desertfire/590/base 2025-08-14T21:21:50.1546152Z * [new branch] gh/desertfire/590/head -> origin/gh/desertfire/590/head 2025-08-14T21:21:50.1546310Z * [new branch] gh/desertfire/590/orig -> origin/gh/desertfire/590/orig 2025-08-14T21:21:50.1546638Z * [new branch] gh/desertfire/591/base -> origin/gh/desertfire/591/base 2025-08-14T21:21:50.1546794Z * [new branch] gh/desertfire/591/head -> origin/gh/desertfire/591/head 2025-08-14T21:21:50.1546957Z * [new branch] gh/desertfire/591/orig -> origin/gh/desertfire/591/orig 2025-08-14T21:21:50.1549370Z * [new branch] gh/desertfire/592/base -> origin/gh/desertfire/592/base 2025-08-14T21:21:50.1549576Z * [new branch] gh/desertfire/592/head -> origin/gh/desertfire/592/head 
2025-08-14T21:21:50.1549737Z * [new branch] gh/desertfire/592/orig -> origin/gh/desertfire/592/orig 2025-08-14T21:21:50.1549901Z * [new branch] gh/desertfire/593/base -> origin/gh/desertfire/593/base 2025-08-14T21:21:50.1550060Z * [new branch] gh/desertfire/593/head -> origin/gh/desertfire/593/head 2025-08-14T21:21:50.1550216Z * [new branch] gh/desertfire/593/orig -> origin/gh/desertfire/593/orig 2025-08-14T21:21:50.1550426Z * [new branch] gh/desertfire/594/base -> origin/gh/desertfire/594/base 2025-08-14T21:21:50.1550580Z * [new branch] gh/desertfire/594/head -> origin/gh/desertfire/594/head 2025-08-14T21:21:50.1555704Z * [new branch] gh/desertfire/594/orig -> origin/gh/desertfire/594/orig 2025-08-14T21:21:50.1555896Z * [new branch] gh/desertfire/595/base -> origin/gh/desertfire/595/base 2025-08-14T21:21:50.1556050Z * [new branch] gh/desertfire/595/head -> origin/gh/desertfire/595/head 2025-08-14T21:21:50.1556222Z * [new branch] gh/desertfire/595/orig -> origin/gh/desertfire/595/orig 2025-08-14T21:21:50.1556384Z * [new branch] gh/desertfire/596/base -> origin/gh/desertfire/596/base 2025-08-14T21:21:50.1556548Z * [new branch] gh/desertfire/596/head -> origin/gh/desertfire/596/head 2025-08-14T21:21:50.1562626Z * [new branch] gh/desertfire/596/orig -> origin/gh/desertfire/596/orig 2025-08-14T21:21:50.1562845Z * [new branch] gh/desertfire/597/base -> origin/gh/desertfire/597/base 2025-08-14T21:21:50.1563013Z * [new branch] gh/desertfire/597/head -> origin/gh/desertfire/597/head 2025-08-14T21:21:50.1563169Z * [new branch] gh/desertfire/597/orig -> origin/gh/desertfire/597/orig 2025-08-14T21:21:50.1563331Z * [new branch] gh/dharakk/1/base -> origin/gh/dharakk/1/base 2025-08-14T21:21:50.1563470Z * [new branch] gh/dharakk/1/head -> origin/gh/dharakk/1/head 2025-08-14T21:21:50.1563609Z * [new branch] gh/dharakk/4/base -> origin/gh/dharakk/4/base 2025-08-14T21:21:50.1563787Z * [new branch] gh/dharakk/4/head -> origin/gh/dharakk/4/head 2025-08-14T21:21:50.1563971Z * [new branch] gh/dharakk/4/orig -> origin/gh/dharakk/4/orig 2025-08-14T21:21:50.1564706Z * [new branch] gh/drisspg/140/base -> origin/gh/drisspg/140/base 2025-08-14T21:21:50.1564907Z * [new branch] gh/drisspg/140/head -> origin/gh/drisspg/140/head 2025-08-14T21:21:50.1565055Z * [new branch] gh/drisspg/140/orig -> origin/gh/drisspg/140/orig 2025-08-14T21:21:50.1565204Z * [new branch] gh/drisspg/149/base -> origin/gh/drisspg/149/base 2025-08-14T21:21:50.1565352Z * [new branch] gh/drisspg/149/head -> origin/gh/drisspg/149/head 2025-08-14T21:21:50.1565495Z * [new branch] gh/drisspg/149/orig -> origin/gh/drisspg/149/orig 2025-08-14T21:21:50.1566676Z * [new branch] gh/drisspg/150/base -> origin/gh/drisspg/150/base 2025-08-14T21:21:50.1566907Z * [new branch] gh/drisspg/150/head -> origin/gh/drisspg/150/head 2025-08-14T21:21:50.1568025Z * [new branch] gh/drisspg/150/orig -> origin/gh/drisspg/150/orig 2025-08-14T21:21:50.1569021Z * [new branch] gh/drisspg/151/base -> origin/gh/drisspg/151/base 2025-08-14T21:21:50.1573752Z * [new branch] gh/drisspg/151/head -> origin/gh/drisspg/151/head 2025-08-14T21:21:50.1574012Z * [new branch] gh/drisspg/151/orig -> origin/gh/drisspg/151/orig 2025-08-14T21:21:50.1574170Z * [new branch] gh/drisspg/158/base -> origin/gh/drisspg/158/base 2025-08-14T21:21:50.1574320Z * [new branch] gh/drisspg/158/head -> origin/gh/drisspg/158/head 2025-08-14T21:21:50.1574480Z * [new branch] gh/drisspg/158/orig -> origin/gh/drisspg/158/orig 2025-08-14T21:21:50.1574633Z * [new branch] gh/drisspg/159/base -> origin/gh/drisspg/159/base 
2025-08-14T21:21:50.1574785Z * [new branch] gh/drisspg/159/head -> origin/gh/drisspg/159/head 2025-08-14T21:21:50.1578800Z * [new branch] gh/drisspg/159/orig -> origin/gh/drisspg/159/orig 2025-08-14T21:21:50.1578968Z * [new branch] gh/drisspg/166/base -> origin/gh/drisspg/166/base 2025-08-14T21:21:50.1579127Z * [new branch] gh/drisspg/166/head -> origin/gh/drisspg/166/head 2025-08-14T21:21:50.1579277Z * [new branch] gh/drisspg/166/orig -> origin/gh/drisspg/166/orig 2025-08-14T21:21:50.1579424Z * [new branch] gh/drisspg/168/base -> origin/gh/drisspg/168/base 2025-08-14T21:21:50.1579574Z * [new branch] gh/drisspg/168/head -> origin/gh/drisspg/168/head 2025-08-14T21:21:50.1579711Z * [new branch] gh/drisspg/168/orig -> origin/gh/drisspg/168/orig 2025-08-14T21:21:50.1579867Z * [new branch] gh/drisspg/169/base -> origin/gh/drisspg/169/base 2025-08-14T21:21:50.1584862Z * [new branch] gh/drisspg/169/head -> origin/gh/drisspg/169/head 2025-08-14T21:21:50.1585058Z * [new branch] gh/drisspg/169/orig -> origin/gh/drisspg/169/orig 2025-08-14T21:21:50.1585288Z * [new branch] gh/drisspg/170/base -> origin/gh/drisspg/170/base 2025-08-14T21:21:50.1585447Z * [new branch] gh/drisspg/170/head -> origin/gh/drisspg/170/head 2025-08-14T21:21:50.1585605Z * [new branch] gh/drisspg/170/orig -> origin/gh/drisspg/170/orig 2025-08-14T21:21:50.1585750Z * [new branch] gh/drisspg/171/base -> origin/gh/drisspg/171/base 2025-08-14T21:21:50.1585900Z * [new branch] gh/drisspg/171/head -> origin/gh/drisspg/171/head 2025-08-14T21:21:50.1586072Z * [new branch] gh/drisspg/171/orig -> origin/gh/drisspg/171/orig 2025-08-14T21:21:50.1588887Z * [new branch] gh/drisspg/172/base -> origin/gh/drisspg/172/base 2025-08-14T21:21:50.1589055Z * [new branch] gh/drisspg/172/head -> origin/gh/drisspg/172/head 2025-08-14T21:21:50.1589217Z * [new branch] gh/drisspg/172/orig -> origin/gh/drisspg/172/orig 2025-08-14T21:21:50.1589553Z * [new branch] gh/drisspg/173/base -> origin/gh/drisspg/173/base 2025-08-14T21:21:50.1589713Z * [new branch] gh/drisspg/173/head -> origin/gh/drisspg/173/head 2025-08-14T21:21:50.1589860Z * [new branch] gh/drisspg/173/orig -> origin/gh/drisspg/173/orig 2025-08-14T21:21:50.1595987Z * [new branch] gh/drisspg/174/base -> origin/gh/drisspg/174/base 2025-08-14T21:21:50.1600437Z * [new branch] gh/drisspg/174/head -> origin/gh/drisspg/174/head 2025-08-14T21:21:50.1601508Z * [new branch] gh/drisspg/174/orig -> origin/gh/drisspg/174/orig 2025-08-14T21:21:50.1601736Z * [new branch] gh/drisspg/175/base -> origin/gh/drisspg/175/base 2025-08-14T21:21:50.1601918Z * [new branch] gh/drisspg/175/head -> origin/gh/drisspg/175/head 2025-08-14T21:21:50.1602229Z * [new branch] gh/drisspg/175/orig -> origin/gh/drisspg/175/orig 2025-08-14T21:21:50.1602388Z * [new branch] gh/drisspg/176/base -> origin/gh/drisspg/176/base 2025-08-14T21:21:50.1602531Z * [new branch] gh/drisspg/176/head -> origin/gh/drisspg/176/head 2025-08-14T21:21:50.1602805Z * [new branch] gh/drisspg/176/orig -> origin/gh/drisspg/176/orig 2025-08-14T21:21:50.1602955Z * [new branch] gh/drisspg/177/base -> origin/gh/drisspg/177/base 2025-08-14T21:21:50.1603094Z * [new branch] gh/drisspg/177/head -> origin/gh/drisspg/177/head 2025-08-14T21:21:50.1603240Z * [new branch] gh/drisspg/177/orig -> origin/gh/drisspg/177/orig 2025-08-14T21:21:50.1603381Z * [new branch] gh/drisspg/178/base -> origin/gh/drisspg/178/base 2025-08-14T21:21:50.1603528Z * [new branch] gh/drisspg/178/head -> origin/gh/drisspg/178/head 2025-08-14T21:21:50.1603676Z * [new branch] gh/drisspg/178/orig -> 
origin/gh/drisspg/178/orig 2025-08-14T21:21:50.1603821Z * [new branch] gh/drisspg/179/base -> origin/gh/drisspg/179/base 2025-08-14T21:21:50.1603969Z * [new branch] gh/drisspg/179/head -> origin/gh/drisspg/179/head 2025-08-14T21:21:50.1604107Z * [new branch] gh/drisspg/179/orig -> origin/gh/drisspg/179/orig 2025-08-14T21:21:50.1604258Z * [new branch] gh/drisspg/180/base -> origin/gh/drisspg/180/base 2025-08-14T21:21:50.1604541Z * [new branch] gh/drisspg/180/head -> origin/gh/drisspg/180/head 2025-08-14T21:21:50.1605341Z * [new branch] gh/drisspg/180/orig -> origin/gh/drisspg/180/orig 2025-08-14T21:21:50.1606373Z * [new branch] gh/drisspg/181/base -> origin/gh/drisspg/181/base 2025-08-14T21:21:50.1606641Z * [new branch] gh/drisspg/181/head -> origin/gh/drisspg/181/head 2025-08-14T21:21:50.1607713Z * [new branch] gh/drisspg/181/orig -> origin/gh/drisspg/181/orig 2025-08-14T21:21:50.1608818Z * [new branch] gh/drisspg/182/base -> origin/gh/drisspg/182/base 2025-08-14T21:21:50.1609345Z * [new branch] gh/drisspg/182/head -> origin/gh/drisspg/182/head 2025-08-14T21:21:50.1610688Z * [new branch] gh/drisspg/183/base -> origin/gh/drisspg/183/base 2025-08-14T21:21:50.1610841Z * [new branch] gh/drisspg/183/head -> origin/gh/drisspg/183/head 2025-08-14T21:21:50.1612985Z * [new branch] gh/drisspg/184/base -> origin/gh/drisspg/184/base 2025-08-14T21:21:50.1613275Z * [new branch] gh/drisspg/184/head -> origin/gh/drisspg/184/head 2025-08-14T21:21:50.1613576Z * [new branch] gh/drisspg/185/base -> origin/gh/drisspg/185/base 2025-08-14T21:21:50.1613767Z * [new branch] gh/drisspg/185/head -> origin/gh/drisspg/185/head 2025-08-14T21:21:50.1615947Z * [new branch] gh/dsjohns2/1/base -> origin/gh/dsjohns2/1/base 2025-08-14T21:21:50.1616272Z * [new branch] gh/dsjohns2/1/head -> origin/gh/dsjohns2/1/head 2025-08-14T21:21:50.1618190Z * [new branch] gh/eellison/784/base -> origin/gh/eellison/784/base 2025-08-14T21:21:50.1618913Z * [new branch] gh/eellison/784/head -> origin/gh/eellison/784/head 2025-08-14T21:21:50.1619086Z * [new branch] gh/eellison/784/orig -> origin/gh/eellison/784/orig 2025-08-14T21:21:50.1622604Z * [new branch] gh/eellison/785/base -> origin/gh/eellison/785/base 2025-08-14T21:21:50.1627480Z * [new branch] gh/eellison/785/head -> origin/gh/eellison/785/head 2025-08-14T21:21:50.1632334Z * [new branch] gh/eellison/785/orig -> origin/gh/eellison/785/orig 2025-08-14T21:21:50.1632802Z * [new branch] gh/eellison/789/base -> origin/gh/eellison/789/base 2025-08-14T21:21:50.1632965Z * [new branch] gh/eellison/789/head -> origin/gh/eellison/789/head 2025-08-14T21:21:50.1633107Z * [new branch] gh/eellison/789/orig -> origin/gh/eellison/789/orig 2025-08-14T21:21:50.1633251Z * [new branch] gh/eellison/800/base -> origin/gh/eellison/800/base 2025-08-14T21:21:50.1633389Z * [new branch] gh/eellison/800/head -> origin/gh/eellison/800/head 2025-08-14T21:21:50.1633526Z * [new branch] gh/eellison/800/orig -> origin/gh/eellison/800/orig 2025-08-14T21:21:50.1633670Z * [new branch] gh/eellison/801/base -> origin/gh/eellison/801/base 2025-08-14T21:21:50.1633807Z * [new branch] gh/eellison/801/head -> origin/gh/eellison/801/head 2025-08-14T21:21:50.1633944Z * [new branch] gh/eellison/801/orig -> origin/gh/eellison/801/orig 2025-08-14T21:21:50.1634093Z * [new branch] gh/eellison/802/base -> origin/gh/eellison/802/base 2025-08-14T21:21:50.1634238Z * [new branch] gh/eellison/802/head -> origin/gh/eellison/802/head 2025-08-14T21:21:50.1634387Z * [new branch] gh/eellison/802/orig -> origin/gh/eellison/802/orig 
2025-08-14T21:21:50.1634527Z * [new branch] gh/eellison/805/base -> origin/gh/eellison/805/base 2025-08-14T21:21:50.1634666Z * [new branch] gh/eellison/805/head -> origin/gh/eellison/805/head 2025-08-14T21:21:50.1634854Z * [new branch] gh/eellison/805/orig -> origin/gh/eellison/805/orig 2025-08-14T21:21:50.1635106Z * [new branch] gh/eellison/808/base -> origin/gh/eellison/808/base 2025-08-14T21:21:50.1635770Z * [new branch] gh/eellison/808/head -> origin/gh/eellison/808/head 2025-08-14T21:21:50.1635960Z * [new branch] gh/eellison/808/orig -> origin/gh/eellison/808/orig 2025-08-14T21:21:50.1636134Z * [new branch] gh/eellison/809/base -> origin/gh/eellison/809/base 2025-08-14T21:21:50.1636277Z * [new branch] gh/eellison/809/head -> origin/gh/eellison/809/head 2025-08-14T21:21:50.1636430Z * [new branch] gh/eellison/809/orig -> origin/gh/eellison/809/orig 2025-08-14T21:21:50.1639761Z * [new branch] gh/eellison/810/base -> origin/gh/eellison/810/base 2025-08-14T21:21:50.1639955Z * [new branch] gh/eellison/810/head -> origin/gh/eellison/810/head 2025-08-14T21:21:50.1640115Z * [new branch] gh/eellison/810/orig -> origin/gh/eellison/810/orig 2025-08-14T21:21:50.1640266Z * [new branch] gh/eellison/811/base -> origin/gh/eellison/811/base 2025-08-14T21:21:50.1640427Z * [new branch] gh/eellison/811/head -> origin/gh/eellison/811/head 2025-08-14T21:21:50.1645508Z * [new branch] gh/eellison/811/orig -> origin/gh/eellison/811/orig 2025-08-14T21:21:50.1645835Z * [new branch] gh/eellison/812/base -> origin/gh/eellison/812/base 2025-08-14T21:21:50.1646003Z * [new branch] gh/eellison/812/head -> origin/gh/eellison/812/head 2025-08-14T21:21:50.1646162Z * [new branch] gh/eellison/812/orig -> origin/gh/eellison/812/orig 2025-08-14T21:21:50.1646313Z * [new branch] gh/eellison/813/base -> origin/gh/eellison/813/base 2025-08-14T21:21:50.1646463Z * [new branch] gh/eellison/813/head -> origin/gh/eellison/813/head 2025-08-14T21:21:50.1646618Z * [new branch] gh/eellison/813/orig -> origin/gh/eellison/813/orig 2025-08-14T21:21:50.1646765Z * [new branch] gh/etaf/132/base -> origin/gh/etaf/132/base 2025-08-14T21:21:50.1646937Z * [new branch] gh/etaf/132/head -> origin/gh/etaf/132/head 2025-08-14T21:21:50.1648256Z * [new branch] gh/etaf/132/orig -> origin/gh/etaf/132/orig 2025-08-14T21:21:50.1649122Z * [new branch] gh/etaf/138/base -> origin/gh/etaf/138/base 2025-08-14T21:21:50.1654228Z * [new branch] gh/etaf/138/head -> origin/gh/etaf/138/head 2025-08-14T21:21:50.1654734Z * [new branch] gh/etaf/138/orig -> origin/gh/etaf/138/orig 2025-08-14T21:21:50.1654905Z * [new branch] gh/etaf/140/base -> origin/gh/etaf/140/base 2025-08-14T21:21:50.1655049Z * [new branch] gh/etaf/140/head -> origin/gh/etaf/140/head 2025-08-14T21:21:50.1655180Z * [new branch] gh/etaf/140/orig -> origin/gh/etaf/140/orig 2025-08-14T21:21:50.1655320Z * [new branch] gh/etaf/143/base -> origin/gh/etaf/143/base 2025-08-14T21:21:50.1655447Z * [new branch] gh/etaf/143/head -> origin/gh/etaf/143/head 2025-08-14T21:21:50.1655573Z * [new branch] gh/etaf/143/orig -> origin/gh/etaf/143/orig 2025-08-14T21:21:50.1659093Z * [new branch] gh/etaf/147/base -> origin/gh/etaf/147/base 2025-08-14T21:21:50.1659370Z * [new branch] gh/etaf/147/head -> origin/gh/etaf/147/head 2025-08-14T21:21:50.1659524Z * [new branch] gh/etaf/148/base -> origin/gh/etaf/148/base 2025-08-14T21:21:50.1659666Z * [new branch] gh/etaf/148/head -> origin/gh/etaf/148/head 2025-08-14T21:21:50.1659802Z * [new branch] gh/etaf/148/orig -> origin/gh/etaf/148/orig 2025-08-14T21:21:50.1659959Z * [new 
branch] gh/etaf/149/base -> origin/gh/etaf/149/base 2025-08-14T21:21:50.1660097Z * [new branch] gh/etaf/149/head -> origin/gh/etaf/149/head 2025-08-14T21:21:50.1664200Z * [new branch] gh/etaf/149/orig -> origin/gh/etaf/149/orig 2025-08-14T21:21:50.1664339Z * [new branch] gh/etaf/150/base -> origin/gh/etaf/150/base 2025-08-14T21:21:50.1664626Z * [new branch] gh/etaf/150/head -> origin/gh/etaf/150/head 2025-08-14T21:21:50.1664767Z * [new branch] gh/etaf/150/orig -> origin/gh/etaf/150/orig 2025-08-14T21:21:50.1664900Z * [new branch] gh/etaf/151/base -> origin/gh/etaf/151/base 2025-08-14T21:21:50.1665032Z * [new branch] gh/etaf/151/head -> origin/gh/etaf/151/head 2025-08-14T21:21:50.1665161Z * [new branch] gh/etaf/151/orig -> origin/gh/etaf/151/orig 2025-08-14T21:21:50.1671527Z * [new branch] gh/etaf/152/base -> origin/gh/etaf/152/base 2025-08-14T21:21:50.1674567Z * [new branch] gh/etaf/152/head -> origin/gh/etaf/152/head 2025-08-14T21:21:50.1674752Z * [new branch] gh/etaf/152/orig -> origin/gh/etaf/152/orig 2025-08-14T21:21:50.1674890Z * [new branch] gh/etaf/153/base -> origin/gh/etaf/153/base 2025-08-14T21:21:50.1675221Z * [new branch] gh/etaf/153/head -> origin/gh/etaf/153/head 2025-08-14T21:21:50.1675366Z * [new branch] gh/etaf/153/orig -> origin/gh/etaf/153/orig 2025-08-14T21:21:50.1675500Z * [new branch] gh/etaf/154/base -> origin/gh/etaf/154/base 2025-08-14T21:21:50.1675641Z * [new branch] gh/etaf/154/head -> origin/gh/etaf/154/head 2025-08-14T21:21:50.1675775Z * [new branch] gh/etaf/154/orig -> origin/gh/etaf/154/orig 2025-08-14T21:21:50.1675908Z * [new branch] gh/etaf/155/base -> origin/gh/etaf/155/base 2025-08-14T21:21:50.1676052Z * [new branch] gh/etaf/155/head -> origin/gh/etaf/155/head 2025-08-14T21:21:50.1676186Z * [new branch] gh/etaf/155/orig -> origin/gh/etaf/155/orig 2025-08-14T21:21:50.1676349Z * [new branch] gh/ezyang/2374/base -> origin/gh/ezyang/2374/base 2025-08-14T21:21:50.1680988Z * [new branch] gh/ezyang/2374/head -> origin/gh/ezyang/2374/head 2025-08-14T21:21:50.1683447Z * [new branch] gh/ezyang/2374/orig -> origin/gh/ezyang/2374/orig 2025-08-14T21:21:50.1683629Z * [new branch] gh/ezyang/2973/base -> origin/gh/ezyang/2973/base 2025-08-14T21:21:50.1683784Z * [new branch] gh/ezyang/2973/head -> origin/gh/ezyang/2973/head 2025-08-14T21:21:50.1683929Z * [new branch] gh/ezyang/2973/orig -> origin/gh/ezyang/2973/orig 2025-08-14T21:21:50.1684095Z * [new branch] gh/ezyang/2974/base -> origin/gh/ezyang/2974/base 2025-08-14T21:21:50.1684236Z * [new branch] gh/ezyang/2974/head -> origin/gh/ezyang/2974/head 2025-08-14T21:21:50.1684391Z * [new branch] gh/ezyang/2974/orig -> origin/gh/ezyang/2974/orig 2025-08-14T21:21:50.1684562Z * [new branch] gh/ezyang/3068/base -> origin/gh/ezyang/3068/base 2025-08-14T21:21:50.1684765Z * [new branch] gh/ezyang/3068/head -> origin/gh/ezyang/3068/head 2025-08-14T21:21:50.1684920Z * [new branch] gh/ezyang/3068/orig -> origin/gh/ezyang/3068/orig 2025-08-14T21:21:50.1685071Z * [new branch] gh/ezyang/3071/base -> origin/gh/ezyang/3071/base 2025-08-14T21:21:50.1685216Z * [new branch] gh/ezyang/3071/head -> origin/gh/ezyang/3071/head 2025-08-14T21:21:50.1685396Z * [new branch] gh/ezyang/3071/orig -> origin/gh/ezyang/3071/orig 2025-08-14T21:21:50.1685566Z * [new branch] gh/ezyang/3074/base -> origin/gh/ezyang/3074/base 2025-08-14T21:21:50.1686538Z * [new branch] gh/ezyang/3074/head -> origin/gh/ezyang/3074/head 2025-08-14T21:21:50.1687003Z * [new branch] gh/ezyang/3074/orig -> origin/gh/ezyang/3074/orig 2025-08-14T21:21:50.1687219Z * [new branch] 
gh/ezyang/3088/base -> origin/gh/ezyang/3088/base 2025-08-14T21:21:50.1688174Z * [new branch] gh/ezyang/3088/head -> origin/gh/ezyang/3088/head 2025-08-14T21:21:50.1689007Z * [new branch] gh/ezyang/3088/orig -> origin/gh/ezyang/3088/orig 2025-08-14T21:21:50.1690745Z * [new branch] gh/ezyang/3092/base -> origin/gh/ezyang/3092/base 2025-08-14T21:21:50.1691309Z * [new branch] gh/ezyang/3092/head -> origin/gh/ezyang/3092/head 2025-08-14T21:21:50.1691468Z * [new branch] gh/ezyang/3092/orig -> origin/gh/ezyang/3092/orig 2025-08-14T21:21:50.1692022Z * [new branch] gh/ezyang/3097/base -> origin/gh/ezyang/3097/base 2025-08-14T21:21:50.1693447Z * [new branch] gh/ezyang/3097/head -> origin/gh/ezyang/3097/head 2025-08-14T21:21:50.1693747Z * [new branch] gh/ezyang/3097/orig -> origin/gh/ezyang/3097/orig 2025-08-14T21:21:50.1695362Z * [new branch] gh/ezyang/3098/base -> origin/gh/ezyang/3098/base 2025-08-14T21:21:50.1695854Z * [new branch] gh/ezyang/3098/head -> origin/gh/ezyang/3098/head 2025-08-14T21:21:50.1696022Z * [new branch] gh/ezyang/3098/orig -> origin/gh/ezyang/3098/orig 2025-08-14T21:21:50.1696677Z * [new branch] gh/ezyang/3099/base -> origin/gh/ezyang/3099/base 2025-08-14T21:21:50.1697545Z * [new branch] gh/ezyang/3099/head -> origin/gh/ezyang/3099/head 2025-08-14T21:21:50.1698362Z * [new branch] gh/ezyang/3099/orig -> origin/gh/ezyang/3099/orig 2025-08-14T21:21:50.1699921Z * [new branch] gh/ezyang/3100/base -> origin/gh/ezyang/3100/base 2025-08-14T21:21:50.1700109Z * [new branch] gh/ezyang/3100/head -> origin/gh/ezyang/3100/head 2025-08-14T21:21:50.1700743Z * [new branch] gh/ezyang/3100/orig -> origin/gh/ezyang/3100/orig 2025-08-14T21:21:50.1702002Z * [new branch] gh/ezyang/3101/base -> origin/gh/ezyang/3101/base 2025-08-14T21:21:50.1702181Z * [new branch] gh/ezyang/3101/head -> origin/gh/ezyang/3101/head 2025-08-14T21:21:50.1706801Z * [new branch] gh/ezyang/3101/orig -> origin/gh/ezyang/3101/orig 2025-08-14T21:21:50.1706997Z * [new branch] gh/ezyang/3102/base -> origin/gh/ezyang/3102/base 2025-08-14T21:21:50.1712122Z * [new branch] gh/ezyang/3102/head -> origin/gh/ezyang/3102/head 2025-08-14T21:21:50.1714345Z * [new branch] gh/ezyang/3102/orig -> origin/gh/ezyang/3102/orig 2025-08-14T21:21:50.1714614Z * [new branch] gh/ezyang/3103/base -> origin/gh/ezyang/3103/base 2025-08-14T21:21:50.1720853Z * [new branch] gh/ezyang/3103/head -> origin/gh/ezyang/3103/head 2025-08-14T21:21:50.1722712Z * [new branch] gh/ezyang/3103/orig -> origin/gh/ezyang/3103/orig 2025-08-14T21:21:50.1722908Z * [new branch] gh/ezyang/3104/base -> origin/gh/ezyang/3104/base 2025-08-14T21:21:50.1723061Z * [new branch] gh/ezyang/3104/head -> origin/gh/ezyang/3104/head 2025-08-14T21:21:50.1723206Z * [new branch] gh/ezyang/3104/orig -> origin/gh/ezyang/3104/orig 2025-08-14T21:21:50.1723359Z * [new branch] gh/ezyang/3105/base -> origin/gh/ezyang/3105/base 2025-08-14T21:21:50.1723522Z * [new branch] gh/ezyang/3105/head -> origin/gh/ezyang/3105/head 2025-08-14T21:21:50.1723669Z * [new branch] gh/ezyang/3105/orig -> origin/gh/ezyang/3105/orig 2025-08-14T21:21:50.1723821Z * [new branch] gh/ezyang/3106/base -> origin/gh/ezyang/3106/base 2025-08-14T21:21:50.1723968Z * [new branch] gh/ezyang/3106/head -> origin/gh/ezyang/3106/head 2025-08-14T21:21:50.1724120Z * [new branch] gh/ezyang/3106/orig -> origin/gh/ezyang/3106/orig 2025-08-14T21:21:50.1724269Z * [new branch] gh/ezyang/3107/base -> origin/gh/ezyang/3107/base 2025-08-14T21:21:50.1724413Z * [new branch] gh/ezyang/3107/head -> origin/gh/ezyang/3107/head 
2025-08-14T21:21:50.1724570Z * [new branch] gh/ezyang/3107/orig -> origin/gh/ezyang/3107/orig 2025-08-14T21:21:50.1724716Z * [new branch] gh/ezyang/3108/base -> origin/gh/ezyang/3108/base 2025-08-14T21:21:50.1724868Z * [new branch] gh/ezyang/3108/head -> origin/gh/ezyang/3108/head 2025-08-14T21:21:50.1725011Z * [new branch] gh/ezyang/3108/orig -> origin/gh/ezyang/3108/orig 2025-08-14T21:21:50.1725158Z * [new branch] gh/ezyang/3109/base -> origin/gh/ezyang/3109/base 2025-08-14T21:21:50.1725306Z * [new branch] gh/ezyang/3109/head -> origin/gh/ezyang/3109/head 2025-08-14T21:21:50.1725458Z * [new branch] gh/ezyang/3109/orig -> origin/gh/ezyang/3109/orig 2025-08-14T21:21:50.1725813Z * [new branch] gh/ezyang/3110/base -> origin/gh/ezyang/3110/base 2025-08-14T21:21:50.1725963Z * [new branch] gh/ezyang/3110/head -> origin/gh/ezyang/3110/head 2025-08-14T21:21:50.1726148Z * [new branch] gh/ezyang/3110/orig -> origin/gh/ezyang/3110/orig 2025-08-14T21:21:50.1727558Z * [new branch] gh/ezyang/3111/base -> origin/gh/ezyang/3111/base 2025-08-14T21:21:50.1727804Z * [new branch] gh/ezyang/3111/head -> origin/gh/ezyang/3111/head 2025-08-14T21:21:50.1729195Z * [new branch] gh/ezyang/3111/orig -> origin/gh/ezyang/3111/orig 2025-08-14T21:21:50.1733756Z * [new branch] gh/ezyang/3112/base -> origin/gh/ezyang/3112/base 2025-08-14T21:21:50.1738694Z * [new branch] gh/ezyang/3112/head -> origin/gh/ezyang/3112/head 2025-08-14T21:21:50.1741744Z * [new branch] gh/ezyang/3112/orig -> origin/gh/ezyang/3112/orig 2025-08-14T21:21:50.1742293Z * [new branch] gh/ezyang/3113/base -> origin/gh/ezyang/3113/base 2025-08-14T21:21:50.1742475Z * [new branch] gh/ezyang/3113/head -> origin/gh/ezyang/3113/head 2025-08-14T21:21:50.1742627Z * [new branch] gh/ezyang/3113/orig -> origin/gh/ezyang/3113/orig 2025-08-14T21:21:50.1742770Z * [new branch] gh/ezyang/3114/base -> origin/gh/ezyang/3114/base 2025-08-14T21:21:50.1742913Z * [new branch] gh/ezyang/3114/head -> origin/gh/ezyang/3114/head 2025-08-14T21:21:50.1743064Z * [new branch] gh/ezyang/3114/orig -> origin/gh/ezyang/3114/orig 2025-08-14T21:21:50.1743207Z * [new branch] gh/ezyang/3115/base -> origin/gh/ezyang/3115/base 2025-08-14T21:21:50.1743356Z * [new branch] gh/ezyang/3115/head -> origin/gh/ezyang/3115/head 2025-08-14T21:21:50.1743521Z * [new branch] gh/ezyang/3115/orig -> origin/gh/ezyang/3115/orig 2025-08-14T21:21:50.1743665Z * [new branch] gh/ezyang/3116/base -> origin/gh/ezyang/3116/base 2025-08-14T21:21:50.1743808Z * [new branch] gh/ezyang/3116/head -> origin/gh/ezyang/3116/head 2025-08-14T21:21:50.1747418Z * [new branch] gh/ezyang/3116/orig -> origin/gh/ezyang/3116/orig 2025-08-14T21:21:50.1747610Z * [new branch] gh/ezyang/3117/base -> origin/gh/ezyang/3117/base 2025-08-14T21:21:50.1747784Z * [new branch] gh/ezyang/3117/head -> origin/gh/ezyang/3117/head 2025-08-14T21:21:50.1747934Z * [new branch] gh/ezyang/3117/orig -> origin/gh/ezyang/3117/orig 2025-08-14T21:21:50.1748078Z * [new branch] gh/ezyang/3118/base -> origin/gh/ezyang/3118/base 2025-08-14T21:21:50.1748226Z * [new branch] gh/ezyang/3118/head -> origin/gh/ezyang/3118/head 2025-08-14T21:21:50.1748420Z * [new branch] gh/ezyang/3118/orig -> origin/gh/ezyang/3118/orig 2025-08-14T21:21:50.1748561Z * [new branch] gh/ezyang/3119/base -> origin/gh/ezyang/3119/base 2025-08-14T21:21:50.1749065Z * [new branch] gh/ezyang/3119/head -> origin/gh/ezyang/3119/head 2025-08-14T21:21:50.1749308Z * [new branch] gh/ezyang/3119/orig -> origin/gh/ezyang/3119/orig 2025-08-14T21:21:50.1749475Z * [new branch] gh/ezyang/3120/base -> 
origin/gh/ezyang/3120/base 2025-08-14T21:21:50.1749614Z * [new branch] gh/ezyang/3120/head -> origin/gh/ezyang/3120/head 2025-08-14T21:21:50.1749888Z * [new branch] gh/ezyang/3120/orig -> origin/gh/ezyang/3120/orig 2025-08-14T21:21:50.1750056Z * [new branch] gh/ezyang/3121/base -> origin/gh/ezyang/3121/base 2025-08-14T21:21:50.1750342Z * [new branch] gh/ezyang/3121/head -> origin/gh/ezyang/3121/head 2025-08-14T21:21:50.1756803Z * [new branch] gh/ezyang/3121/orig -> origin/gh/ezyang/3121/orig 2025-08-14T21:21:50.1757061Z * [new branch] gh/ezyang/3122/base -> origin/gh/ezyang/3122/base 2025-08-14T21:21:50.1757298Z * [new branch] gh/ezyang/3122/head -> origin/gh/ezyang/3122/head 2025-08-14T21:21:50.1757467Z * [new branch] gh/ezyang/3122/orig -> origin/gh/ezyang/3122/orig 2025-08-14T21:21:50.1757727Z * [new branch] gh/ezyang/3123/base -> origin/gh/ezyang/3123/base 2025-08-14T21:21:50.1757892Z * [new branch] gh/ezyang/3123/head -> origin/gh/ezyang/3123/head 2025-08-14T21:21:50.1758049Z * [new branch] gh/ezyang/3123/orig -> origin/gh/ezyang/3123/orig 2025-08-14T21:21:50.1758785Z * [new branch] gh/ezyang/3124/base -> origin/gh/ezyang/3124/base 2025-08-14T21:21:50.1760308Z * [new branch] gh/ezyang/3124/head -> origin/gh/ezyang/3124/head 2025-08-14T21:21:50.1760649Z * [new branch] gh/ezyang/3124/orig -> origin/gh/ezyang/3124/orig 2025-08-14T21:21:50.1760790Z * [new branch] gh/ezyang/3125/base -> origin/gh/ezyang/3125/base 2025-08-14T21:21:50.1760940Z * [new branch] gh/ezyang/3125/head -> origin/gh/ezyang/3125/head 2025-08-14T21:21:50.1761088Z * [new branch] gh/ezyang/3125/orig -> origin/gh/ezyang/3125/orig 2025-08-14T21:21:50.1761234Z * [new branch] gh/ezyang/3126/base -> origin/gh/ezyang/3126/base 2025-08-14T21:21:50.1761384Z * [new branch] gh/ezyang/3126/head -> origin/gh/ezyang/3126/head 2025-08-14T21:21:50.1765598Z * [new branch] gh/ezyang/3126/orig -> origin/gh/ezyang/3126/orig 2025-08-14T21:21:50.1765781Z * [new branch] gh/ezyang/3127/base -> origin/gh/ezyang/3127/base 2025-08-14T21:21:50.1765921Z * [new branch] gh/ezyang/3127/head -> origin/gh/ezyang/3127/head 2025-08-14T21:21:50.1766072Z * [new branch] gh/ezyang/3127/orig -> origin/gh/ezyang/3127/orig 2025-08-14T21:21:50.1766217Z * [new branch] gh/ezyang/3128/base -> origin/gh/ezyang/3128/base 2025-08-14T21:21:50.1766354Z * [new branch] gh/ezyang/3128/head -> origin/gh/ezyang/3128/head 2025-08-14T21:21:50.1766507Z * [new branch] gh/ezyang/3128/orig -> origin/gh/ezyang/3128/orig 2025-08-14T21:21:50.1766647Z * [new branch] gh/ezyang/3129/base -> origin/gh/ezyang/3129/base 2025-08-14T21:21:50.1766788Z * [new branch] gh/ezyang/3129/head -> origin/gh/ezyang/3129/head 2025-08-14T21:21:50.1767780Z * [new branch] gh/ezyang/3129/orig -> origin/gh/ezyang/3129/orig 2025-08-14T21:21:50.1768407Z * [new branch] gh/ezyang/3130/base -> origin/gh/ezyang/3130/base 2025-08-14T21:21:50.1769076Z * [new branch] gh/ezyang/3130/head -> origin/gh/ezyang/3130/head 2025-08-14T21:21:50.1774658Z * [new branch] gh/ezyang/3130/orig -> origin/gh/ezyang/3130/orig 2025-08-14T21:21:50.1774857Z * [new branch] gh/ezyang/3131/base -> origin/gh/ezyang/3131/base 2025-08-14T21:21:50.1775041Z * [new branch] gh/ezyang/3131/head -> origin/gh/ezyang/3131/head 2025-08-14T21:21:50.1775200Z * [new branch] gh/ezyang/3131/orig -> origin/gh/ezyang/3131/orig 2025-08-14T21:21:50.1775423Z * [new branch] gh/ezyang/3132/base -> origin/gh/ezyang/3132/base 2025-08-14T21:21:50.1775561Z * [new branch] gh/ezyang/3132/head -> origin/gh/ezyang/3132/head 2025-08-14T21:21:50.1775830Z * [new branch] 
gh/ezyang/3132/orig -> origin/gh/ezyang/3132/orig 2025-08-14T21:21:50.1775979Z * [new branch] gh/ezyang/3133/base -> origin/gh/ezyang/3133/base 2025-08-14T21:21:50.1776191Z * [new branch] gh/ezyang/3133/head -> origin/gh/ezyang/3133/head 2025-08-14T21:21:50.1781854Z * [new branch] gh/ezyang/3133/orig -> origin/gh/ezyang/3133/orig 2025-08-14T21:21:50.1782216Z * [new branch] gh/ezyang/3134/base -> origin/gh/ezyang/3134/base 2025-08-14T21:21:50.1782372Z * [new branch] gh/ezyang/3134/head -> origin/gh/ezyang/3134/head 2025-08-14T21:21:50.1782518Z * [new branch] gh/ezyang/3134/orig -> origin/gh/ezyang/3134/orig 2025-08-14T21:21:50.1782666Z * [new branch] gh/ezyang/3135/base -> origin/gh/ezyang/3135/base 2025-08-14T21:21:50.1782809Z * [new branch] gh/ezyang/3135/head -> origin/gh/ezyang/3135/head 2025-08-14T21:21:50.1782954Z * [new branch] gh/ezyang/3135/orig -> origin/gh/ezyang/3135/orig 2025-08-14T21:21:50.1783090Z * [new branch] gh/ezyang/3136/base -> origin/gh/ezyang/3136/base 2025-08-14T21:21:50.1788248Z * [new branch] gh/ezyang/3136/head -> origin/gh/ezyang/3136/head 2025-08-14T21:21:50.1788468Z * [new branch] gh/ezyang/3136/orig -> origin/gh/ezyang/3136/orig 2025-08-14T21:21:50.1788651Z * [new branch] gh/fadara01/1/base -> origin/gh/fadara01/1/base 2025-08-14T21:21:50.1788947Z * [new branch] gh/fadara01/1/head -> origin/gh/fadara01/1/head 2025-08-14T21:21:50.1789123Z * [new branch] gh/fadara01/1/orig -> origin/gh/fadara01/1/orig 2025-08-14T21:21:50.1789291Z * [new branch] gh/fduwjj/168/base -> origin/gh/fduwjj/168/base 2025-08-14T21:21:50.1789566Z * [new branch] gh/fduwjj/168/head -> origin/gh/fduwjj/168/head 2025-08-14T21:21:50.1798031Z * [new branch] gh/fduwjj/168/orig -> origin/gh/fduwjj/168/orig 2025-08-14T21:21:50.1798564Z * [new branch] gh/fduwjj/169/base -> origin/gh/fduwjj/169/base 2025-08-14T21:21:50.1798769Z * [new branch] gh/fduwjj/169/head -> origin/gh/fduwjj/169/head 2025-08-14T21:21:50.1798925Z * [new branch] gh/fduwjj/169/orig -> origin/gh/fduwjj/169/orig 2025-08-14T21:21:50.1799082Z * [new branch] gh/fduwjj/170/base -> origin/gh/fduwjj/170/base 2025-08-14T21:21:50.1799243Z * [new branch] gh/fduwjj/170/head -> origin/gh/fduwjj/170/head 2025-08-14T21:21:50.1799407Z * [new branch] gh/fduwjj/170/orig -> origin/gh/fduwjj/170/orig 2025-08-14T21:21:50.1799567Z * [new branch] gh/fduwjj/171/base -> origin/gh/fduwjj/171/base 2025-08-14T21:21:50.1799720Z * [new branch] gh/fduwjj/171/head -> origin/gh/fduwjj/171/head 2025-08-14T21:21:50.1799876Z * [new branch] gh/fduwjj/171/orig -> origin/gh/fduwjj/171/orig 2025-08-14T21:21:50.1800033Z * [new branch] gh/fduwjj/172/base -> origin/gh/fduwjj/172/base 2025-08-14T21:21:50.1800191Z * [new branch] gh/fduwjj/172/head -> origin/gh/fduwjj/172/head 2025-08-14T21:21:50.1800344Z * [new branch] gh/fduwjj/172/orig -> origin/gh/fduwjj/172/orig 2025-08-14T21:21:50.1800486Z * [new branch] gh/fduwjj/173/base -> origin/gh/fduwjj/173/base 2025-08-14T21:21:50.1800636Z * [new branch] gh/fduwjj/173/head -> origin/gh/fduwjj/173/head 2025-08-14T21:21:50.1800786Z * [new branch] gh/fduwjj/173/orig -> origin/gh/fduwjj/173/orig 2025-08-14T21:21:50.1800932Z * [new branch] gh/fduwjj/174/base -> origin/gh/fduwjj/174/base 2025-08-14T21:21:50.1801517Z * [new branch] gh/fduwjj/174/head -> origin/gh/fduwjj/174/head 2025-08-14T21:21:50.1801697Z * [new branch] gh/fduwjj/174/orig -> origin/gh/fduwjj/174/orig 2025-08-14T21:21:50.1804196Z * [new branch] gh/fduwjj/175/base -> origin/gh/fduwjj/175/base 2025-08-14T21:21:50.1806693Z * [new branch] gh/fduwjj/175/head -> 
origin/gh/fduwjj/175/head 2025-08-14T21:21:50.1806856Z * [new branch] gh/fduwjj/175/orig -> origin/gh/fduwjj/175/orig 2025-08-14T21:21:50.1808157Z * [new branch] gh/fduwjj/176/base -> origin/gh/fduwjj/176/base 2025-08-14T21:21:50.1808438Z * [new branch] gh/fduwjj/176/head -> origin/gh/fduwjj/176/head 2025-08-14T21:21:50.1809802Z * [new branch] gh/fduwjj/176/orig -> origin/gh/fduwjj/176/orig 2025-08-14T21:21:50.1813626Z * [new branch] gh/fduwjj/177/base -> origin/gh/fduwjj/177/base 2025-08-14T21:21:50.1813900Z * [new branch] gh/fduwjj/177/head -> origin/gh/fduwjj/177/head 2025-08-14T21:21:50.1814059Z * [new branch] gh/fduwjj/177/orig -> origin/gh/fduwjj/177/orig 2025-08-14T21:21:50.1814198Z * [new branch] gh/fduwjj/178/base -> origin/gh/fduwjj/178/base 2025-08-14T21:21:50.1814561Z * [new branch] gh/fduwjj/178/head -> origin/gh/fduwjj/178/head 2025-08-14T21:21:50.1814696Z * [new branch] gh/fduwjj/178/orig -> origin/gh/fduwjj/178/orig 2025-08-14T21:21:50.1820677Z * [new branch] gh/fduwjj/179/base -> origin/gh/fduwjj/179/base 2025-08-14T21:21:50.1820870Z * [new branch] gh/fduwjj/179/head -> origin/gh/fduwjj/179/head 2025-08-14T21:21:50.1821017Z * [new branch] gh/fduwjj/179/orig -> origin/gh/fduwjj/179/orig 2025-08-14T21:21:50.1821156Z * [new branch] gh/fduwjj/180/base -> origin/gh/fduwjj/180/base 2025-08-14T21:21:50.1821303Z * [new branch] gh/fduwjj/180/head -> origin/gh/fduwjj/180/head 2025-08-14T21:21:50.1821445Z * [new branch] gh/fduwjj/180/orig -> origin/gh/fduwjj/180/orig 2025-08-14T21:21:50.1826375Z * [new branch] gh/fduwjj/181/base -> origin/gh/fduwjj/181/base 2025-08-14T21:21:50.1826752Z * [new branch] gh/fduwjj/181/head -> origin/gh/fduwjj/181/head 2025-08-14T21:21:50.1826985Z * [new branch] gh/fduwjj/181/orig -> origin/gh/fduwjj/181/orig 2025-08-14T21:21:50.1827181Z * [new branch] gh/fegin/306/base -> origin/gh/fegin/306/base 2025-08-14T21:21:50.1827349Z * [new branch] gh/fegin/306/head -> origin/gh/fegin/306/head 2025-08-14T21:21:50.1827513Z * [new branch] gh/fegin/306/orig -> origin/gh/fegin/306/orig 2025-08-14T21:21:50.1828229Z * [new branch] gh/fegin/307/base -> origin/gh/fegin/307/base 2025-08-14T21:21:50.1828399Z * [new branch] gh/fegin/307/head -> origin/gh/fegin/307/head 2025-08-14T21:21:50.1828544Z * [new branch] gh/fegin/307/orig -> origin/gh/fegin/307/orig 2025-08-14T21:21:50.1830837Z * [new branch] gh/fffrog/114/base -> origin/gh/fffrog/114/base 2025-08-14T21:21:50.1831128Z * [new branch] gh/fffrog/114/head -> origin/gh/fffrog/114/head 2025-08-14T21:21:50.1831269Z * [new branch] gh/fffrog/114/orig -> origin/gh/fffrog/114/orig 2025-08-14T21:21:50.1831410Z * [new branch] gh/fffrog/117/base -> origin/gh/fffrog/117/base 2025-08-14T21:21:50.1831567Z * [new branch] gh/fffrog/117/head -> origin/gh/fffrog/117/head 2025-08-14T21:21:50.1832052Z * [new branch] gh/fffrog/117/orig -> origin/gh/fffrog/117/orig 2025-08-14T21:21:50.1832247Z * [new branch] gh/fffrog/119/base -> origin/gh/fffrog/119/base 2025-08-14T21:21:50.1832391Z * [new branch] gh/fffrog/119/head -> origin/gh/fffrog/119/head 2025-08-14T21:21:50.1837867Z * [new branch] gh/fffrog/119/orig -> origin/gh/fffrog/119/orig 2025-08-14T21:21:50.1838232Z * [new branch] gh/fffrog/120/base -> origin/gh/fffrog/120/base 2025-08-14T21:21:50.1838613Z * [new branch] gh/fffrog/120/head -> origin/gh/fffrog/120/head 2025-08-14T21:21:50.1838897Z * [new branch] gh/fffrog/120/orig -> origin/gh/fffrog/120/orig 2025-08-14T21:21:50.1839064Z * [new branch] gh/fffrog/121/base -> origin/gh/fffrog/121/base 2025-08-14T21:21:50.1839829Z * [new branch] 
gh/fffrog/121/head -> origin/gh/fffrog/121/head 2025-08-14T21:21:50.1840044Z * [new branch] gh/fffrog/121/orig -> origin/gh/fffrog/121/orig 2025-08-14T21:21:50.1840210Z * [new branch] gh/fffrog/122/base -> origin/gh/fffrog/122/base 2025-08-14T21:21:50.1840455Z * [new branch] gh/fffrog/122/head -> origin/gh/fffrog/122/head 2025-08-14T21:21:50.1840630Z * [new branch] gh/fffrog/122/orig -> origin/gh/fffrog/122/orig 2025-08-14T21:21:50.1841091Z * [new branch] gh/fffrog/123/base -> origin/gh/fffrog/123/base 2025-08-14T21:21:50.1841267Z * [new branch] gh/fffrog/123/head -> origin/gh/fffrog/123/head 2025-08-14T21:21:50.1841512Z * [new branch] gh/fffrog/123/orig -> origin/gh/fffrog/123/orig 2025-08-14T21:21:50.1841680Z * [new branch] gh/fffrog/124/base -> origin/gh/fffrog/124/base 2025-08-14T21:21:50.1841937Z * [new branch] gh/fffrog/124/head -> origin/gh/fffrog/124/head 2025-08-14T21:21:50.1842421Z * [new branch] gh/fffrog/124/orig -> origin/gh/fffrog/124/orig 2025-08-14T21:21:50.1844114Z * [new branch] gh/fffrog/125/base -> origin/gh/fffrog/125/base 2025-08-14T21:21:50.1844304Z * [new branch] gh/fffrog/125/head -> origin/gh/fffrog/125/head 2025-08-14T21:21:50.1844881Z * [new branch] gh/fffrog/125/orig -> origin/gh/fffrog/125/orig 2025-08-14T21:21:50.1848560Z * [new branch] gh/fffrog/126/base -> origin/gh/fffrog/126/base 2025-08-14T21:21:50.1848883Z * [new branch] gh/fffrog/126/head -> origin/gh/fffrog/126/head 2025-08-14T21:21:50.1849036Z * [new branch] gh/fffrog/126/orig -> origin/gh/fffrog/126/orig 2025-08-14T21:21:50.1849184Z * [new branch] gh/fffrog/127/base -> origin/gh/fffrog/127/base 2025-08-14T21:21:50.1849329Z * [new branch] gh/fffrog/127/head -> origin/gh/fffrog/127/head 2025-08-14T21:21:50.1849467Z * [new branch] gh/fffrog/127/orig -> origin/gh/fffrog/127/orig 2025-08-14T21:21:50.1854551Z * [new branch] gh/fffrog/128/base -> origin/gh/fffrog/128/base 2025-08-14T21:21:50.1854853Z * [new branch] gh/fffrog/128/head -> origin/gh/fffrog/128/head 2025-08-14T21:21:50.1855028Z * [new branch] gh/fffrog/128/orig -> origin/gh/fffrog/128/orig 2025-08-14T21:21:50.1855308Z * [new branch] gh/fffrog/129/base -> origin/gh/fffrog/129/base 2025-08-14T21:21:50.1855486Z * [new branch] gh/fffrog/129/head -> origin/gh/fffrog/129/head 2025-08-14T21:21:50.1855672Z * [new branch] gh/fffrog/129/orig -> origin/gh/fffrog/129/orig 2025-08-14T21:21:50.1856373Z * [new branch] gh/fffrog/130/base -> origin/gh/fffrog/130/base 2025-08-14T21:21:50.1860849Z * [new branch] gh/fffrog/130/head -> origin/gh/fffrog/130/head 2025-08-14T21:21:50.1861034Z * [new branch] gh/fffrog/130/orig -> origin/gh/fffrog/130/orig 2025-08-14T21:21:50.1861189Z * [new branch] gh/fffrog/131/base -> origin/gh/fffrog/131/base 2025-08-14T21:21:50.1861329Z * [new branch] gh/fffrog/131/head -> origin/gh/fffrog/131/head 2025-08-14T21:21:50.1861484Z * [new branch] gh/fffrog/131/orig -> origin/gh/fffrog/131/orig 2025-08-14T21:21:50.1861907Z * [new branch] gh/fffrog/132/base -> origin/gh/fffrog/132/base 2025-08-14T21:21:50.1867280Z * [new branch] gh/fffrog/132/head -> origin/gh/fffrog/132/head 2025-08-14T21:21:50.1871641Z * [new branch] gh/fffrog/132/orig -> origin/gh/fffrog/132/orig 2025-08-14T21:21:50.1872193Z * [new branch] gh/fffrog/133/base -> origin/gh/fffrog/133/base 2025-08-14T21:21:50.1872382Z * [new branch] gh/fffrog/133/head -> origin/gh/fffrog/133/head 2025-08-14T21:21:50.1872531Z * [new branch] gh/fffrog/133/orig -> origin/gh/fffrog/133/orig 2025-08-14T21:21:50.1872680Z * [new branch] gh/fffrog/134/base -> origin/gh/fffrog/134/base 
2025-08-14T21:21:50.1872826Z * [new branch] gh/fffrog/134/head -> origin/gh/fffrog/134/head 2025-08-14T21:21:50.1872965Z * [new branch] gh/fffrog/134/orig -> origin/gh/fffrog/134/orig 2025-08-14T21:21:50.1873277Z * [new branch] gh/fffrog/135/base -> origin/gh/fffrog/135/base 2025-08-14T21:21:50.1873421Z * [new branch] gh/fffrog/135/head -> origin/gh/fffrog/135/head 2025-08-14T21:21:50.1873572Z * [new branch] gh/fffrog/135/orig -> origin/gh/fffrog/135/orig 2025-08-14T21:21:50.1873712Z * [new branch] gh/fffrog/136/base -> origin/gh/fffrog/136/base 2025-08-14T21:21:50.1873855Z * [new branch] gh/fffrog/136/head -> origin/gh/fffrog/136/head 2025-08-14T21:21:50.1874006Z * [new branch] gh/fffrog/136/orig -> origin/gh/fffrog/136/orig 2025-08-14T21:21:50.1874151Z * [new branch] gh/fffrog/137/base -> origin/gh/fffrog/137/base 2025-08-14T21:21:50.1874302Z * [new branch] gh/fffrog/137/head -> origin/gh/fffrog/137/head 2025-08-14T21:21:50.1874449Z * [new branch] gh/fffrog/137/orig -> origin/gh/fffrog/137/orig 2025-08-14T21:21:50.1874592Z * [new branch] gh/fffrog/138/base -> origin/gh/fffrog/138/base 2025-08-14T21:21:50.1874742Z * [new branch] gh/fffrog/138/head -> origin/gh/fffrog/138/head 2025-08-14T21:21:50.1874879Z * [new branch] gh/fffrog/138/orig -> origin/gh/fffrog/138/orig 2025-08-14T21:21:50.1877401Z * [new branch] gh/gmagogsfm/1/base -> origin/gh/gmagogsfm/1/base 2025-08-14T21:21:50.1878009Z * [new branch] gh/gmagogsfm/1/head -> origin/gh/gmagogsfm/1/head 2025-08-14T21:21:50.1878193Z * [new branch] gh/gmagogsfm/1/orig -> origin/gh/gmagogsfm/1/orig 2025-08-14T21:21:50.1878359Z * [new branch] gh/gmagogsfm/2/base -> origin/gh/gmagogsfm/2/base 2025-08-14T21:21:50.1878515Z * [new branch] gh/gmagogsfm/2/head -> origin/gh/gmagogsfm/2/head 2025-08-14T21:21:50.1878656Z * [new branch] gh/gmagogsfm/2/orig -> origin/gh/gmagogsfm/2/orig 2025-08-14T21:21:50.1883449Z * [new branch] gh/gmagogsfm/3/base -> origin/gh/gmagogsfm/3/base 2025-08-14T21:21:50.1883752Z * [new branch] gh/gmagogsfm/3/head -> origin/gh/gmagogsfm/3/head 2025-08-14T21:21:50.1883932Z * [new branch] gh/gmagogsfm/3/orig -> origin/gh/gmagogsfm/3/orig 2025-08-14T21:21:50.1884158Z * [new branch] gh/gmagogsfm/4/base -> origin/gh/gmagogsfm/4/base 2025-08-14T21:21:50.1884329Z * [new branch] gh/gmagogsfm/4/head -> origin/gh/gmagogsfm/4/head 2025-08-14T21:21:50.1884554Z * [new branch] gh/gmagogsfm/4/orig -> origin/gh/gmagogsfm/4/orig 2025-08-14T21:21:50.1884849Z * [new branch] gh/guangyey/130/base -> origin/gh/guangyey/130/base 2025-08-14T21:21:50.1885041Z * [new branch] gh/guangyey/130/head -> origin/gh/guangyey/130/head 2025-08-14T21:21:50.1885206Z * [new branch] gh/guangyey/130/orig -> origin/gh/guangyey/130/orig 2025-08-14T21:21:50.1885753Z * [new branch] gh/guangyey/133/base -> origin/gh/guangyey/133/base 2025-08-14T21:21:50.1886300Z * [new branch] gh/guangyey/133/head -> origin/gh/guangyey/133/head 2025-08-14T21:21:50.1887270Z * [new branch] gh/guangyey/133/orig -> origin/gh/guangyey/133/orig 2025-08-14T21:21:50.1888329Z * [new branch] gh/guangyey/134/base -> origin/gh/guangyey/134/base 2025-08-14T21:21:50.1888777Z * [new branch] gh/guangyey/134/head -> origin/gh/guangyey/134/head 2025-08-14T21:21:50.1890040Z * [new branch] gh/guangyey/134/orig -> origin/gh/guangyey/134/orig 2025-08-14T21:21:50.1890632Z * [new branch] gh/guangyey/135/base -> origin/gh/guangyey/135/base 2025-08-14T21:21:50.1891259Z * [new branch] gh/guangyey/135/head -> origin/gh/guangyey/135/head 2025-08-14T21:21:50.1891981Z * [new branch] gh/guangyey/135/orig -> 
origin/gh/guangyey/135/orig 2025-08-14T21:21:50.1893155Z * [new branch] gh/guangyey/139/base -> origin/gh/guangyey/139/base 2025-08-14T21:21:50.1893401Z * [new branch] gh/guangyey/139/head -> origin/gh/guangyey/139/head 2025-08-14T21:21:50.1894479Z * [new branch] gh/guangyey/139/orig -> origin/gh/guangyey/139/orig 2025-08-14T21:21:50.1895439Z * [new branch] gh/guangyey/140/base -> origin/gh/guangyey/140/base 2025-08-14T21:21:50.1897421Z * [new branch] gh/guangyey/140/head -> origin/gh/guangyey/140/head 2025-08-14T21:21:50.1897588Z * [new branch] gh/guangyey/140/orig -> origin/gh/guangyey/140/orig 2025-08-14T21:21:50.1897751Z * [new branch] gh/guangyey/142/base -> origin/gh/guangyey/142/base 2025-08-14T21:21:50.1898164Z * [new branch] gh/guangyey/142/head -> origin/gh/guangyey/142/head 2025-08-14T21:21:50.1900772Z * [new branch] gh/guangyey/142/orig -> origin/gh/guangyey/142/orig 2025-08-14T21:21:50.1900968Z * [new branch] gh/guangyey/145/base -> origin/gh/guangyey/145/base 2025-08-14T21:21:50.1901127Z * [new branch] gh/guangyey/145/head -> origin/gh/guangyey/145/head 2025-08-14T21:21:50.1901544Z * [new branch] gh/guangyey/145/orig -> origin/gh/guangyey/145/orig 2025-08-14T21:21:50.1902236Z * [new branch] gh/guangyey/153/base -> origin/gh/guangyey/153/base 2025-08-14T21:21:50.1902872Z * [new branch] gh/guangyey/153/head -> origin/gh/guangyey/153/head 2025-08-14T21:21:50.1905771Z * [new branch] gh/guangyey/153/orig -> origin/gh/guangyey/153/orig 2025-08-14T21:21:50.1906081Z * [new branch] gh/guangyey/158/base -> origin/gh/guangyey/158/base 2025-08-14T21:21:50.1906249Z * [new branch] gh/guangyey/158/head -> origin/gh/guangyey/158/head 2025-08-14T21:21:50.1906553Z * [new branch] gh/guangyey/158/orig -> origin/gh/guangyey/158/orig 2025-08-14T21:21:50.1906961Z * [new branch] gh/guangyey/159/base -> origin/gh/guangyey/159/base 2025-08-14T21:21:50.1907972Z * [new branch] gh/guangyey/159/head -> origin/gh/guangyey/159/head 2025-08-14T21:21:50.1908328Z * [new branch] gh/guangyey/159/orig -> origin/gh/guangyey/159/orig 2025-08-14T21:21:50.1911393Z * [new branch] gh/guangyey/163/base -> origin/gh/guangyey/163/base 2025-08-14T21:21:50.1911720Z * [new branch] gh/guangyey/163/head -> origin/gh/guangyey/163/head 2025-08-14T21:21:50.1911895Z * [new branch] gh/guangyey/163/orig -> origin/gh/guangyey/163/orig 2025-08-14T21:21:50.1912131Z * [new branch] gh/guangyey/165/base -> origin/gh/guangyey/165/base 2025-08-14T21:21:50.1912305Z * [new branch] gh/guangyey/165/head -> origin/gh/guangyey/165/head 2025-08-14T21:21:50.1913973Z * [new branch] gh/guangyey/165/orig -> origin/gh/guangyey/165/orig 2025-08-14T21:21:50.1914238Z * [new branch] gh/guangyey/168/base -> origin/gh/guangyey/168/base 2025-08-14T21:21:50.1914731Z * [new branch] gh/guangyey/168/head -> origin/gh/guangyey/168/head 2025-08-14T21:21:50.1916382Z * [new branch] gh/guangyey/168/orig -> origin/gh/guangyey/168/orig 2025-08-14T21:21:50.1916600Z * [new branch] gh/guangyey/169/base -> origin/gh/guangyey/169/base 2025-08-14T21:21:50.1918850Z * [new branch] gh/guangyey/169/head -> origin/gh/guangyey/169/head 2025-08-14T21:21:50.1919203Z * [new branch] gh/guangyey/169/orig -> origin/gh/guangyey/169/orig 2025-08-14T21:21:50.1919446Z * [new branch] gh/guangyey/170/base -> origin/gh/guangyey/170/base 2025-08-14T21:21:50.1919875Z * [new branch] gh/guangyey/170/head -> origin/gh/guangyey/170/head 2025-08-14T21:21:50.1925698Z * [new branch] gh/guangyey/170/orig -> origin/gh/guangyey/170/orig 2025-08-14T21:21:50.1925878Z * [new branch] gh/guangyey/171/base -> 
origin/gh/guangyey/171/base 2025-08-14T21:21:50.1926041Z * [new branch] gh/guangyey/171/head -> origin/gh/guangyey/171/head 2025-08-14T21:21:50.1926185Z * [new branch] gh/guangyey/171/orig -> origin/gh/guangyey/171/orig 2025-08-14T21:21:50.1926330Z * [new branch] gh/guangyey/172/base -> origin/gh/guangyey/172/base 2025-08-14T21:21:50.1926482Z * [new branch] gh/guangyey/172/head -> origin/gh/guangyey/172/head 2025-08-14T21:21:50.1926625Z * [new branch] gh/guangyey/172/orig -> origin/gh/guangyey/172/orig 2025-08-14T21:21:50.1928803Z * [new branch] gh/guangyey/173/base -> origin/gh/guangyey/173/base 2025-08-14T21:21:50.1929513Z * [new branch] gh/guangyey/173/head -> origin/gh/guangyey/173/head 2025-08-14T21:21:50.1932727Z * [new branch] gh/guangyey/173/orig -> origin/gh/guangyey/173/orig 2025-08-14T21:21:50.1932933Z * [new branch] gh/guangyey/174/base -> origin/gh/guangyey/174/base 2025-08-14T21:21:50.1933110Z * [new branch] gh/guangyey/174/head -> origin/gh/guangyey/174/head 2025-08-14T21:21:50.1933264Z * [new branch] gh/guangyey/174/orig -> origin/gh/guangyey/174/orig 2025-08-14T21:21:50.1933415Z * [new branch] gh/guangyey/175/base -> origin/gh/guangyey/175/base 2025-08-14T21:21:50.1933593Z * [new branch] gh/guangyey/175/head -> origin/gh/guangyey/175/head 2025-08-14T21:21:50.1933750Z * [new branch] gh/guangyey/175/orig -> origin/gh/guangyey/175/orig 2025-08-14T21:21:50.1933897Z * [new branch] gh/guangyey/176/base -> origin/gh/guangyey/176/base 2025-08-14T21:21:50.1934087Z * [new branch] gh/guangyey/176/head -> origin/gh/guangyey/176/head 2025-08-14T21:21:50.1934231Z * [new branch] gh/guangyey/176/orig -> origin/gh/guangyey/176/orig 2025-08-14T21:21:50.1939210Z * [new branch] gh/guangyey/177/base -> origin/gh/guangyey/177/base 2025-08-14T21:21:50.1944221Z * [new branch] gh/guangyey/177/head -> origin/gh/guangyey/177/head 2025-08-14T21:21:50.1948873Z * [new branch] gh/guangyey/177/orig -> origin/gh/guangyey/177/orig 2025-08-14T21:21:50.1950811Z * [new branch] gh/guangyey/178/base -> origin/gh/guangyey/178/base 2025-08-14T21:21:50.1951134Z * [new branch] gh/guangyey/178/head -> origin/gh/guangyey/178/head 2025-08-14T21:21:50.1951384Z * [new branch] gh/guangyey/178/orig -> origin/gh/guangyey/178/orig 2025-08-14T21:21:50.1951592Z * [new branch] gh/guangyey/179/base -> origin/gh/guangyey/179/base 2025-08-14T21:21:50.1951929Z * [new branch] gh/guangyey/179/head -> origin/gh/guangyey/179/head 2025-08-14T21:21:50.1952221Z * [new branch] gh/guangyey/179/orig -> origin/gh/guangyey/179/orig 2025-08-14T21:21:50.1952394Z * [new branch] gh/guangyey/180/base -> origin/gh/guangyey/180/base 2025-08-14T21:21:50.1952673Z * [new branch] gh/guangyey/180/head -> origin/gh/guangyey/180/head 2025-08-14T21:21:50.1952825Z * [new branch] gh/guangyey/180/orig -> origin/gh/guangyey/180/orig 2025-08-14T21:21:50.1953099Z * [new branch] gh/guangyey/181/base -> origin/gh/guangyey/181/base 2025-08-14T21:21:50.1953337Z * [new branch] gh/guangyey/181/head -> origin/gh/guangyey/181/head 2025-08-14T21:21:50.1953967Z * [new branch] gh/guangyey/181/orig -> origin/gh/guangyey/181/orig 2025-08-14T21:21:50.1954345Z * [new branch] gh/guangyey/182/base -> origin/gh/guangyey/182/base 2025-08-14T21:21:50.1954560Z * [new branch] gh/guangyey/182/head -> origin/gh/guangyey/182/head 2025-08-14T21:21:50.1954825Z * [new branch] gh/guangyey/182/orig -> origin/gh/guangyey/182/orig 2025-08-14T21:21:50.1954991Z * [new branch] gh/guangyey/183/base -> origin/gh/guangyey/183/base 2025-08-14T21:21:50.1955221Z * [new branch] gh/guangyey/183/head -> 
origin/gh/guangyey/183/head 2025-08-14T21:21:50.1955494Z * [new branch] gh/guangyey/183/orig -> origin/gh/guangyey/183/orig 2025-08-14T21:21:50.1955653Z * [new branch] gh/guangyey/184/base -> origin/gh/guangyey/184/base 2025-08-14T21:21:50.1955917Z * [new branch] gh/guangyey/184/head -> origin/gh/guangyey/184/head 2025-08-14T21:21:50.1956081Z * [new branch] gh/guangyey/184/orig -> origin/gh/guangyey/184/orig 2025-08-14T21:21:50.1956308Z * [new branch] gh/guangyey/185/base -> origin/gh/guangyey/185/base 2025-08-14T21:21:50.1956577Z * [new branch] gh/guangyey/185/head -> origin/gh/guangyey/185/head 2025-08-14T21:21:50.1956748Z * [new branch] gh/guangyey/185/orig -> origin/gh/guangyey/185/orig 2025-08-14T21:21:50.1956922Z * [new branch] gh/guangyey/79/base -> origin/gh/guangyey/79/base 2025-08-14T21:21:50.1957090Z * [new branch] gh/guangyey/79/head -> origin/gh/guangyey/79/head 2025-08-14T21:21:50.1963430Z * [new branch] gh/guangyey/79/orig -> origin/gh/guangyey/79/orig 2025-08-14T21:21:50.1963623Z * [new branch] gh/guangyey/89/base -> origin/gh/guangyey/89/base 2025-08-14T21:21:50.1963775Z * [new branch] gh/guangyey/89/head -> origin/gh/guangyey/89/head 2025-08-14T21:21:50.1963923Z * [new branch] gh/guangyey/89/orig -> origin/gh/guangyey/89/orig 2025-08-14T21:21:50.1964165Z * [new branch] gh/guilhermeleobas/107/base -> origin/gh/guilhermeleobas/107/base 2025-08-14T21:21:50.1964350Z * [new branch] gh/guilhermeleobas/107/head -> origin/gh/guilhermeleobas/107/head 2025-08-14T21:21:50.1964523Z * [new branch] gh/guilhermeleobas/107/orig -> origin/gh/guilhermeleobas/107/orig 2025-08-14T21:21:50.1964697Z * [new branch] gh/guilhermeleobas/108/base -> origin/gh/guilhermeleobas/108/base 2025-08-14T21:21:50.1964860Z * [new branch] gh/guilhermeleobas/108/head -> origin/gh/guilhermeleobas/108/head 2025-08-14T21:21:50.1965031Z * [new branch] gh/guilhermeleobas/108/orig -> origin/gh/guilhermeleobas/108/orig 2025-08-14T21:21:50.1965243Z * [new branch] gh/guilhermeleobas/124/base -> origin/gh/guilhermeleobas/124/base 2025-08-14T21:21:50.1965420Z * [new branch] gh/guilhermeleobas/124/head -> origin/gh/guilhermeleobas/124/head 2025-08-14T21:21:50.1966176Z * [new branch] gh/guilhermeleobas/124/orig -> origin/gh/guilhermeleobas/124/orig 2025-08-14T21:21:50.1967314Z * [new branch] gh/guilhermeleobas/147/base -> origin/gh/guilhermeleobas/147/base 2025-08-14T21:21:50.1967636Z * [new branch] gh/guilhermeleobas/147/head -> origin/gh/guilhermeleobas/147/head 2025-08-14T21:21:50.1968944Z * [new branch] gh/guilhermeleobas/147/orig -> origin/gh/guilhermeleobas/147/orig 2025-08-14T21:21:50.1970513Z * [new branch] gh/guilhermeleobas/150/base -> origin/gh/guilhermeleobas/150/base 2025-08-14T21:21:50.1970707Z * [new branch] gh/guilhermeleobas/150/head -> origin/gh/guilhermeleobas/150/head 2025-08-14T21:21:50.1970869Z * [new branch] gh/guilhermeleobas/150/orig -> origin/gh/guilhermeleobas/150/orig 2025-08-14T21:21:50.1973539Z * [new branch] gh/guilhermeleobas/163/base -> origin/gh/guilhermeleobas/163/base 2025-08-14T21:21:50.1974270Z * [new branch] gh/guilhermeleobas/163/head -> origin/gh/guilhermeleobas/163/head 2025-08-14T21:21:50.1974489Z * [new branch] gh/guilhermeleobas/163/orig -> origin/gh/guilhermeleobas/163/orig 2025-08-14T21:21:50.1974667Z * [new branch] gh/guilhermeleobas/164/base -> origin/gh/guilhermeleobas/164/base 2025-08-14T21:21:50.1975106Z * [new branch] gh/guilhermeleobas/164/head -> origin/gh/guilhermeleobas/164/head 2025-08-14T21:21:50.1976549Z * [new branch] gh/guilhermeleobas/164/orig -> 
origin/gh/guilhermeleobas/164/orig 2025-08-14T21:21:50.1977195Z * [new branch] gh/guilhermeleobas/165/base -> origin/gh/guilhermeleobas/165/base 2025-08-14T21:21:50.1977422Z * [new branch] gh/guilhermeleobas/165/head -> origin/gh/guilhermeleobas/165/head 2025-08-14T21:21:50.1978281Z * [new branch] gh/guilhermeleobas/165/orig -> origin/gh/guilhermeleobas/165/orig 2025-08-14T21:21:50.1979924Z * [new branch] gh/guilhermeleobas/166/base -> origin/gh/guilhermeleobas/166/base 2025-08-14T21:21:50.1980284Z * [new branch] gh/guilhermeleobas/166/head -> origin/gh/guilhermeleobas/166/head 2025-08-14T21:21:50.1980492Z * [new branch] gh/guilhermeleobas/166/orig -> origin/gh/guilhermeleobas/166/orig 2025-08-14T21:21:50.1982227Z * [new branch] gh/guilhermeleobas/167/base -> origin/gh/guilhermeleobas/167/base 2025-08-14T21:21:50.1982654Z * [new branch] gh/guilhermeleobas/167/head -> origin/gh/guilhermeleobas/167/head 2025-08-14T21:21:50.1984127Z * [new branch] gh/guilhermeleobas/167/orig -> origin/gh/guilhermeleobas/167/orig 2025-08-14T21:21:50.1984464Z * [new branch] gh/guilhermeleobas/168/base -> origin/gh/guilhermeleobas/168/base 2025-08-14T21:21:50.1984675Z * [new branch] gh/guilhermeleobas/168/head -> origin/gh/guilhermeleobas/168/head 2025-08-14T21:21:50.1984881Z * [new branch] gh/guilhermeleobas/168/orig -> origin/gh/guilhermeleobas/168/orig 2025-08-14T21:21:50.1986890Z * [new branch] gh/guilhermeleobas/169/base -> origin/gh/guilhermeleobas/169/base 2025-08-14T21:21:50.1987100Z * [new branch] gh/guilhermeleobas/169/head -> origin/gh/guilhermeleobas/169/head 2025-08-14T21:21:50.1987300Z * [new branch] gh/guilhermeleobas/169/orig -> origin/gh/guilhermeleobas/169/orig 2025-08-14T21:21:50.1988789Z * [new branch] gh/guilhermeleobas/170/base -> origin/gh/guilhermeleobas/170/base 2025-08-14T21:21:50.1988987Z * [new branch] gh/guilhermeleobas/170/head -> origin/gh/guilhermeleobas/170/head 2025-08-14T21:21:50.1989635Z * [new branch] gh/guilhermeleobas/170/orig -> origin/gh/guilhermeleobas/170/orig 2025-08-14T21:21:50.1993549Z * [new branch] gh/guilhermeleobas/171/base -> origin/gh/guilhermeleobas/171/base 2025-08-14T21:21:50.1993756Z * [new branch] gh/guilhermeleobas/171/head -> origin/gh/guilhermeleobas/171/head 2025-08-14T21:21:50.1994089Z * [new branch] gh/guilhermeleobas/171/orig -> origin/gh/guilhermeleobas/171/orig 2025-08-14T21:21:50.1994269Z * [new branch] gh/guilhermeleobas/173/base -> origin/gh/guilhermeleobas/173/base 2025-08-14T21:21:50.1994438Z * [new branch] gh/guilhermeleobas/173/head -> origin/gh/guilhermeleobas/173/head 2025-08-14T21:21:50.1995052Z * [new branch] gh/guilhermeleobas/173/orig -> origin/gh/guilhermeleobas/173/orig 2025-08-14T21:21:50.1995229Z * [new branch] gh/guilhermeleobas/181/base -> origin/gh/guilhermeleobas/181/base 2025-08-14T21:21:50.1996330Z * [new branch] gh/guilhermeleobas/181/head -> origin/gh/guilhermeleobas/181/head 2025-08-14T21:21:50.1996522Z * [new branch] gh/guilhermeleobas/181/orig -> origin/gh/guilhermeleobas/181/orig 2025-08-14T21:21:50.2001270Z * [new branch] gh/guilhermeleobas/182/base -> origin/gh/guilhermeleobas/182/base 2025-08-14T21:21:50.2001915Z * [new branch] gh/guilhermeleobas/182/head -> origin/gh/guilhermeleobas/182/head 2025-08-14T21:21:50.2002110Z * [new branch] gh/guilhermeleobas/182/orig -> origin/gh/guilhermeleobas/182/orig 2025-08-14T21:21:50.2002274Z * [new branch] gh/guilhermeleobas/183/base -> origin/gh/guilhermeleobas/183/base 2025-08-14T21:21:50.2002443Z * [new branch] gh/guilhermeleobas/183/head -> origin/gh/guilhermeleobas/183/head 
2025-08-14T21:21:50.2002779Z * [new branch] gh/guilhermeleobas/183/orig -> origin/gh/guilhermeleobas/183/orig 2025-08-14T21:21:50.2002984Z * [new branch] gh/guilhermeleobas/184/base -> origin/gh/guilhermeleobas/184/base 2025-08-14T21:21:50.2005572Z * [new branch] gh/guilhermeleobas/184/head -> origin/gh/guilhermeleobas/184/head 2025-08-14T21:21:50.2006314Z * [new branch] gh/guilhermeleobas/184/orig -> origin/gh/guilhermeleobas/184/orig 2025-08-14T21:21:50.2006539Z * [new branch] gh/guilhermeleobas/185/base -> origin/gh/guilhermeleobas/185/base 2025-08-14T21:21:50.2006759Z * [new branch] gh/guilhermeleobas/185/head -> origin/gh/guilhermeleobas/185/head 2025-08-14T21:21:50.2006948Z * [new branch] gh/guilhermeleobas/185/orig -> origin/gh/guilhermeleobas/185/orig 2025-08-14T21:21:50.2008186Z * [new branch] gh/guilhermeleobas/188/base -> origin/gh/guilhermeleobas/188/base 2025-08-14T21:21:50.2012425Z * [new branch] gh/guilhermeleobas/188/head -> origin/gh/guilhermeleobas/188/head 2025-08-14T21:21:50.2017383Z * [new branch] gh/guilhermeleobas/188/orig -> origin/gh/guilhermeleobas/188/orig 2025-08-14T21:21:50.2022307Z * [new branch] gh/guilhermeleobas/189/base -> origin/gh/guilhermeleobas/189/base 2025-08-14T21:21:50.2022580Z * [new branch] gh/guilhermeleobas/189/head -> origin/gh/guilhermeleobas/189/head 2025-08-14T21:21:50.2023161Z * [new branch] gh/guilhermeleobas/189/orig -> origin/gh/guilhermeleobas/189/orig 2025-08-14T21:21:50.2023420Z * [new branch] gh/guilhermeleobas/190/base -> origin/gh/guilhermeleobas/190/base 2025-08-14T21:21:50.2023656Z * [new branch] gh/guilhermeleobas/190/head -> origin/gh/guilhermeleobas/190/head 2025-08-14T21:21:50.2023850Z * [new branch] gh/guilhermeleobas/190/orig -> origin/gh/guilhermeleobas/190/orig 2025-08-14T21:21:50.2024020Z * [new branch] gh/guilhermeleobas/192/base -> origin/gh/guilhermeleobas/192/base 2025-08-14T21:21:50.2024190Z * [new branch] gh/guilhermeleobas/192/head -> origin/gh/guilhermeleobas/192/head 2025-08-14T21:21:50.2024370Z * [new branch] gh/guilhermeleobas/192/orig -> origin/gh/guilhermeleobas/192/orig 2025-08-14T21:21:50.2024538Z * [new branch] gh/guilhermeleobas/193/base -> origin/gh/guilhermeleobas/193/base 2025-08-14T21:21:50.2024711Z * [new branch] gh/guilhermeleobas/193/head -> origin/gh/guilhermeleobas/193/head 2025-08-14T21:21:50.2025129Z * [new branch] gh/guilhermeleobas/193/orig -> origin/gh/guilhermeleobas/193/orig 2025-08-14T21:21:50.2025304Z * [new branch] gh/guilhermeleobas/194/base -> origin/gh/guilhermeleobas/194/base 2025-08-14T21:21:50.2025482Z * [new branch] gh/guilhermeleobas/194/head -> origin/gh/guilhermeleobas/194/head 2025-08-14T21:21:50.2025650Z * [new branch] gh/guilhermeleobas/194/orig -> origin/gh/guilhermeleobas/194/orig 2025-08-14T21:21:50.2025827Z * [new branch] gh/guilhermeleobas/203/base -> origin/gh/guilhermeleobas/203/base 2025-08-14T21:21:50.2025996Z * [new branch] gh/guilhermeleobas/203/head -> origin/gh/guilhermeleobas/203/head 2025-08-14T21:21:50.2026165Z * [new branch] gh/guilhermeleobas/203/orig -> origin/gh/guilhermeleobas/203/orig 2025-08-14T21:21:50.2026635Z * [new branch] gh/guilhermeleobas/204/base -> origin/gh/guilhermeleobas/204/base 2025-08-14T21:21:50.2026888Z * [new branch] gh/guilhermeleobas/204/head -> origin/gh/guilhermeleobas/204/head 2025-08-14T21:21:50.2027071Z * [new branch] gh/guilhermeleobas/204/orig -> origin/gh/guilhermeleobas/204/orig 2025-08-14T21:21:50.2027242Z * [new branch] gh/guilhermeleobas/205/base -> origin/gh/guilhermeleobas/205/base 2025-08-14T21:21:50.2027408Z * 
[new branch] gh/guilhermeleobas/205/head -> origin/gh/guilhermeleobas/205/head 2025-08-14T21:21:50.2027594Z * [new branch] gh/guilhermeleobas/205/orig -> origin/gh/guilhermeleobas/205/orig 2025-08-14T21:21:50.2028779Z * [new branch] gh/guilhermeleobas/206/base -> origin/gh/guilhermeleobas/206/base 2025-08-14T21:21:50.2028965Z * [new branch] gh/guilhermeleobas/206/head -> origin/gh/guilhermeleobas/206/head 2025-08-14T21:21:50.2029557Z * [new branch] gh/guilhermeleobas/206/orig -> origin/gh/guilhermeleobas/206/orig 2025-08-14T21:21:50.2029938Z * [new branch] gh/guilhermeleobas/207/base -> origin/gh/guilhermeleobas/207/base 2025-08-14T21:21:50.2030810Z * [new branch] gh/guilhermeleobas/207/head -> origin/gh/guilhermeleobas/207/head 2025-08-14T21:21:50.2031472Z * [new branch] gh/guilhermeleobas/207/orig -> origin/gh/guilhermeleobas/207/orig 2025-08-14T21:21:50.2032326Z * [new branch] gh/guilhermeleobas/208/base -> origin/gh/guilhermeleobas/208/base 2025-08-14T21:21:50.2032779Z * [new branch] gh/guilhermeleobas/208/head -> origin/gh/guilhermeleobas/208/head 2025-08-14T21:21:50.2037824Z * [new branch] gh/guilhermeleobas/208/orig -> origin/gh/guilhermeleobas/208/orig 2025-08-14T21:21:50.2038087Z * [new branch] gh/guilhermeleobas/209/base -> origin/gh/guilhermeleobas/209/base 2025-08-14T21:21:50.2038362Z * [new branch] gh/guilhermeleobas/209/head -> origin/gh/guilhermeleobas/209/head 2025-08-14T21:21:50.2038624Z * [new branch] gh/guilhermeleobas/209/orig -> origin/gh/guilhermeleobas/209/orig 2025-08-14T21:21:50.2038898Z * [new branch] gh/guilhermeleobas/210/base -> origin/gh/guilhermeleobas/210/base 2025-08-14T21:21:50.2039092Z * [new branch] gh/guilhermeleobas/210/head -> origin/gh/guilhermeleobas/210/head 2025-08-14T21:21:50.2039375Z * [new branch] gh/guilhermeleobas/210/orig -> origin/gh/guilhermeleobas/210/orig 2025-08-14T21:21:50.2039573Z * [new branch] gh/guilhermeleobas/211/base -> origin/gh/guilhermeleobas/211/base 2025-08-14T21:21:50.2039849Z * [new branch] gh/guilhermeleobas/211/head -> origin/gh/guilhermeleobas/211/head 2025-08-14T21:21:50.2043414Z * [new branch] gh/guilhermeleobas/211/orig -> origin/gh/guilhermeleobas/211/orig 2025-08-14T21:21:50.2043788Z * [new branch] gh/guilhermeleobas/212/base -> origin/gh/guilhermeleobas/212/base 2025-08-14T21:21:50.2044064Z * [new branch] gh/guilhermeleobas/212/head -> origin/gh/guilhermeleobas/212/head 2025-08-14T21:21:50.2044283Z * [new branch] gh/guilhermeleobas/212/orig -> origin/gh/guilhermeleobas/212/orig 2025-08-14T21:21:50.2044691Z * [new branch] gh/guilhermeleobas/213/base -> origin/gh/guilhermeleobas/213/base 2025-08-14T21:21:50.2044872Z * [new branch] gh/guilhermeleobas/213/head -> origin/gh/guilhermeleobas/213/head 2025-08-14T21:21:50.2045042Z * [new branch] gh/guilhermeleobas/213/orig -> origin/gh/guilhermeleobas/213/orig 2025-08-14T21:21:50.2045608Z * [new branch] gh/guilhermeleobas/214/base -> origin/gh/guilhermeleobas/214/base 2025-08-14T21:21:50.2046570Z * [new branch] gh/guilhermeleobas/214/head -> origin/gh/guilhermeleobas/214/head 2025-08-14T21:21:50.2046913Z * [new branch] gh/guilhermeleobas/214/orig -> origin/gh/guilhermeleobas/214/orig 2025-08-14T21:21:50.2048193Z * [new branch] gh/guilhermeleobas/215/base -> origin/gh/guilhermeleobas/215/base 2025-08-14T21:21:50.2048533Z * [new branch] gh/guilhermeleobas/215/head -> origin/gh/guilhermeleobas/215/head 2025-08-14T21:21:50.2049485Z * [new branch] gh/guilhermeleobas/215/orig -> origin/gh/guilhermeleobas/215/orig 2025-08-14T21:21:50.2054308Z * [new branch] 
gh/guilhermeleobas/216/base -> origin/gh/guilhermeleobas/216/base 2025-08-14T21:21:50.2054545Z * [new branch] gh/guilhermeleobas/216/head -> origin/gh/guilhermeleobas/216/head 2025-08-14T21:21:50.2054737Z * [new branch] gh/guilhermeleobas/216/orig -> origin/gh/guilhermeleobas/216/orig 2025-08-14T21:21:50.2055206Z * [new branch] gh/guilhermeleobas/217/base -> origin/gh/guilhermeleobas/217/base 2025-08-14T21:21:50.2055401Z * [new branch] gh/guilhermeleobas/217/head -> origin/gh/guilhermeleobas/217/head 2025-08-14T21:21:50.2055585Z * [new branch] gh/guilhermeleobas/217/orig -> origin/gh/guilhermeleobas/217/orig 2025-08-14T21:21:50.2060667Z * [new branch] gh/guilhermeleobas/218/base -> origin/gh/guilhermeleobas/218/base 2025-08-14T21:21:50.2060868Z * [new branch] gh/guilhermeleobas/218/head -> origin/gh/guilhermeleobas/218/head 2025-08-14T21:21:50.2061079Z * [new branch] gh/guilhermeleobas/218/orig -> origin/gh/guilhermeleobas/218/orig 2025-08-14T21:21:50.2061253Z * [new branch] gh/guilhermeleobas/219/base -> origin/gh/guilhermeleobas/219/base 2025-08-14T21:21:50.2061430Z * [new branch] gh/guilhermeleobas/219/head -> origin/gh/guilhermeleobas/219/head 2025-08-14T21:21:50.2061605Z * [new branch] gh/guilhermeleobas/219/orig -> origin/gh/guilhermeleobas/219/orig 2025-08-14T21:21:50.2061774Z * [new branch] gh/guilhermeleobas/220/base -> origin/gh/guilhermeleobas/220/base 2025-08-14T21:21:50.2061941Z * [new branch] gh/guilhermeleobas/220/head -> origin/gh/guilhermeleobas/220/head 2025-08-14T21:21:50.2062108Z * [new branch] gh/guilhermeleobas/220/orig -> origin/gh/guilhermeleobas/220/orig 2025-08-14T21:21:50.2068825Z * [new branch] gh/guilhermeleobas/221/base -> origin/gh/guilhermeleobas/221/base 2025-08-14T21:21:50.2069080Z * [new branch] gh/guilhermeleobas/221/head -> origin/gh/guilhermeleobas/221/head 2025-08-14T21:21:50.2069256Z * [new branch] gh/guilhermeleobas/221/orig -> origin/gh/guilhermeleobas/221/orig 2025-08-14T21:21:50.2069425Z * [new branch] gh/guilhermeleobas/222/base -> origin/gh/guilhermeleobas/222/base 2025-08-14T21:21:50.2069585Z * [new branch] gh/guilhermeleobas/222/head -> origin/gh/guilhermeleobas/222/head 2025-08-14T21:21:50.2069746Z * [new branch] gh/guilhermeleobas/222/orig -> origin/gh/guilhermeleobas/222/orig 2025-08-14T21:21:50.2069921Z * [new branch] gh/guilhermeleobas/223/base -> origin/gh/guilhermeleobas/223/base 2025-08-14T21:21:50.2070303Z * [new branch] gh/guilhermeleobas/223/head -> origin/gh/guilhermeleobas/223/head 2025-08-14T21:21:50.2070487Z * [new branch] gh/guilhermeleobas/223/orig -> origin/gh/guilhermeleobas/223/orig 2025-08-14T21:21:50.2070820Z * [new branch] gh/guilhermeleobas/224/base -> origin/gh/guilhermeleobas/224/base 2025-08-14T21:21:50.2070988Z * [new branch] gh/guilhermeleobas/224/head -> origin/gh/guilhermeleobas/224/head 2025-08-14T21:21:50.2071159Z * [new branch] gh/guilhermeleobas/224/orig -> origin/gh/guilhermeleobas/224/orig 2025-08-14T21:21:50.2071321Z * [new branch] gh/guilhermeleobas/225/base -> origin/gh/guilhermeleobas/225/base 2025-08-14T21:21:50.2071493Z * [new branch] gh/guilhermeleobas/225/head -> origin/gh/guilhermeleobas/225/head 2025-08-14T21:21:50.2077434Z * [new branch] gh/guilhermeleobas/225/orig -> origin/gh/guilhermeleobas/225/orig 2025-08-14T21:21:50.2077637Z * [new branch] gh/guilhermeleobas/226/base -> origin/gh/guilhermeleobas/226/base 2025-08-14T21:21:50.2077970Z * [new branch] gh/guilhermeleobas/226/head -> origin/gh/guilhermeleobas/226/head 2025-08-14T21:21:50.2078139Z * [new branch] gh/guilhermeleobas/226/orig -> 
origin/gh/guilhermeleobas/226/orig 2025-08-14T21:21:50.2078312Z * [new branch] gh/guilhermeleobas/227/base -> origin/gh/guilhermeleobas/227/base 2025-08-14T21:21:50.2078473Z * [new branch] gh/guilhermeleobas/227/head -> origin/gh/guilhermeleobas/227/head 2025-08-14T21:21:50.2078631Z * [new branch] gh/guilhermeleobas/227/orig -> origin/gh/guilhermeleobas/227/orig 2025-08-14T21:21:50.2081635Z * [new branch] gh/guilhermeleobas/228/base -> origin/gh/guilhermeleobas/228/base 2025-08-14T21:21:50.2081997Z * [new branch] gh/guilhermeleobas/228/head -> origin/gh/guilhermeleobas/228/head 2025-08-14T21:21:50.2082189Z * [new branch] gh/guilhermeleobas/228/orig -> origin/gh/guilhermeleobas/228/orig 2025-08-14T21:21:50.2082400Z * [new branch] gh/guilhermeleobas/229/base -> origin/gh/guilhermeleobas/229/base 2025-08-14T21:21:50.2082760Z * [new branch] gh/guilhermeleobas/229/head -> origin/gh/guilhermeleobas/229/head 2025-08-14T21:21:50.2083044Z * [new branch] gh/guilhermeleobas/229/orig -> origin/gh/guilhermeleobas/229/orig 2025-08-14T21:21:50.2083244Z * [new branch] gh/guilhermeleobas/230/base -> origin/gh/guilhermeleobas/230/base 2025-08-14T21:21:50.2083515Z * [new branch] gh/guilhermeleobas/230/head -> origin/gh/guilhermeleobas/230/head 2025-08-14T21:21:50.2083906Z * [new branch] gh/guilhermeleobas/230/orig -> origin/gh/guilhermeleobas/230/orig 2025-08-14T21:21:50.2084086Z * [new branch] gh/guilhermeleobas/231/base -> origin/gh/guilhermeleobas/231/base 2025-08-14T21:21:50.2084289Z * [new branch] gh/guilhermeleobas/231/head -> origin/gh/guilhermeleobas/231/head 2025-08-14T21:21:50.2084473Z * [new branch] gh/guilhermeleobas/231/orig -> origin/gh/guilhermeleobas/231/orig 2025-08-14T21:21:50.2085285Z * [new branch] gh/guilhermeleobas/232/base -> origin/gh/guilhermeleobas/232/base 2025-08-14T21:21:50.2085823Z * [new branch] gh/guilhermeleobas/232/head -> origin/gh/guilhermeleobas/232/head 2025-08-14T21:21:50.2086811Z * [new branch] gh/guilhermeleobas/232/orig -> origin/gh/guilhermeleobas/232/orig 2025-08-14T21:21:50.2087755Z * [new branch] gh/guilhermeleobas/233/base -> origin/gh/guilhermeleobas/233/base 2025-08-14T21:21:50.2088088Z * [new branch] gh/guilhermeleobas/233/head -> origin/gh/guilhermeleobas/233/head 2025-08-14T21:21:50.2089481Z * [new branch] gh/guilhermeleobas/233/orig -> origin/gh/guilhermeleobas/233/orig 2025-08-14T21:21:50.2092951Z * [new branch] gh/guilhermeleobas/73/base -> origin/gh/guilhermeleobas/73/base 2025-08-14T21:21:50.2093326Z * [new branch] gh/guilhermeleobas/73/head -> origin/gh/guilhermeleobas/73/head 2025-08-14T21:21:50.2093522Z * [new branch] gh/guilhermeleobas/73/orig -> origin/gh/guilhermeleobas/73/orig 2025-08-14T21:21:50.2093866Z * [new branch] gh/henrylhtsang/103/base -> origin/gh/henrylhtsang/103/base 2025-08-14T21:21:50.2094140Z * [new branch] gh/henrylhtsang/103/head -> origin/gh/henrylhtsang/103/head 2025-08-14T21:21:50.2094324Z * [new branch] gh/henrylhtsang/103/orig -> origin/gh/henrylhtsang/103/orig 2025-08-14T21:21:50.2096601Z * [new branch] gh/henrylhtsang/108/base -> origin/gh/henrylhtsang/108/base 2025-08-14T21:21:50.2096956Z * [new branch] gh/henrylhtsang/108/head -> origin/gh/henrylhtsang/108/head 2025-08-14T21:21:50.2097209Z * [new branch] gh/henrylhtsang/108/orig -> origin/gh/henrylhtsang/108/orig 2025-08-14T21:21:50.2099384Z * [new branch] gh/henrylhtsang/118/base -> origin/gh/henrylhtsang/118/base 2025-08-14T21:21:50.2099754Z * [new branch] gh/henrylhtsang/118/head -> origin/gh/henrylhtsang/118/head 2025-08-14T21:21:50.2100232Z * [new branch] 
gh/henrylhtsang/118/orig -> origin/gh/henrylhtsang/118/orig 2025-08-14T21:21:50.2100461Z * [new branch] gh/henrylhtsang/123/base -> origin/gh/henrylhtsang/123/base 2025-08-14T21:21:50.2102031Z * [new branch] gh/henrylhtsang/123/head -> origin/gh/henrylhtsang/123/head 2025-08-14T21:21:50.2102395Z * [new branch] gh/henrylhtsang/123/orig -> origin/gh/henrylhtsang/123/orig 2025-08-14T21:21:50.2105752Z * [new branch] gh/henrylhtsang/124/base -> origin/gh/henrylhtsang/124/base 2025-08-14T21:21:50.2112233Z * [new branch] gh/henrylhtsang/124/head -> origin/gh/henrylhtsang/124/head 2025-08-14T21:21:50.2112669Z * [new branch] gh/henrylhtsang/124/orig -> origin/gh/henrylhtsang/124/orig 2025-08-14T21:21:50.2112949Z * [new branch] gh/henrylhtsang/125/base -> origin/gh/henrylhtsang/125/base 2025-08-14T21:21:50.2113663Z * [new branch] gh/henrylhtsang/125/head -> origin/gh/henrylhtsang/125/head 2025-08-14T21:21:50.2113885Z * [new branch] gh/henrylhtsang/125/orig -> origin/gh/henrylhtsang/125/orig 2025-08-14T21:21:50.2114057Z * [new branch] gh/henrylhtsang/126/base -> origin/gh/henrylhtsang/126/base 2025-08-14T21:21:50.2114343Z * [new branch] gh/henrylhtsang/126/head -> origin/gh/henrylhtsang/126/head 2025-08-14T21:21:50.2116392Z * [new branch] gh/henrylhtsang/126/orig -> origin/gh/henrylhtsang/126/orig 2025-08-14T21:21:50.2116623Z * [new branch] gh/henrylhtsang/127/base -> origin/gh/henrylhtsang/127/base 2025-08-14T21:21:50.2117291Z * [new branch] gh/henrylhtsang/127/head -> origin/gh/henrylhtsang/127/head 2025-08-14T21:21:50.2117621Z * [new branch] gh/henrylhtsang/127/orig -> origin/gh/henrylhtsang/127/orig 2025-08-14T21:21:50.2120742Z * [new branch] gh/henrylhtsang/128/base -> origin/gh/henrylhtsang/128/base 2025-08-14T21:21:50.2121152Z * [new branch] gh/henrylhtsang/128/head -> origin/gh/henrylhtsang/128/head 2025-08-14T21:21:50.2121313Z * [new branch] gh/henrylhtsang/128/orig -> origin/gh/henrylhtsang/128/orig 2025-08-14T21:21:50.2121467Z * [new branch] gh/henrylhtsang/129/base -> origin/gh/henrylhtsang/129/base 2025-08-14T21:21:50.2121652Z * [new branch] gh/henrylhtsang/129/head -> origin/gh/henrylhtsang/129/head 2025-08-14T21:21:50.2122676Z * [new branch] gh/henrylhtsang/129/orig -> origin/gh/henrylhtsang/129/orig 2025-08-14T21:21:50.2123859Z * [new branch] gh/henrylhtsang/130/base -> origin/gh/henrylhtsang/130/base 2025-08-14T21:21:50.2125158Z * [new branch] gh/henrylhtsang/130/head -> origin/gh/henrylhtsang/130/head 2025-08-14T21:21:50.2125793Z * [new branch] gh/henrylhtsang/131/base -> origin/gh/henrylhtsang/131/base 2025-08-14T21:21:50.2126259Z * [new branch] gh/henrylhtsang/131/head -> origin/gh/henrylhtsang/131/head 2025-08-14T21:21:50.2126465Z * [new branch] gh/henrylhtsang/131/orig -> origin/gh/henrylhtsang/131/orig 2025-08-14T21:21:50.2129257Z * [new branch] gh/henrylhtsang/132/base -> origin/gh/henrylhtsang/132/base 2025-08-14T21:21:50.2129467Z * [new branch] gh/henrylhtsang/132/head -> origin/gh/henrylhtsang/132/head 2025-08-14T21:21:50.2129629Z * [new branch] gh/henrylhtsang/132/orig -> origin/gh/henrylhtsang/132/orig 2025-08-14T21:21:50.2135383Z * [new branch] gh/henrylhtsang/133/base -> origin/gh/henrylhtsang/133/base 2025-08-14T21:21:50.2135756Z * [new branch] gh/henrylhtsang/133/head -> origin/gh/henrylhtsang/133/head 2025-08-14T21:21:50.2136012Z * [new branch] gh/henrylhtsang/133/orig -> origin/gh/henrylhtsang/133/orig 2025-08-14T21:21:50.2136267Z * [new branch] gh/henrylhtsang/134/base -> origin/gh/henrylhtsang/134/base 2025-08-14T21:21:50.2136803Z * [new branch] 
gh/henrylhtsang/134/head -> origin/gh/henrylhtsang/134/head 2025-08-14T21:21:50.2137514Z * [new branch] gh/henrylhtsang/134/orig -> origin/gh/henrylhtsang/134/orig 2025-08-14T21:21:50.2137747Z * [new branch] gh/henrylhtsang/135/base -> origin/gh/henrylhtsang/135/base 2025-08-14T21:21:50.2137919Z * [new branch] gh/henrylhtsang/135/head -> origin/gh/henrylhtsang/135/head 2025-08-14T21:21:50.2138086Z * [new branch] gh/henrylhtsang/135/orig -> origin/gh/henrylhtsang/135/orig 2025-08-14T21:21:50.2138706Z * [new branch] gh/henrylhtsang/136/base -> origin/gh/henrylhtsang/136/base 2025-08-14T21:21:50.2138897Z * [new branch] gh/henrylhtsang/136/head -> origin/gh/henrylhtsang/136/head 2025-08-14T21:21:50.2139091Z * [new branch] gh/henrylhtsang/136/orig -> origin/gh/henrylhtsang/136/orig 2025-08-14T21:21:50.2139408Z * [new branch] gh/henrylhtsang/137/base -> origin/gh/henrylhtsang/137/base 2025-08-14T21:21:50.2139606Z * [new branch] gh/henrylhtsang/137/head -> origin/gh/henrylhtsang/137/head 2025-08-14T21:21:50.2140973Z * [new branch] gh/henrylhtsang/137/orig -> origin/gh/henrylhtsang/137/orig 2025-08-14T21:21:50.2141150Z * [new branch] gh/henrylhtsang/138/base -> origin/gh/henrylhtsang/138/base 2025-08-14T21:21:50.2142277Z * [new branch] gh/henrylhtsang/138/head -> origin/gh/henrylhtsang/138/head 2025-08-14T21:21:50.2142676Z * [new branch] gh/henrylhtsang/138/orig -> origin/gh/henrylhtsang/138/orig 2025-08-14T21:21:50.2147685Z * [new branch] gh/henrylhtsang/139/base -> origin/gh/henrylhtsang/139/base 2025-08-14T21:21:50.2147974Z * [new branch] gh/henrylhtsang/139/head -> origin/gh/henrylhtsang/139/head 2025-08-14T21:21:50.2148629Z * [new branch] gh/henrylhtsang/139/orig -> origin/gh/henrylhtsang/139/orig 2025-08-14T21:21:50.2149036Z * [new branch] gh/henrylhtsang/140/base -> origin/gh/henrylhtsang/140/base 2025-08-14T21:21:50.2149214Z * [new branch] gh/henrylhtsang/140/head -> origin/gh/henrylhtsang/140/head 2025-08-14T21:21:50.2149511Z * [new branch] gh/henrylhtsang/140/orig -> origin/gh/henrylhtsang/140/orig 2025-08-14T21:21:50.2149695Z * [new branch] gh/henrylhtsang/141/base -> origin/gh/henrylhtsang/141/base 2025-08-14T21:21:50.2149970Z * [new branch] gh/henrylhtsang/141/head -> origin/gh/henrylhtsang/141/head 2025-08-14T21:21:50.2150152Z * [new branch] gh/henrylhtsang/141/orig -> origin/gh/henrylhtsang/141/orig 2025-08-14T21:21:50.2160662Z * [new branch] gh/henrylhtsang/142/base -> origin/gh/henrylhtsang/142/base 2025-08-14T21:21:50.2160982Z * [new branch] gh/henrylhtsang/142/head -> origin/gh/henrylhtsang/142/head 2025-08-14T21:21:50.2161479Z * [new branch] gh/henrylhtsang/142/orig -> origin/gh/henrylhtsang/142/orig 2025-08-14T21:21:50.2161668Z * [new branch] gh/henrylhtsang/143/base -> origin/gh/henrylhtsang/143/base 2025-08-14T21:21:50.2161821Z * [new branch] gh/henrylhtsang/143/head -> origin/gh/henrylhtsang/143/head 2025-08-14T21:21:50.2161972Z * [new branch] gh/henrylhtsang/143/orig -> origin/gh/henrylhtsang/143/orig 2025-08-14T21:21:50.2162139Z * [new branch] gh/henrylhtsang/144/base -> origin/gh/henrylhtsang/144/base 2025-08-14T21:21:50.2162326Z * [new branch] gh/henrylhtsang/144/head -> origin/gh/henrylhtsang/144/head 2025-08-14T21:21:50.2162517Z * [new branch] gh/henrylhtsang/144/orig -> origin/gh/henrylhtsang/144/orig 2025-08-14T21:21:50.2162703Z * [new branch] gh/henrylhtsang/145/base -> origin/gh/henrylhtsang/145/base 2025-08-14T21:21:50.2163001Z * [new branch] gh/henrylhtsang/145/head -> origin/gh/henrylhtsang/145/head 2025-08-14T21:21:50.2163173Z * [new branch] 
gh/henrylhtsang/145/orig -> origin/gh/henrylhtsang/145/orig 2025-08-14T21:21:50.2163368Z * [new branch] gh/henrylhtsang/146/base -> origin/gh/henrylhtsang/146/base 2025-08-14T21:21:50.2163544Z * [new branch] gh/henrylhtsang/146/head -> origin/gh/henrylhtsang/146/head 2025-08-14T21:21:50.2163696Z * [new branch] gh/henrylhtsang/146/orig -> origin/gh/henrylhtsang/146/orig 2025-08-14T21:21:50.2163857Z * [new branch] gh/huydhn/1/head -> origin/gh/huydhn/1/head 2025-08-14T21:21:50.2163996Z * [new branch] gh/huydhn/1/next -> origin/gh/huydhn/1/next 2025-08-14T21:21:50.2165708Z * [new branch] gh/huydhn/2/head -> origin/gh/huydhn/2/head 2025-08-14T21:21:50.2165883Z * [new branch] gh/huydhn/2/next -> origin/gh/huydhn/2/next 2025-08-14T21:21:50.2166052Z * [new branch] gh/huydhn/2/orig -> origin/gh/huydhn/2/orig 2025-08-14T21:21:50.2166212Z * [new branch] gh/huydhn/3/head -> origin/gh/huydhn/3/head 2025-08-14T21:21:50.2167908Z * [new branch] gh/huydhn/3/next -> origin/gh/huydhn/3/next 2025-08-14T21:21:50.2168358Z * [new branch] gh/huydhn/3/orig -> origin/gh/huydhn/3/orig 2025-08-14T21:21:50.2168557Z * [new branch] gh/huydhn/4/head -> origin/gh/huydhn/4/head 2025-08-14T21:21:50.2183031Z * [new branch] gh/huydhn/4/next -> origin/gh/huydhn/4/next 2025-08-14T21:21:50.2183281Z * [new branch] gh/huydhn/4/orig -> origin/gh/huydhn/4/orig 2025-08-14T21:21:50.2183443Z * [new branch] gh/huydhn/5/head -> origin/gh/huydhn/5/head 2025-08-14T21:21:50.2183693Z * [new branch] gh/huydhn/5/next -> origin/gh/huydhn/5/next 2025-08-14T21:21:50.2183900Z * [new branch] gh/huydhn/5/orig -> origin/gh/huydhn/5/orig 2025-08-14T21:21:50.2184048Z * [new branch] gh/huydhn/6/head -> origin/gh/huydhn/6/head 2025-08-14T21:21:50.2184194Z * [new branch] gh/huydhn/6/next -> origin/gh/huydhn/6/next 2025-08-14T21:21:50.2184337Z * [new branch] gh/huydhn/6/orig -> origin/gh/huydhn/6/orig 2025-08-14T21:21:50.2184481Z * [new branch] gh/int3/97/base -> origin/gh/int3/97/base 2025-08-14T21:21:50.2184632Z * [new branch] gh/int3/97/head -> origin/gh/int3/97/head 2025-08-14T21:21:50.2184797Z * [new branch] gh/isuruf/101/base -> origin/gh/isuruf/101/base 2025-08-14T21:21:50.2184962Z * [new branch] gh/isuruf/101/head -> origin/gh/isuruf/101/head 2025-08-14T21:21:50.2185110Z * [new branch] gh/isuruf/116/base -> origin/gh/isuruf/116/base 2025-08-14T21:21:50.2185258Z * [new branch] gh/isuruf/116/head -> origin/gh/isuruf/116/head 2025-08-14T21:21:50.2185552Z * [new branch] gh/isuruf/116/orig -> origin/gh/isuruf/116/orig 2025-08-14T21:21:50.2185703Z * [new branch] gh/isuruf/141/base -> origin/gh/isuruf/141/base 2025-08-14T21:21:50.2185853Z * [new branch] gh/isuruf/141/head -> origin/gh/isuruf/141/head 2025-08-14T21:21:50.2185995Z * [new branch] gh/isuruf/141/orig -> origin/gh/isuruf/141/orig 2025-08-14T21:21:50.2186138Z * [new branch] gh/isuruf/142/base -> origin/gh/isuruf/142/base 2025-08-14T21:21:50.2186304Z * [new branch] gh/isuruf/142/head -> origin/gh/isuruf/142/head 2025-08-14T21:21:50.2186454Z * [new branch] gh/isuruf/142/orig -> origin/gh/isuruf/142/orig 2025-08-14T21:21:50.2186610Z * [new branch] gh/isuruf/81/base -> origin/gh/isuruf/81/base 2025-08-14T21:21:50.2186793Z * [new branch] gh/isuruf/81/head -> origin/gh/isuruf/81/head 2025-08-14T21:21:50.2187055Z * [new branch] gh/isuruf/81/orig -> origin/gh/isuruf/81/orig 2025-08-14T21:21:50.2187568Z * [new branch] gh/jamesjwu/140/base -> origin/gh/jamesjwu/140/base 2025-08-14T21:21:50.2192153Z * [new branch] gh/jamesjwu/140/head -> origin/gh/jamesjwu/140/head 2025-08-14T21:21:50.2192502Z * [new 
branch] gh/jamesjwu/140/orig -> origin/gh/jamesjwu/140/orig 2025-08-14T21:21:50.2192747Z * [new branch] gh/jamesjwu/150/base -> origin/gh/jamesjwu/150/base 2025-08-14T21:21:50.2193385Z * [new branch] gh/jamesjwu/150/head -> origin/gh/jamesjwu/150/head 2025-08-14T21:21:50.2193579Z * [new branch] gh/jamesjwu/150/orig -> origin/gh/jamesjwu/150/orig 2025-08-14T21:21:50.2193730Z * [new branch] gh/jamesjwu/154/base -> origin/gh/jamesjwu/154/base 2025-08-14T21:21:50.2193906Z * [new branch] gh/jamesjwu/154/head -> origin/gh/jamesjwu/154/head 2025-08-14T21:21:50.2194245Z * [new branch] gh/jamesjwu/154/orig -> origin/gh/jamesjwu/154/orig 2025-08-14T21:21:50.2194694Z * [new branch] gh/jamesjwu/155/base -> origin/gh/jamesjwu/155/base 2025-08-14T21:21:50.2201692Z * [new branch] gh/jamesjwu/155/head -> origin/gh/jamesjwu/155/head 2025-08-14T21:21:50.2203576Z * [new branch] gh/jamesjwu/155/orig -> origin/gh/jamesjwu/155/orig 2025-08-14T21:21:50.2203730Z * [new branch] gh/jamesjwu/159/base -> origin/gh/jamesjwu/159/base 2025-08-14T21:21:50.2204030Z * [new branch] gh/jamesjwu/159/head -> origin/gh/jamesjwu/159/head 2025-08-14T21:21:50.2204188Z * [new branch] gh/jamesjwu/159/orig -> origin/gh/jamesjwu/159/orig 2025-08-14T21:21:50.2204347Z * [new branch] gh/jamesjwu/163/base -> origin/gh/jamesjwu/163/base 2025-08-14T21:21:50.2204502Z * [new branch] gh/jamesjwu/163/head -> origin/gh/jamesjwu/163/head 2025-08-14T21:21:50.2204642Z * [new branch] gh/jamesjwu/163/orig -> origin/gh/jamesjwu/163/orig 2025-08-14T21:21:50.2204796Z * [new branch] gh/jamesjwu/171/base -> origin/gh/jamesjwu/171/base 2025-08-14T21:21:50.2204947Z * [new branch] gh/jamesjwu/171/head -> origin/gh/jamesjwu/171/head 2025-08-14T21:21:50.2205329Z * [new branch] gh/jamesjwu/171/orig -> origin/gh/jamesjwu/171/orig 2025-08-14T21:21:50.2205773Z * [new branch] gh/jamesjwu/174/base -> origin/gh/jamesjwu/174/base 2025-08-14T21:21:50.2205956Z * [new branch] gh/jamesjwu/174/head -> origin/gh/jamesjwu/174/head 2025-08-14T21:21:50.2206120Z * [new branch] gh/jamesjwu/174/orig -> origin/gh/jamesjwu/174/orig 2025-08-14T21:21:50.2206332Z * [new branch] gh/jamesjwu/175/base -> origin/gh/jamesjwu/175/base 2025-08-14T21:21:50.2207604Z * [new branch] gh/jamesjwu/175/head -> origin/gh/jamesjwu/175/head 2025-08-14T21:21:50.2208045Z * [new branch] gh/jamesjwu/175/orig -> origin/gh/jamesjwu/175/orig 2025-08-14T21:21:50.2209134Z * [new branch] gh/jamesjwu/176/base -> origin/gh/jamesjwu/176/base 2025-08-14T21:21:50.2209540Z * [new branch] gh/jamesjwu/176/head -> origin/gh/jamesjwu/176/head 2025-08-14T21:21:50.2214251Z * [new branch] gh/jamesjwu/176/orig -> origin/gh/jamesjwu/176/orig 2025-08-14T21:21:50.2214482Z * [new branch] gh/jamesjwu/177/base -> origin/gh/jamesjwu/177/base 2025-08-14T21:21:50.2214632Z * [new branch] gh/jamesjwu/177/head -> origin/gh/jamesjwu/177/head 2025-08-14T21:21:50.2214780Z * [new branch] gh/jamesjwu/177/orig -> origin/gh/jamesjwu/177/orig 2025-08-14T21:21:50.2215139Z * [new branch] gh/jamesjwu/178/base -> origin/gh/jamesjwu/178/base 2025-08-14T21:21:50.2215315Z * [new branch] gh/jamesjwu/178/head -> origin/gh/jamesjwu/178/head 2025-08-14T21:21:50.2215492Z * [new branch] gh/jamesjwu/178/orig -> origin/gh/jamesjwu/178/orig 2025-08-14T21:21:50.2219663Z * [new branch] gh/jamesjwu/179/base -> origin/gh/jamesjwu/179/base 2025-08-14T21:21:50.2222991Z * [new branch] gh/jamesjwu/179/head -> origin/gh/jamesjwu/179/head 2025-08-14T21:21:50.2223367Z * [new branch] gh/jamesjwu/179/orig -> origin/gh/jamesjwu/179/orig 2025-08-14T21:21:50.2223734Z * [new 
branch] gh/jamesjwu/180/base -> origin/gh/jamesjwu/180/base 2025-08-14T21:21:50.2224148Z * [new branch] gh/jamesjwu/180/head -> origin/gh/jamesjwu/180/head 2025-08-14T21:21:50.2224491Z * [new branch] gh/jamesjwu/180/orig -> origin/gh/jamesjwu/180/orig 2025-08-14T21:21:50.2224839Z * [new branch] gh/jamesjwu/181/base -> origin/gh/jamesjwu/181/base 2025-08-14T21:21:50.2225169Z * [new branch] gh/jamesjwu/181/head -> origin/gh/jamesjwu/181/head 2025-08-14T21:21:50.2225507Z * [new branch] gh/jamesjwu/181/orig -> origin/gh/jamesjwu/181/orig 2025-08-14T21:21:50.2225858Z * [new branch] gh/jamesjwu/182/base -> origin/gh/jamesjwu/182/base 2025-08-14T21:21:50.2227963Z * [new branch] gh/jamesjwu/182/head -> origin/gh/jamesjwu/182/head 2025-08-14T21:21:50.2228375Z * [new branch] gh/jamesjwu/182/orig -> origin/gh/jamesjwu/182/orig 2025-08-14T21:21:50.2232134Z * [new branch] gh/jamesjwu/183/base -> origin/gh/jamesjwu/183/base 2025-08-14T21:21:50.2232592Z * [new branch] gh/jamesjwu/183/head -> origin/gh/jamesjwu/183/head 2025-08-14T21:21:50.2232960Z * [new branch] gh/jamesjwu/183/orig -> origin/gh/jamesjwu/183/orig 2025-08-14T21:21:50.2233397Z * [new branch] gh/jamesjwu/184/base -> origin/gh/jamesjwu/184/base 2025-08-14T21:21:50.2233768Z * [new branch] gh/jamesjwu/184/head -> origin/gh/jamesjwu/184/head 2025-08-14T21:21:50.2234112Z * [new branch] gh/jamesjwu/184/orig -> origin/gh/jamesjwu/184/orig 2025-08-14T21:21:50.2234470Z * [new branch] gh/jamesjwu/52/base -> origin/gh/jamesjwu/52/base 2025-08-14T21:21:50.2236431Z * [new branch] gh/jamesjwu/52/head -> origin/gh/jamesjwu/52/head 2025-08-14T21:21:50.2236807Z * [new branch] gh/jamesjwu/53/base -> origin/gh/jamesjwu/53/base 2025-08-14T21:21:50.2237156Z * [new branch] gh/jamesjwu/53/head -> origin/gh/jamesjwu/53/head 2025-08-14T21:21:50.2237503Z * [new branch] gh/jamesjwu/54/base -> origin/gh/jamesjwu/54/base 2025-08-14T21:21:50.2237850Z * [new branch] gh/jamesjwu/54/head -> origin/gh/jamesjwu/54/head 2025-08-14T21:21:50.2240288Z * [new branch] gh/jamesjwu/55/base -> origin/gh/jamesjwu/55/base 2025-08-14T21:21:50.2240712Z * [new branch] gh/jamesjwu/55/head -> origin/gh/jamesjwu/55/head 2025-08-14T21:21:50.2244922Z * [new branch] gh/jamesjwu/56/base -> origin/gh/jamesjwu/56/base 2025-08-14T21:21:50.2245358Z * [new branch] gh/jamesjwu/56/head -> origin/gh/jamesjwu/56/head 2025-08-14T21:21:50.2245740Z * [new branch] gh/jamesjwu/57/base -> origin/gh/jamesjwu/57/base 2025-08-14T21:21:50.2246100Z * [new branch] gh/jamesjwu/57/head -> origin/gh/jamesjwu/57/head 2025-08-14T21:21:50.2246470Z * [new branch] gh/jamesjwu/58/base -> origin/gh/jamesjwu/58/base 2025-08-14T21:21:50.2246824Z * [new branch] gh/jamesjwu/58/head -> origin/gh/jamesjwu/58/head 2025-08-14T21:21:50.2247371Z * [new branch] gh/jamesjwu/59/base -> origin/gh/jamesjwu/59/base 2025-08-14T21:21:50.2247727Z * [new branch] gh/jamesjwu/59/head -> origin/gh/jamesjwu/59/head 2025-08-14T21:21:50.2248090Z * [new branch] gh/jamesjwu/60/base -> origin/gh/jamesjwu/60/base 2025-08-14T21:21:50.2248445Z * [new branch] gh/jamesjwu/60/head -> origin/gh/jamesjwu/60/head 2025-08-14T21:21:50.2249249Z * [new branch] gh/jamesjwu/61/base -> origin/gh/jamesjwu/61/base 2025-08-14T21:21:50.2249606Z * [new branch] gh/jamesjwu/61/head -> origin/gh/jamesjwu/61/head 2025-08-14T21:21:50.2249960Z * [new branch] gh/jamesjwu/62/base -> origin/gh/jamesjwu/62/base 2025-08-14T21:21:50.2250317Z * [new branch] gh/jamesjwu/62/head -> origin/gh/jamesjwu/62/head 2025-08-14T21:21:50.2250665Z * [new branch] gh/jamesjwu/63/base -> 
origin/gh/jamesjwu/63/base 2025-08-14T21:21:50.2251025Z * [new branch] gh/jamesjwu/63/head -> origin/gh/jamesjwu/63/head 2025-08-14T21:21:50.2251389Z * [new branch] gh/jamesjwu/64/base -> origin/gh/jamesjwu/64/base 2025-08-14T21:21:50.2251784Z * [new branch] gh/jamesjwu/64/head -> origin/gh/jamesjwu/64/head 2025-08-14T21:21:50.2252147Z * [new branch] gh/jamesjwu/65/base -> origin/gh/jamesjwu/65/base 2025-08-14T21:21:50.2252518Z * [new branch] gh/jamesjwu/65/head -> origin/gh/jamesjwu/65/head 2025-08-14T21:21:50.2252942Z * [new branch] gh/janeyx99/165/base -> origin/gh/janeyx99/165/base 2025-08-14T21:21:50.2253685Z * [new branch] gh/janeyx99/165/head -> origin/gh/janeyx99/165/head 2025-08-14T21:21:50.2254034Z * [new branch] gh/janeyx99/165/orig -> origin/gh/janeyx99/165/orig 2025-08-14T21:21:50.2254382Z * [new branch] gh/janeyx99/201/base -> origin/gh/janeyx99/201/base 2025-08-14T21:21:50.2258349Z * [new branch] gh/janeyx99/201/head -> origin/gh/janeyx99/201/head 2025-08-14T21:21:50.2258774Z * [new branch] gh/janeyx99/201/orig -> origin/gh/janeyx99/201/orig 2025-08-14T21:21:50.2259135Z * [new branch] gh/janeyx99/225/base -> origin/gh/janeyx99/225/base 2025-08-14T21:21:50.2259484Z * [new branch] gh/janeyx99/225/head -> origin/gh/janeyx99/225/head 2025-08-14T21:21:50.2259895Z * [new branch] gh/janeyx99/225/orig -> origin/gh/janeyx99/225/orig 2025-08-14T21:21:50.2265665Z * [new branch] gh/janeyx99/256/base -> origin/gh/janeyx99/256/base 2025-08-14T21:21:50.2267191Z * [new branch] gh/janeyx99/256/head -> origin/gh/janeyx99/256/head 2025-08-14T21:21:50.2268064Z * [new branch] gh/janeyx99/256/orig -> origin/gh/janeyx99/256/orig 2025-08-14T21:21:50.2272865Z * [new branch] gh/janeyx99/268/base -> origin/gh/janeyx99/268/base 2025-08-14T21:21:50.2276959Z * [new branch] gh/janeyx99/268/head -> origin/gh/janeyx99/268/head 2025-08-14T21:21:50.2277345Z * [new branch] gh/janeyx99/268/orig -> origin/gh/janeyx99/268/orig 2025-08-14T21:21:50.2277720Z * [new branch] gh/janeyx99/269/base -> origin/gh/janeyx99/269/base 2025-08-14T21:21:50.2278056Z * [new branch] gh/janeyx99/269/head -> origin/gh/janeyx99/269/head 2025-08-14T21:21:50.2278395Z * [new branch] gh/janeyx99/269/orig -> origin/gh/janeyx99/269/orig 2025-08-14T21:21:50.2278732Z * [new branch] gh/janeyx99/274/base -> origin/gh/janeyx99/274/base 2025-08-14T21:21:50.2279058Z * [new branch] gh/janeyx99/274/head -> origin/gh/janeyx99/274/head 2025-08-14T21:21:50.2279387Z * [new branch] gh/janeyx99/274/orig -> origin/gh/janeyx99/274/orig 2025-08-14T21:21:50.2279766Z * [new branch] gh/janeyx99/276/base -> origin/gh/janeyx99/276/base 2025-08-14T21:21:50.2280091Z * [new branch] gh/janeyx99/276/head -> origin/gh/janeyx99/276/head 2025-08-14T21:21:50.2280422Z * [new branch] gh/janeyx99/276/orig -> origin/gh/janeyx99/276/orig 2025-08-14T21:21:50.2280762Z * [new branch] gh/janeyx99/277/base -> origin/gh/janeyx99/277/base 2025-08-14T21:21:50.2281101Z * [new branch] gh/janeyx99/277/head -> origin/gh/janeyx99/277/head 2025-08-14T21:21:50.2281435Z * [new branch] gh/janeyx99/277/orig -> origin/gh/janeyx99/277/orig 2025-08-14T21:21:50.2281776Z * [new branch] gh/janeyx99/278/base -> origin/gh/janeyx99/278/base 2025-08-14T21:21:50.2282121Z * [new branch] gh/janeyx99/278/head -> origin/gh/janeyx99/278/head 2025-08-14T21:21:50.2282472Z * [new branch] gh/janeyx99/278/orig -> origin/gh/janeyx99/278/orig 2025-08-14T21:21:50.2282799Z * [new branch] gh/janeyx99/279/base -> origin/gh/janeyx99/279/base 2025-08-14T21:21:50.2283133Z * [new branch] gh/janeyx99/279/head -> 
origin/gh/janeyx99/279/head 2025-08-14T21:21:50.2283464Z * [new branch] gh/janeyx99/279/orig -> origin/gh/janeyx99/279/orig 2025-08-14T21:21:50.2283800Z * [new branch] gh/janeyx99/280/base -> origin/gh/janeyx99/280/base 2025-08-14T21:21:50.2284130Z * [new branch] gh/janeyx99/280/head -> origin/gh/janeyx99/280/head 2025-08-14T21:21:50.2284606Z * [new branch] gh/janeyx99/280/orig -> origin/gh/janeyx99/280/orig 2025-08-14T21:21:50.2284959Z * [new branch] gh/janeyx99/281/base -> origin/gh/janeyx99/281/base 2025-08-14T21:21:50.2285299Z * [new branch] gh/janeyx99/281/head -> origin/gh/janeyx99/281/head 2025-08-14T21:21:50.2285633Z * [new branch] gh/janeyx99/281/orig -> origin/gh/janeyx99/281/orig 2025-08-14T21:21:50.2285983Z * [new branch] gh/janeyx99/282/base -> origin/gh/janeyx99/282/base 2025-08-14T21:21:50.2286325Z * [new branch] gh/janeyx99/282/head -> origin/gh/janeyx99/282/head 2025-08-14T21:21:50.2286656Z * [new branch] gh/janeyx99/282/orig -> origin/gh/janeyx99/282/orig 2025-08-14T21:21:50.2286998Z * [new branch] gh/janeyx99/283/base -> origin/gh/janeyx99/283/base 2025-08-14T21:21:50.2287346Z * [new branch] gh/janeyx99/283/head -> origin/gh/janeyx99/283/head 2025-08-14T21:21:50.2287689Z * [new branch] gh/janeyx99/283/orig -> origin/gh/janeyx99/283/orig 2025-08-14T21:21:50.2288405Z * [new branch] gh/janeyx99/284/base -> origin/gh/janeyx99/284/base 2025-08-14T21:21:50.2288960Z * [new branch] gh/janeyx99/284/head -> origin/gh/janeyx99/284/head 2025-08-14T21:21:50.2289378Z * [new branch] gh/janeyx99/284/orig -> origin/gh/janeyx99/284/orig 2025-08-14T21:21:50.2290767Z * [new branch] gh/janeyx99/285/base -> origin/gh/janeyx99/285/base 2025-08-14T21:21:50.2291427Z * [new branch] gh/janeyx99/285/head -> origin/gh/janeyx99/285/head 2025-08-14T21:21:50.2292820Z * [new branch] gh/janeyx99/285/orig -> origin/gh/janeyx99/285/orig 2025-08-14T21:21:50.2293204Z * [new branch] gh/janeyx99/286/base -> origin/gh/janeyx99/286/base 2025-08-14T21:21:50.2294737Z * [new branch] gh/janeyx99/286/head -> origin/gh/janeyx99/286/head 2025-08-14T21:21:50.2295085Z * [new branch] gh/janeyx99/286/orig -> origin/gh/janeyx99/286/orig 2025-08-14T21:21:50.2296333Z * [new branch] gh/janeyx99/287/base -> origin/gh/janeyx99/287/base 2025-08-14T21:21:50.2297076Z * [new branch] gh/janeyx99/287/head -> origin/gh/janeyx99/287/head 2025-08-14T21:21:50.2297665Z * [new branch] gh/janeyx99/287/orig -> origin/gh/janeyx99/287/orig 2025-08-14T21:21:50.2298093Z * [new branch] gh/janeyx99/288/base -> origin/gh/janeyx99/288/base 2025-08-14T21:21:50.2298456Z * [new branch] gh/janeyx99/288/head -> origin/gh/janeyx99/288/head 2025-08-14T21:21:50.2299707Z * [new branch] gh/janeyx99/288/orig -> origin/gh/janeyx99/288/orig 2025-08-14T21:21:50.2301191Z * [new branch] gh/janeyx99/289/base -> origin/gh/janeyx99/289/base 2025-08-14T21:21:50.2301602Z * [new branch] gh/janeyx99/289/head -> origin/gh/janeyx99/289/head 2025-08-14T21:21:50.2302156Z * [new branch] gh/janeyx99/289/orig -> origin/gh/janeyx99/289/orig 2025-08-14T21:21:50.2302969Z * [new branch] gh/janeyx99/290/base -> origin/gh/janeyx99/290/base 2025-08-14T21:21:50.2303622Z * [new branch] gh/janeyx99/290/head -> origin/gh/janeyx99/290/head 2025-08-14T21:21:50.2304307Z * [new branch] gh/janeyx99/290/orig -> origin/gh/janeyx99/290/orig 2025-08-14T21:21:50.2305888Z * [new branch] gh/janeyx99/291/base -> origin/gh/janeyx99/291/base 2025-08-14T21:21:50.2306311Z * [new branch] gh/janeyx99/291/head -> origin/gh/janeyx99/291/head 2025-08-14T21:21:50.2307352Z * [new branch] gh/janeyx99/291/orig -> 
origin/gh/janeyx99/291/orig 2025-08-14T21:21:50.2307834Z * [new branch] gh/janeyx99/292/base -> origin/gh/janeyx99/292/base 2025-08-14T21:21:50.2310465Z * [new branch] gh/janeyx99/292/head -> origin/gh/janeyx99/292/head 2025-08-14T21:21:50.2311004Z * [new branch] gh/janeyx99/292/orig -> origin/gh/janeyx99/292/orig 2025-08-14T21:21:50.2311478Z * [new branch] gh/janeyx99/293/base -> origin/gh/janeyx99/293/base 2025-08-14T21:21:50.2311829Z * [new branch] gh/janeyx99/293/head -> origin/gh/janeyx99/293/head 2025-08-14T21:21:50.2312194Z * [new branch] gh/janeyx99/293/orig -> origin/gh/janeyx99/293/orig 2025-08-14T21:21:50.2313577Z * [new branch] gh/janeyx99/294/base -> origin/gh/janeyx99/294/base 2025-08-14T21:21:50.2313937Z * [new branch] gh/janeyx99/294/head -> origin/gh/janeyx99/294/head 2025-08-14T21:21:50.2314393Z * [new branch] gh/janeyx99/294/orig -> origin/gh/janeyx99/294/orig 2025-08-14T21:21:50.2315682Z * [new branch] gh/janeyx99/295/base -> origin/gh/janeyx99/295/base 2025-08-14T21:21:50.2316422Z * [new branch] gh/janeyx99/295/head -> origin/gh/janeyx99/295/head 2025-08-14T21:21:50.2317222Z * [new branch] gh/janeyx99/295/orig -> origin/gh/janeyx99/295/orig 2025-08-14T21:21:50.2318553Z * [new branch] gh/janeyx99/296/base -> origin/gh/janeyx99/296/base 2025-08-14T21:21:50.2318931Z * [new branch] gh/janeyx99/296/head -> origin/gh/janeyx99/296/head 2025-08-14T21:21:50.2319720Z * [new branch] gh/janeyx99/296/orig -> origin/gh/janeyx99/296/orig 2025-08-14T21:21:50.2320307Z * [new branch] gh/janeyx99/297/base -> origin/gh/janeyx99/297/base 2025-08-14T21:21:50.2321063Z * [new branch] gh/janeyx99/297/head -> origin/gh/janeyx99/297/head 2025-08-14T21:21:50.2322494Z * [new branch] gh/janeyx99/297/orig -> origin/gh/janeyx99/297/orig 2025-08-14T21:21:50.2322932Z * [new branch] gh/janeyx99/298/base -> origin/gh/janeyx99/298/base 2025-08-14T21:21:50.2323323Z * [new branch] gh/janeyx99/298/head -> origin/gh/janeyx99/298/head 2025-08-14T21:21:50.2325018Z * [new branch] gh/janeyx99/298/orig -> origin/gh/janeyx99/298/orig 2025-08-14T21:21:50.2325450Z * [new branch] gh/janeyx99/299/base -> origin/gh/janeyx99/299/base 2025-08-14T21:21:50.2326048Z * [new branch] gh/janeyx99/299/head -> origin/gh/janeyx99/299/head 2025-08-14T21:21:50.2326437Z * [new branch] gh/janeyx99/299/orig -> origin/gh/janeyx99/299/orig 2025-08-14T21:21:50.2328165Z * [new branch] gh/janeyx99/300/base -> origin/gh/janeyx99/300/base 2025-08-14T21:21:50.2329159Z * [new branch] gh/janeyx99/300/head -> origin/gh/janeyx99/300/head 2025-08-14T21:21:50.2329597Z * [new branch] gh/janeyx99/300/orig -> origin/gh/janeyx99/300/orig 2025-08-14T21:21:50.2330974Z * [new branch] gh/janeyx99/88/base -> origin/gh/janeyx99/88/base 2025-08-14T21:21:50.2331327Z * [new branch] gh/janeyx99/88/head -> origin/gh/janeyx99/88/head 2025-08-14T21:21:50.2337681Z * [new branch] gh/janeyx99/88/orig -> origin/gh/janeyx99/88/orig 2025-08-14T21:21:50.2340892Z * [new branch] gh/jansel/360/base -> origin/gh/jansel/360/base 2025-08-14T21:21:50.2341290Z * [new branch] gh/jansel/360/head -> origin/gh/jansel/360/head 2025-08-14T21:21:50.2341671Z * [new branch] gh/jansel/451/base -> origin/gh/jansel/451/base 2025-08-14T21:21:50.2341999Z * [new branch] gh/jansel/451/head -> origin/gh/jansel/451/head 2025-08-14T21:21:50.2342372Z * [new branch] gh/jansel/451/orig -> origin/gh/jansel/451/orig 2025-08-14T21:21:50.2342709Z * [new branch] gh/jansel/462/base -> origin/gh/jansel/462/base 2025-08-14T21:21:50.2343043Z * [new branch] gh/jansel/462/head -> origin/gh/jansel/462/head 
2025-08-14T21:21:50.2343388Z * [new branch] gh/jansel/462/orig -> origin/gh/jansel/462/orig 2025-08-14T21:21:50.2343729Z * [new branch] gh/jansel/531/base -> origin/gh/jansel/531/base 2025-08-14T21:21:50.2344069Z * [new branch] gh/jansel/531/head -> origin/gh/jansel/531/head 2025-08-14T21:21:50.2344401Z * [new branch] gh/jansel/531/orig -> origin/gh/jansel/531/orig 2025-08-14T21:21:50.2345796Z * [new branch] gh/jansel/534/base -> origin/gh/jansel/534/base 2025-08-14T21:21:50.2346156Z * [new branch] gh/jansel/534/head -> origin/gh/jansel/534/head 2025-08-14T21:21:50.2346488Z * [new branch] gh/jansel/534/orig -> origin/gh/jansel/534/orig 2025-08-14T21:21:50.2346847Z * [new branch] gh/jbschlosser/226/base -> origin/gh/jbschlosser/226/base 2025-08-14T21:21:50.2347219Z * [new branch] gh/jbschlosser/226/head -> origin/gh/jbschlosser/226/head 2025-08-14T21:21:50.2347579Z * [new branch] gh/jbschlosser/226/orig -> origin/gh/jbschlosser/226/orig 2025-08-14T21:21:50.2347947Z * [new branch] gh/jbschlosser/239/base -> origin/gh/jbschlosser/239/base 2025-08-14T21:21:50.2348324Z * [new branch] gh/jbschlosser/239/head -> origin/gh/jbschlosser/239/head 2025-08-14T21:21:50.2348884Z * [new branch] gh/jbschlosser/239/orig -> origin/gh/jbschlosser/239/orig 2025-08-14T21:21:50.2349256Z * [new branch] gh/jbschlosser/247/base -> origin/gh/jbschlosser/247/base 2025-08-14T21:21:50.2352070Z * [new branch] gh/jbschlosser/247/head -> origin/gh/jbschlosser/247/head 2025-08-14T21:21:50.2357908Z * [new branch] gh/jbschlosser/247/orig -> origin/gh/jbschlosser/247/orig 2025-08-14T21:21:50.2363043Z * [new branch] gh/jbschlosser/248/base -> origin/gh/jbschlosser/248/base 2025-08-14T21:21:50.2364220Z * [new branch] gh/jbschlosser/248/head -> origin/gh/jbschlosser/248/head 2025-08-14T21:21:50.2364623Z * [new branch] gh/jbschlosser/248/orig -> origin/gh/jbschlosser/248/orig 2025-08-14T21:21:50.2365008Z * [new branch] gh/jbschlosser/249/base -> origin/gh/jbschlosser/249/base 2025-08-14T21:21:50.2365582Z * [new branch] gh/jbschlosser/249/head -> origin/gh/jbschlosser/249/head 2025-08-14T21:21:50.2365980Z * [new branch] gh/jbschlosser/249/orig -> origin/gh/jbschlosser/249/orig 2025-08-14T21:21:50.2366360Z * [new branch] gh/jbschlosser/250/base -> origin/gh/jbschlosser/250/base 2025-08-14T21:21:50.2366742Z * [new branch] gh/jbschlosser/250/head -> origin/gh/jbschlosser/250/head 2025-08-14T21:21:50.2367129Z * [new branch] gh/jbschlosser/250/orig -> origin/gh/jbschlosser/250/orig 2025-08-14T21:21:50.2367511Z * [new branch] gh/jiayisunx/57/base -> origin/gh/jiayisunx/57/base 2025-08-14T21:21:50.2367878Z * [new branch] gh/jiayisunx/57/head -> origin/gh/jiayisunx/57/head 2025-08-14T21:21:50.2368236Z * [new branch] gh/jiayisunx/57/orig -> origin/gh/jiayisunx/57/orig 2025-08-14T21:21:50.2368590Z * [new branch] gh/jiayisunx/59/base -> origin/gh/jiayisunx/59/base 2025-08-14T21:21:50.2369331Z * [new branch] gh/jiayisunx/59/head -> origin/gh/jiayisunx/59/head 2025-08-14T21:21:50.2369698Z * [new branch] gh/jiayisunx/59/orig -> origin/gh/jiayisunx/59/orig 2025-08-14T21:21:50.2370117Z * [new branch] gh/jiayisunx/61/base -> origin/gh/jiayisunx/61/base 2025-08-14T21:21:50.2370483Z * [new branch] gh/jiayisunx/61/head -> origin/gh/jiayisunx/61/head 2025-08-14T21:21:50.2370842Z * [new branch] gh/jiayisunx/61/orig -> origin/gh/jiayisunx/61/orig 2025-08-14T21:21:50.2371190Z * [new branch] gh/jiayisunx/63/base -> origin/gh/jiayisunx/63/base 2025-08-14T21:21:50.2371556Z * [new branch] gh/jiayisunx/63/head -> origin/gh/jiayisunx/63/head 
2025-08-14T21:21:50.2371916Z * [new branch] gh/jiayisunx/63/orig -> origin/gh/jiayisunx/63/orig 2025-08-14T21:21:50.2372278Z * [new branch] gh/jiayisunx/64/base -> origin/gh/jiayisunx/64/base 2025-08-14T21:21:50.2372642Z * [new branch] gh/jiayisunx/64/head -> origin/gh/jiayisunx/64/head 2025-08-14T21:21:50.2372998Z * [new branch] gh/jiayisunx/64/orig -> origin/gh/jiayisunx/64/orig 2025-08-14T21:21:50.2373360Z * [new branch] gh/jiayisunx/65/base -> origin/gh/jiayisunx/65/base 2025-08-14T21:21:50.2373716Z * [new branch] gh/jiayisunx/65/head -> origin/gh/jiayisunx/65/head 2025-08-14T21:21:50.2379804Z * [new branch] gh/jiayisunx/65/orig -> origin/gh/jiayisunx/65/orig 2025-08-14T21:21:50.2380244Z * [new branch] gh/jiayisunx/66/base -> origin/gh/jiayisunx/66/base 2025-08-14T21:21:50.2380614Z * [new branch] gh/jiayisunx/66/head -> origin/gh/jiayisunx/66/head 2025-08-14T21:21:50.2380975Z * [new branch] gh/jiayisunx/66/orig -> origin/gh/jiayisunx/66/orig 2025-08-14T21:21:50.2381327Z * [new branch] gh/jiayisunx/67/base -> origin/gh/jiayisunx/67/base 2025-08-14T21:21:50.2381867Z * [new branch] gh/jiayisunx/67/head -> origin/gh/jiayisunx/67/head 2025-08-14T21:21:50.2382232Z * [new branch] gh/jiayisunx/67/orig -> origin/gh/jiayisunx/67/orig 2025-08-14T21:21:50.2382596Z * [new branch] gh/jiayisunx/68/base -> origin/gh/jiayisunx/68/base 2025-08-14T21:21:50.2382948Z * [new branch] gh/jiayisunx/68/head -> origin/gh/jiayisunx/68/head 2025-08-14T21:21:50.2383301Z * [new branch] gh/jiayisunx/68/orig -> origin/gh/jiayisunx/68/orig 2025-08-14T21:21:50.2383674Z * [new branch] gh/jjwu@meta.com/1/base -> origin/gh/jjwu@meta.com/1/base 2025-08-14T21:21:50.2384041Z * [new branch] gh/jjwu@meta.com/1/head -> origin/gh/jjwu@meta.com/1/head 2025-08-14T21:21:50.2384862Z * [new branch] gh/justinchuby/111/base -> origin/gh/justinchuby/111/base 2025-08-14T21:21:50.2385304Z * [new branch] gh/justinchuby/111/head -> origin/gh/justinchuby/111/head 2025-08-14T21:21:50.2385689Z * [new branch] gh/justinchuby/111/orig -> origin/gh/justinchuby/111/orig 2025-08-14T21:21:50.2386062Z * [new branch] gh/kurtamohler/32/base -> origin/gh/kurtamohler/32/base 2025-08-14T21:21:50.2386438Z * [new branch] gh/kurtamohler/32/head -> origin/gh/kurtamohler/32/head 2025-08-14T21:21:50.2386817Z * [new branch] gh/kurtamohler/32/orig -> origin/gh/kurtamohler/32/orig 2025-08-14T21:21:50.2387187Z * [new branch] gh/kurtamohler/33/base -> origin/gh/kurtamohler/33/base 2025-08-14T21:21:50.2387559Z * [new branch] gh/kurtamohler/33/head -> origin/gh/kurtamohler/33/head 2025-08-14T21:21:50.2387945Z * [new branch] gh/kurtamohler/33/orig -> origin/gh/kurtamohler/33/orig 2025-08-14T21:21:50.2388320Z * [new branch] gh/kurtamohler/34/base -> origin/gh/kurtamohler/34/base 2025-08-14T21:21:50.2388706Z * [new branch] gh/kurtamohler/34/head -> origin/gh/kurtamohler/34/head 2025-08-14T21:21:50.2389073Z * [new branch] gh/kurtamohler/34/orig -> origin/gh/kurtamohler/34/orig 2025-08-14T21:21:50.2389751Z * [new branch] gh/kurtamohler/40/base -> origin/gh/kurtamohler/40/base 2025-08-14T21:21:50.2390342Z * [new branch] gh/kurtamohler/40/head -> origin/gh/kurtamohler/40/head 2025-08-14T21:21:50.2391083Z * [new branch] gh/kurtamohler/40/orig -> origin/gh/kurtamohler/40/orig 2025-08-14T21:21:50.2394170Z * [new branch] gh/kurtamohler/41/base -> origin/gh/kurtamohler/41/base 2025-08-14T21:21:50.2394647Z * [new branch] gh/kurtamohler/41/head -> origin/gh/kurtamohler/41/head 2025-08-14T21:21:50.2395016Z * [new branch] gh/kurtamohler/41/orig -> origin/gh/kurtamohler/41/orig 
2025-08-14T21:21:50.2395394Z * [new branch] gh/kurtamohler/42/base -> origin/gh/kurtamohler/42/base 2025-08-14T21:21:50.2395766Z * [new branch] gh/kurtamohler/42/head -> origin/gh/kurtamohler/42/head 2025-08-14T21:21:50.2396127Z * [new branch] gh/kurtamohler/42/orig -> origin/gh/kurtamohler/42/orig 2025-08-14T21:21:50.2396502Z * [new branch] gh/kurtamohler/43/base -> origin/gh/kurtamohler/43/base 2025-08-14T21:21:50.2396874Z * [new branch] gh/kurtamohler/43/head -> origin/gh/kurtamohler/43/head 2025-08-14T21:21:50.2397712Z * [new branch] gh/kurtamohler/43/orig -> origin/gh/kurtamohler/43/orig 2025-08-14T21:21:50.2399386Z * [new branch] gh/kurtamohler/44/base -> origin/gh/kurtamohler/44/base 2025-08-14T21:21:50.2399758Z * [new branch] gh/kurtamohler/44/head -> origin/gh/kurtamohler/44/head 2025-08-14T21:21:50.2401412Z * [new branch] gh/kurtamohler/44/orig -> origin/gh/kurtamohler/44/orig 2025-08-14T21:21:50.2402088Z * [new branch] gh/kurtamohler/45/base -> origin/gh/kurtamohler/45/base 2025-08-14T21:21:50.2402459Z * [new branch] gh/kurtamohler/45/head -> origin/gh/kurtamohler/45/head 2025-08-14T21:21:50.2403043Z * [new branch] gh/kurtamohler/45/orig -> origin/gh/kurtamohler/45/orig 2025-08-14T21:21:50.2403855Z * [new branch] gh/kurtamohler/46/base -> origin/gh/kurtamohler/46/base 2025-08-14T21:21:50.2404241Z * [new branch] gh/kurtamohler/46/head -> origin/gh/kurtamohler/46/head 2025-08-14T21:21:50.2404980Z * [new branch] gh/kurtamohler/46/orig -> origin/gh/kurtamohler/46/orig 2025-08-14T21:21:50.2406485Z * [new branch] gh/kwen2501/130/base -> origin/gh/kwen2501/130/base 2025-08-14T21:21:50.2407166Z * [new branch] gh/kwen2501/130/head -> origin/gh/kwen2501/130/head 2025-08-14T21:21:50.2407914Z * [new branch] gh/kwen2501/130/orig -> origin/gh/kwen2501/130/orig 2025-08-14T21:21:50.2409744Z * [new branch] gh/kwen2501/142/base -> origin/gh/kwen2501/142/base 2025-08-14T21:21:50.2410337Z * [new branch] gh/kwen2501/142/head -> origin/gh/kwen2501/142/head 2025-08-14T21:21:50.2410868Z * [new branch] gh/kwen2501/142/orig -> origin/gh/kwen2501/142/orig 2025-08-14T21:21:50.2411351Z * [new branch] gh/kwen2501/15/base -> origin/gh/kwen2501/15/base 2025-08-14T21:21:50.2411887Z * [new branch] gh/kwen2501/15/head -> origin/gh/kwen2501/15/head 2025-08-14T21:21:50.2413158Z * [new branch] gh/kwen2501/156/base -> origin/gh/kwen2501/156/base 2025-08-14T21:21:50.2413544Z * [new branch] gh/kwen2501/156/head -> origin/gh/kwen2501/156/head 2025-08-14T21:21:50.2414185Z * [new branch] gh/kwen2501/156/orig -> origin/gh/kwen2501/156/orig 2025-08-14T21:21:50.2416066Z * [new branch] gh/kwen2501/170/base -> origin/gh/kwen2501/170/base 2025-08-14T21:21:50.2421493Z * [new branch] gh/kwen2501/170/head -> origin/gh/kwen2501/170/head 2025-08-14T21:21:50.2427210Z * [new branch] gh/kwen2501/179/base -> origin/gh/kwen2501/179/base 2025-08-14T21:21:50.2432434Z * [new branch] gh/kwen2501/179/head -> origin/gh/kwen2501/179/head 2025-08-14T21:21:50.2437065Z * [new branch] gh/kwen2501/179/orig -> origin/gh/kwen2501/179/orig 2025-08-14T21:21:50.2437887Z * [new branch] gh/kwen2501/181/base -> origin/gh/kwen2501/181/base 2025-08-14T21:21:50.2438237Z * [new branch] gh/kwen2501/181/head -> origin/gh/kwen2501/181/head 2025-08-14T21:21:50.2438575Z * [new branch] gh/kwen2501/181/orig -> origin/gh/kwen2501/181/orig 2025-08-14T21:21:50.2438910Z * [new branch] gh/kwen2501/183/base -> origin/gh/kwen2501/183/base 2025-08-14T21:21:50.2439258Z * [new branch] gh/kwen2501/183/head -> origin/gh/kwen2501/183/head 2025-08-14T21:21:50.2439600Z * [new 
branch] gh/kwen2501/183/orig -> origin/gh/kwen2501/183/orig 2025-08-14T21:21:50.2439931Z * [new branch] gh/kwen2501/184/base -> origin/gh/kwen2501/184/base 2025-08-14T21:21:50.2440262Z * [new branch] gh/kwen2501/184/head -> origin/gh/kwen2501/184/head 2025-08-14T21:21:50.2440587Z * [new branch] gh/kwen2501/184/orig -> origin/gh/kwen2501/184/orig 2025-08-14T21:21:50.2440917Z * [new branch] gh/kwen2501/186/base -> origin/gh/kwen2501/186/base 2025-08-14T21:21:50.2441244Z * [new branch] gh/kwen2501/186/head -> origin/gh/kwen2501/186/head 2025-08-14T21:21:50.2441572Z * [new branch] gh/kwen2501/186/orig -> origin/gh/kwen2501/186/orig 2025-08-14T21:21:50.2478645Z * [new branch] gh/kwen2501/187/base -> origin/gh/kwen2501/187/base 2025-08-14T21:21:50.2479378Z * [new branch] gh/kwen2501/187/head -> origin/gh/kwen2501/187/head 2025-08-14T21:21:50.2479751Z * [new branch] gh/kwen2501/187/orig -> origin/gh/kwen2501/187/orig 2025-08-14T21:21:50.2480112Z * [new branch] gh/kwen2501/188/base -> origin/gh/kwen2501/188/base 2025-08-14T21:21:50.2480441Z * [new branch] gh/kwen2501/188/head -> origin/gh/kwen2501/188/head 2025-08-14T21:21:50.2480773Z * [new branch] gh/kwen2501/188/orig -> origin/gh/kwen2501/188/orig 2025-08-14T21:21:50.2481109Z * [new branch] gh/kwen2501/194/base -> origin/gh/kwen2501/194/base 2025-08-14T21:21:50.2481444Z * [new branch] gh/kwen2501/194/head -> origin/gh/kwen2501/194/head 2025-08-14T21:21:50.2481791Z * [new branch] gh/kwen2501/194/orig -> origin/gh/kwen2501/194/orig 2025-08-14T21:21:50.2482131Z * [new branch] gh/kwen2501/195/base -> origin/gh/kwen2501/195/base 2025-08-14T21:21:50.2482540Z * [new branch] gh/kwen2501/195/head -> origin/gh/kwen2501/195/head 2025-08-14T21:21:50.2482878Z * [new branch] gh/kwen2501/195/orig -> origin/gh/kwen2501/195/orig 2025-08-14T21:21:50.2483222Z * [new branch] gh/kwen2501/196/base -> origin/gh/kwen2501/196/base 2025-08-14T21:21:50.2483566Z * [new branch] gh/kwen2501/196/head -> origin/gh/kwen2501/196/head 2025-08-14T21:21:50.2483913Z * [new branch] gh/kwen2501/196/orig -> origin/gh/kwen2501/196/orig 2025-08-14T21:21:50.2484252Z * [new branch] gh/kwen2501/197/base -> origin/gh/kwen2501/197/base 2025-08-14T21:21:50.2484599Z * [new branch] gh/kwen2501/197/head -> origin/gh/kwen2501/197/head 2025-08-14T21:21:50.2484946Z * [new branch] gh/kwen2501/197/orig -> origin/gh/kwen2501/197/orig 2025-08-14T21:21:50.2485299Z * [new branch] gh/kwen2501/198/base -> origin/gh/kwen2501/198/base 2025-08-14T21:21:50.2485637Z * [new branch] gh/kwen2501/198/head -> origin/gh/kwen2501/198/head 2025-08-14T21:21:50.2485986Z * [new branch] gh/kwen2501/198/orig -> origin/gh/kwen2501/198/orig 2025-08-14T21:21:50.2486332Z * [new branch] gh/kwen2501/199/base -> origin/gh/kwen2501/199/base 2025-08-14T21:21:50.2486667Z * [new branch] gh/kwen2501/199/head -> origin/gh/kwen2501/199/head 2025-08-14T21:21:50.2487008Z * [new branch] gh/kwen2501/199/orig -> origin/gh/kwen2501/199/orig 2025-08-14T21:21:50.2487366Z * [new branch] gh/kwen2501/200/base -> origin/gh/kwen2501/200/base 2025-08-14T21:21:50.2487706Z * [new branch] gh/kwen2501/200/head -> origin/gh/kwen2501/200/head 2025-08-14T21:21:50.2488043Z * [new branch] gh/kwen2501/200/orig -> origin/gh/kwen2501/200/orig 2025-08-14T21:21:50.2488390Z * [new branch] gh/kwen2501/201/base -> origin/gh/kwen2501/201/base 2025-08-14T21:21:50.2488832Z * [new branch] gh/kwen2501/201/head -> origin/gh/kwen2501/201/head 2025-08-14T21:21:50.2489182Z * [new branch] gh/kwen2501/201/orig -> origin/gh/kwen2501/201/orig 2025-08-14T21:21:50.2489521Z * [new 
branch] gh/kwen2501/202/base -> origin/gh/kwen2501/202/base 2025-08-14T21:21:50.2489863Z * [new branch] gh/kwen2501/202/head -> origin/gh/kwen2501/202/head 2025-08-14T21:21:50.2490208Z * [new branch] gh/kwen2501/202/orig -> origin/gh/kwen2501/202/orig 2025-08-14T21:21:50.2490542Z * [new branch] gh/kwen2501/203/base -> origin/gh/kwen2501/203/base 2025-08-14T21:21:50.2490880Z * [new branch] gh/kwen2501/203/head -> origin/gh/kwen2501/203/head 2025-08-14T21:21:50.2491220Z * [new branch] gh/kwen2501/203/orig -> origin/gh/kwen2501/203/orig 2025-08-14T21:21:50.2491649Z * [new branch] gh/laithsakka/152/base -> origin/gh/laithsakka/152/base 2025-08-14T21:21:50.2492025Z * [new branch] gh/laithsakka/152/head -> origin/gh/laithsakka/152/head 2025-08-14T21:21:50.2492399Z * [new branch] gh/laithsakka/152/orig -> origin/gh/laithsakka/152/orig 2025-08-14T21:21:50.2492772Z * [new branch] gh/laithsakka/156/base -> origin/gh/laithsakka/156/base 2025-08-14T21:21:50.2493145Z * [new branch] gh/laithsakka/156/head -> origin/gh/laithsakka/156/head 2025-08-14T21:21:50.2493506Z * [new branch] gh/laithsakka/156/orig -> origin/gh/laithsakka/156/orig 2025-08-14T21:21:50.2493877Z * [new branch] gh/laithsakka/159/base -> origin/gh/laithsakka/159/base 2025-08-14T21:21:50.2494248Z * [new branch] gh/laithsakka/159/head -> origin/gh/laithsakka/159/head 2025-08-14T21:21:50.2494618Z * [new branch] gh/laithsakka/159/orig -> origin/gh/laithsakka/159/orig 2025-08-14T21:21:50.2495048Z * [new branch] gh/laithsakka/160/base -> origin/gh/laithsakka/160/base 2025-08-14T21:21:50.2495413Z * [new branch] gh/laithsakka/160/head -> origin/gh/laithsakka/160/head 2025-08-14T21:21:50.2495770Z * [new branch] gh/laithsakka/160/orig -> origin/gh/laithsakka/160/orig 2025-08-14T21:21:50.2496158Z * [new branch] gh/laithsakka/178/base -> origin/gh/laithsakka/178/base 2025-08-14T21:21:50.2496523Z * [new branch] gh/laithsakka/178/head -> origin/gh/laithsakka/178/head 2025-08-14T21:21:50.2496898Z * [new branch] gh/laithsakka/178/orig -> origin/gh/laithsakka/178/orig 2025-08-14T21:21:50.2497258Z * [new branch] gh/laithsakka/191/base -> origin/gh/laithsakka/191/base 2025-08-14T21:21:50.2497621Z * [new branch] gh/laithsakka/191/head -> origin/gh/laithsakka/191/head 2025-08-14T21:21:50.2497987Z * [new branch] gh/laithsakka/191/orig -> origin/gh/laithsakka/191/orig 2025-08-14T21:21:50.2498354Z * [new branch] gh/laithsakka/234/base -> origin/gh/laithsakka/234/base 2025-08-14T21:21:50.2498709Z * [new branch] gh/laithsakka/234/head -> origin/gh/laithsakka/234/head 2025-08-14T21:21:50.2499070Z * [new branch] gh/laithsakka/234/orig -> origin/gh/laithsakka/234/orig 2025-08-14T21:21:50.2499430Z * [new branch] gh/laithsakka/237/base -> origin/gh/laithsakka/237/base 2025-08-14T21:21:50.2499782Z * [new branch] gh/laithsakka/237/head -> origin/gh/laithsakka/237/head 2025-08-14T21:21:50.2500140Z * [new branch] gh/laithsakka/237/orig -> origin/gh/laithsakka/237/orig 2025-08-14T21:21:50.2500501Z * [new branch] gh/laithsakka/238/base -> origin/gh/laithsakka/238/base 2025-08-14T21:21:50.2500869Z * [new branch] gh/laithsakka/238/head -> origin/gh/laithsakka/238/head 2025-08-14T21:21:50.2501220Z * [new branch] gh/laithsakka/238/orig -> origin/gh/laithsakka/238/orig 2025-08-14T21:21:50.2501570Z * [new branch] gh/laithsakka/239/base -> origin/gh/laithsakka/239/base 2025-08-14T21:21:50.2501920Z * [new branch] gh/laithsakka/239/head -> origin/gh/laithsakka/239/head 2025-08-14T21:21:50.2502271Z * [new branch] gh/laithsakka/239/orig -> origin/gh/laithsakka/239/orig 
2025-08-14T21:21:50.2502783Z * [new branch] gh/laithsakka/240/base -> origin/gh/laithsakka/240/base 2025-08-14T21:21:50.2503152Z * [new branch] gh/laithsakka/240/head -> origin/gh/laithsakka/240/head 2025-08-14T21:21:50.2503508Z * [new branch] gh/laithsakka/240/orig -> origin/gh/laithsakka/240/orig 2025-08-14T21:21:50.2503862Z * [new branch] gh/laithsakka/242/base -> origin/gh/laithsakka/242/base 2025-08-14T21:21:50.2504224Z * [new branch] gh/laithsakka/242/head -> origin/gh/laithsakka/242/head 2025-08-14T21:21:50.2504669Z * [new branch] gh/laithsakka/242/orig -> origin/gh/laithsakka/242/orig 2025-08-14T21:21:50.2505034Z * [new branch] gh/laithsakka/243/base -> origin/gh/laithsakka/243/base 2025-08-14T21:21:50.2505384Z * [new branch] gh/laithsakka/243/head -> origin/gh/laithsakka/243/head 2025-08-14T21:21:50.2505742Z * [new branch] gh/laithsakka/243/orig -> origin/gh/laithsakka/243/orig 2025-08-14T21:21:50.2506155Z * [new branch] gh/laithsakka/244/base -> origin/gh/laithsakka/244/base 2025-08-14T21:21:50.2506543Z * [new branch] gh/laithsakka/244/head -> origin/gh/laithsakka/244/head 2025-08-14T21:21:50.2506953Z * [new branch] gh/laithsakka/244/orig -> origin/gh/laithsakka/244/orig 2025-08-14T21:21:50.2507367Z * [new branch] gh/laithsakka/245/base -> origin/gh/laithsakka/245/base 2025-08-14T21:21:50.2507815Z * [new branch] gh/laithsakka/245/head -> origin/gh/laithsakka/245/head 2025-08-14T21:21:50.2508191Z * [new branch] gh/laithsakka/245/orig -> origin/gh/laithsakka/245/orig 2025-08-14T21:21:50.2508599Z * [new branch] gh/laithsakka/246/base -> origin/gh/laithsakka/246/base 2025-08-14T21:21:50.2509022Z * [new branch] gh/laithsakka/246/head -> origin/gh/laithsakka/246/head 2025-08-14T21:21:50.2509437Z * [new branch] gh/laithsakka/246/orig -> origin/gh/laithsakka/246/orig 2025-08-14T21:21:50.2509850Z * [new branch] gh/laithsakka/247/base -> origin/gh/laithsakka/247/base 2025-08-14T21:21:50.2510272Z * [new branch] gh/laithsakka/247/head -> origin/gh/laithsakka/247/head 2025-08-14T21:21:50.2510664Z * [new branch] gh/laithsakka/247/orig -> origin/gh/laithsakka/247/orig 2025-08-14T21:21:50.2511085Z * [new branch] gh/laithsakka/248/base -> origin/gh/laithsakka/248/base 2025-08-14T21:21:50.2511477Z * [new branch] gh/laithsakka/248/head -> origin/gh/laithsakka/248/head 2025-08-14T21:21:50.2511876Z * [new branch] gh/laithsakka/248/orig -> origin/gh/laithsakka/248/orig 2025-08-14T21:21:50.2512267Z * [new branch] gh/laithsakka/249/base -> origin/gh/laithsakka/249/base 2025-08-14T21:21:50.2512650Z * [new branch] gh/laithsakka/249/head -> origin/gh/laithsakka/249/head 2025-08-14T21:21:50.2513014Z * [new branch] gh/laithsakka/249/orig -> origin/gh/laithsakka/249/orig 2025-08-14T21:21:50.2513369Z * [new branch] gh/laithsakka/250/base -> origin/gh/laithsakka/250/base 2025-08-14T21:21:50.2513758Z * [new branch] gh/laithsakka/250/head -> origin/gh/laithsakka/250/head 2025-08-14T21:21:50.2514123Z * [new branch] gh/laithsakka/250/orig -> origin/gh/laithsakka/250/orig 2025-08-14T21:21:50.2514485Z * [new branch] gh/laithsakka/251/base -> origin/gh/laithsakka/251/base 2025-08-14T21:21:50.2514844Z * [new branch] gh/laithsakka/251/head -> origin/gh/laithsakka/251/head 2025-08-14T21:21:50.2515202Z * [new branch] gh/laithsakka/251/orig -> origin/gh/laithsakka/251/orig 2025-08-14T21:21:50.2515560Z * [new branch] gh/laithsakka/252/base -> origin/gh/laithsakka/252/base 2025-08-14T21:21:50.2515918Z * [new branch] gh/laithsakka/252/head -> origin/gh/laithsakka/252/head 2025-08-14T21:21:50.2516290Z * [new branch] 
gh/laithsakka/252/orig -> origin/gh/laithsakka/252/orig 2025-08-14T21:21:50.2516652Z * [new branch] gh/laithsakka/253/base -> origin/gh/laithsakka/253/base 2025-08-14T21:21:50.2517016Z * [new branch] gh/laithsakka/253/head -> origin/gh/laithsakka/253/head 2025-08-14T21:21:50.2517370Z * [new branch] gh/laithsakka/253/orig -> origin/gh/laithsakka/253/orig 2025-08-14T21:21:50.2517777Z * [new branch] gh/laithsakka/254/base -> origin/gh/laithsakka/254/base 2025-08-14T21:21:50.2518139Z * [new branch] gh/laithsakka/254/head -> origin/gh/laithsakka/254/head 2025-08-14T21:21:50.2518503Z * [new branch] gh/laithsakka/254/orig -> origin/gh/laithsakka/254/orig 2025-08-14T21:21:50.2518877Z * [new branch] gh/laithsakka/255/base -> origin/gh/laithsakka/255/base 2025-08-14T21:21:50.2519250Z * [new branch] gh/laithsakka/255/head -> origin/gh/laithsakka/255/head 2025-08-14T21:21:50.2519622Z * [new branch] gh/laithsakka/255/orig -> origin/gh/laithsakka/255/orig 2025-08-14T21:21:50.2519984Z * [new branch] gh/laithsakka/256/base -> origin/gh/laithsakka/256/base 2025-08-14T21:21:50.2520605Z * [new branch] gh/laithsakka/256/head -> origin/gh/laithsakka/256/head 2025-08-14T21:21:50.2521162Z * [new branch] gh/laithsakka/256/orig -> origin/gh/laithsakka/256/orig 2025-08-14T21:21:50.2521670Z * [new branch] gh/laithsakka/257/base -> origin/gh/laithsakka/257/base 2025-08-14T21:21:50.2522163Z * [new branch] gh/laithsakka/257/head -> origin/gh/laithsakka/257/head 2025-08-14T21:21:50.2522654Z * [new branch] gh/laithsakka/257/orig -> origin/gh/laithsakka/257/orig 2025-08-14T21:21:50.2523922Z * [new branch] gh/laithsakka/258/base -> origin/gh/laithsakka/258/base 2025-08-14T21:21:50.2524419Z * [new branch] gh/laithsakka/258/head -> origin/gh/laithsakka/258/head 2025-08-14T21:21:50.2524872Z * [new branch] gh/laithsakka/258/orig -> origin/gh/laithsakka/258/orig 2025-08-14T21:21:50.2527035Z * [new branch] gh/laithsakka/259/base -> origin/gh/laithsakka/259/base 2025-08-14T21:21:50.2527516Z * [new branch] gh/laithsakka/259/head -> origin/gh/laithsakka/259/head 2025-08-14T21:21:50.2527949Z * [new branch] gh/laithsakka/259/orig -> origin/gh/laithsakka/259/orig 2025-08-14T21:21:50.2528991Z * [new branch] gh/laithsakka/260/base -> origin/gh/laithsakka/260/base 2025-08-14T21:21:50.2533786Z * [new branch] gh/laithsakka/260/head -> origin/gh/laithsakka/260/head 2025-08-14T21:21:50.2534649Z * [new branch] gh/laithsakka/260/orig -> origin/gh/laithsakka/260/orig 2025-08-14T21:21:50.2535094Z * [new branch] gh/laithsakka/261/base -> origin/gh/laithsakka/261/base 2025-08-14T21:21:50.2535476Z * [new branch] gh/laithsakka/261/head -> origin/gh/laithsakka/261/head 2025-08-14T21:21:50.2535885Z * [new branch] gh/laithsakka/261/orig -> origin/gh/laithsakka/261/orig 2025-08-14T21:21:50.2536308Z * [new branch] gh/laithsakka/262/base -> origin/gh/laithsakka/262/base 2025-08-14T21:21:50.2537544Z * [new branch] gh/laithsakka/262/head -> origin/gh/laithsakka/262/head 2025-08-14T21:21:50.2537954Z * [new branch] gh/laithsakka/262/orig -> origin/gh/laithsakka/262/orig 2025-08-14T21:21:50.2538368Z * [new branch] gh/laithsakka/28/base -> origin/gh/laithsakka/28/base 2025-08-14T21:21:50.2538742Z * [new branch] gh/laithsakka/29/base -> origin/gh/laithsakka/29/base 2025-08-14T21:21:50.2539114Z * [new branch] gh/laithsakka/30/base -> origin/gh/laithsakka/30/base 2025-08-14T21:21:50.2539481Z * [new branch] gh/laithsakka/30/head -> origin/gh/laithsakka/30/head 2025-08-14T21:21:50.2539996Z * [new branch] gh/laithsakka/31/base -> origin/gh/laithsakka/31/base 
2025-08-14T21:21:50.2540969Z * [new branch] gh/laithsakka/31/head -> origin/gh/laithsakka/31/head 2025-08-14T21:21:50.2541381Z * [new branch] gh/laithsakka/32/base -> origin/gh/laithsakka/32/base 2025-08-14T21:21:50.2541765Z * [new branch] gh/laithsakka/32/head -> origin/gh/laithsakka/32/head 2025-08-14T21:21:50.2545944Z * [new branch] gh/lucaskabela/1/base -> origin/gh/lucaskabela/1/base 2025-08-14T21:21:50.2546398Z * [new branch] gh/lucaskabela/1/head -> origin/gh/lucaskabela/1/head 2025-08-14T21:21:50.2546781Z * [new branch] gh/lucaskabela/10/base -> origin/gh/lucaskabela/10/base 2025-08-14T21:21:50.2550941Z * [new branch] gh/lucaskabela/10/head -> origin/gh/lucaskabela/10/head 2025-08-14T21:21:50.2551359Z * [new branch] gh/lucaskabela/10/orig -> origin/gh/lucaskabela/10/orig 2025-08-14T21:21:50.2551732Z * [new branch] gh/lucaskabela/11/base -> origin/gh/lucaskabela/11/base 2025-08-14T21:21:50.2552110Z * [new branch] gh/lucaskabela/11/head -> origin/gh/lucaskabela/11/head 2025-08-14T21:21:50.2552481Z * [new branch] gh/lucaskabela/11/orig -> origin/gh/lucaskabela/11/orig 2025-08-14T21:21:50.2554161Z * [new branch] gh/lucaskabela/12/base -> origin/gh/lucaskabela/12/base 2025-08-14T21:21:50.2554592Z * [new branch] gh/lucaskabela/12/head -> origin/gh/lucaskabela/12/head 2025-08-14T21:21:50.2554967Z * [new branch] gh/lucaskabela/12/orig -> origin/gh/lucaskabela/12/orig 2025-08-14T21:21:50.2555321Z * [new branch] gh/lucaskabela/13/base -> origin/gh/lucaskabela/13/base 2025-08-14T21:21:50.2555685Z * [new branch] gh/lucaskabela/13/head -> origin/gh/lucaskabela/13/head 2025-08-14T21:21:50.2556054Z * [new branch] gh/lucaskabela/13/orig -> origin/gh/lucaskabela/13/orig 2025-08-14T21:21:50.2556421Z * [new branch] gh/lucaskabela/14/base -> origin/gh/lucaskabela/14/base 2025-08-14T21:21:50.2558181Z * [new branch] gh/lucaskabela/14/head -> origin/gh/lucaskabela/14/head 2025-08-14T21:21:50.2558576Z * [new branch] gh/lucaskabela/14/orig -> origin/gh/lucaskabela/14/orig 2025-08-14T21:21:50.2558948Z * [new branch] gh/lucaskabela/15/base -> origin/gh/lucaskabela/15/base 2025-08-14T21:21:50.2559316Z * [new branch] gh/lucaskabela/15/head -> origin/gh/lucaskabela/15/head 2025-08-14T21:21:50.2559678Z * [new branch] gh/lucaskabela/15/orig -> origin/gh/lucaskabela/15/orig 2025-08-14T21:21:50.2560051Z * [new branch] gh/lucaskabela/16/base -> origin/gh/lucaskabela/16/base 2025-08-14T21:21:50.2560428Z * [new branch] gh/lucaskabela/16/head -> origin/gh/lucaskabela/16/head 2025-08-14T21:21:50.2561115Z * [new branch] gh/lucaskabela/16/orig -> origin/gh/lucaskabela/16/orig 2025-08-14T21:21:50.2561494Z * [new branch] gh/lucaskabela/17/base -> origin/gh/lucaskabela/17/base 2025-08-14T21:21:50.2561855Z * [new branch] gh/lucaskabela/17/head -> origin/gh/lucaskabela/17/head 2025-08-14T21:21:50.2562271Z * [new branch] gh/lucaskabela/17/orig -> origin/gh/lucaskabela/17/orig 2025-08-14T21:21:50.2562650Z * [new branch] gh/lucaskabela/2/base -> origin/gh/lucaskabela/2/base 2025-08-14T21:21:50.2563001Z * [new branch] gh/lucaskabela/2/head -> origin/gh/lucaskabela/2/head 2025-08-14T21:21:50.2563541Z * [new branch] gh/lucaskabela/2/orig -> origin/gh/lucaskabela/2/orig 2025-08-14T21:21:50.2564810Z * [new branch] gh/lucaskabela/3/base -> origin/gh/lucaskabela/3/base 2025-08-14T21:21:50.2565194Z * [new branch] gh/lucaskabela/3/head -> origin/gh/lucaskabela/3/head 2025-08-14T21:21:50.2569414Z * [new branch] gh/lucaskabela/3/orig -> origin/gh/lucaskabela/3/orig 2025-08-14T21:21:50.2570318Z * [new branch] gh/lucaskabela/4/base -> 
origin/gh/lucaskabela/4/base 2025-08-14T21:21:50.2575740Z * [new branch] gh/lucaskabela/4/head -> origin/gh/lucaskabela/4/head 2025-08-14T21:21:50.2576382Z * [new branch] gh/lucaskabela/4/orig -> origin/gh/lucaskabela/4/orig 2025-08-14T21:21:50.2577003Z * [new branch] gh/lucaskabela/5/base -> origin/gh/lucaskabela/5/base 2025-08-14T21:21:50.2577386Z * [new branch] gh/lucaskabela/5/head -> origin/gh/lucaskabela/5/head 2025-08-14T21:21:50.2577770Z * [new branch] gh/lucaskabela/5/orig -> origin/gh/lucaskabela/5/orig 2025-08-14T21:21:50.2578128Z * [new branch] gh/lucaskabela/6/base -> origin/gh/lucaskabela/6/base 2025-08-14T21:21:50.2578536Z * [new branch] gh/lucaskabela/6/head -> origin/gh/lucaskabela/6/head 2025-08-14T21:21:50.2578913Z * [new branch] gh/lucaskabela/6/orig -> origin/gh/lucaskabela/6/orig 2025-08-14T21:21:50.2579274Z * [new branch] gh/lucaskabela/7/base -> origin/gh/lucaskabela/7/base 2025-08-14T21:21:50.2580138Z * [new branch] gh/lucaskabela/7/head -> origin/gh/lucaskabela/7/head 2025-08-14T21:21:50.2580507Z * [new branch] gh/lucaskabela/7/orig -> origin/gh/lucaskabela/7/orig 2025-08-14T21:21:50.2580878Z * [new branch] gh/lucaskabela/8/base -> origin/gh/lucaskabela/8/base 2025-08-14T21:21:50.2581266Z * [new branch] gh/lucaskabela/8/head -> origin/gh/lucaskabela/8/head 2025-08-14T21:21:50.2581619Z * [new branch] gh/lucaskabela/8/orig -> origin/gh/lucaskabela/8/orig 2025-08-14T21:21:50.2581981Z * [new branch] gh/lucaskabela/9/base -> origin/gh/lucaskabela/9/base 2025-08-14T21:21:50.2582586Z * [new branch] gh/lucaskabela/9/head -> origin/gh/lucaskabela/9/head 2025-08-14T21:21:50.2582967Z * [new branch] gh/lucaskabela/9/orig -> origin/gh/lucaskabela/9/orig 2025-08-14T21:21:50.2583314Z * [new branch] gh/lw/1/base -> origin/gh/lw/1/base 2025-08-14T21:21:50.2583646Z * [new branch] gh/lw/1/head -> origin/gh/lw/1/head 2025-08-14T21:21:50.2583955Z * [new branch] gh/lw/1/orig -> origin/gh/lw/1/orig 2025-08-14T21:21:50.2589091Z * [new branch] gh/lw/2/base -> origin/gh/lw/2/base 2025-08-14T21:21:50.2589504Z * [new branch] gh/lw/2/head -> origin/gh/lw/2/head 2025-08-14T21:21:50.2589840Z * [new branch] gh/lw/2/orig -> origin/gh/lw/2/orig 2025-08-14T21:21:50.2590159Z * [new branch] gh/lw/3/base -> origin/gh/lw/3/base 2025-08-14T21:21:50.2590460Z * [new branch] gh/lw/3/head -> origin/gh/lw/3/head 2025-08-14T21:21:50.2590769Z * [new branch] gh/lw/3/orig -> origin/gh/lw/3/orig 2025-08-14T21:21:50.2591275Z * [new branch] gh/malfet/14/base -> origin/gh/malfet/14/base 2025-08-14T21:21:50.2591768Z * [new branch] gh/malfet/330/base -> origin/gh/malfet/330/base 2025-08-14T21:21:50.2592304Z * [new branch] gh/malfet/330/head -> origin/gh/malfet/330/head 2025-08-14T21:21:50.2592667Z * [new branch] gh/malfet/330/orig -> origin/gh/malfet/330/orig 2025-08-14T21:21:50.2593055Z * [new branch] gh/malfet/396/base -> origin/gh/malfet/396/base 2025-08-14T21:21:50.2597220Z * [new branch] gh/malfet/396/head -> origin/gh/malfet/396/head 2025-08-14T21:21:50.2597590Z * [new branch] gh/malfet/396/orig -> origin/gh/malfet/396/orig 2025-08-14T21:21:50.2597924Z * [new branch] gh/malfet/397/base -> origin/gh/malfet/397/base 2025-08-14T21:21:50.2598267Z * [new branch] gh/malfet/397/head -> origin/gh/malfet/397/head 2025-08-14T21:21:50.2598640Z * [new branch] gh/malfet/397/orig -> origin/gh/malfet/397/orig 2025-08-14T21:21:50.2599003Z * [new branch] gh/malfet/398/base -> origin/gh/malfet/398/base 2025-08-14T21:21:50.2599485Z * [new branch] gh/malfet/398/head -> origin/gh/malfet/398/head 2025-08-14T21:21:50.2599864Z * [new 
branch] gh/malfet/398/orig -> origin/gh/malfet/398/orig 2025-08-14T21:21:50.2602356Z * [new branch] gh/malfet/399/base -> origin/gh/malfet/399/base 2025-08-14T21:21:50.2602946Z * [new branch] gh/malfet/399/head -> origin/gh/malfet/399/head 2025-08-14T21:21:50.2603310Z * [new branch] gh/malfet/399/orig -> origin/gh/malfet/399/orig 2025-08-14T21:21:50.2603656Z * [new branch] gh/malfet/414/base -> origin/gh/malfet/414/base 2025-08-14T21:21:50.2603999Z * [new branch] gh/malfet/414/head -> origin/gh/malfet/414/head 2025-08-14T21:21:50.2604330Z * [new branch] gh/malfet/414/orig -> origin/gh/malfet/414/orig 2025-08-14T21:21:50.2604956Z * [new branch] gh/malfet/417/base -> origin/gh/malfet/417/base 2025-08-14T21:21:50.2605306Z * [new branch] gh/malfet/417/head -> origin/gh/malfet/417/head 2025-08-14T21:21:50.2605638Z * [new branch] gh/malfet/417/orig -> origin/gh/malfet/417/orig 2025-08-14T21:21:50.2605962Z * [new branch] gh/malfet/418/base -> origin/gh/malfet/418/base 2025-08-14T21:21:50.2608066Z * [new branch] gh/malfet/418/head -> origin/gh/malfet/418/head 2025-08-14T21:21:50.2608804Z * [new branch] gh/malfet/418/orig -> origin/gh/malfet/418/orig 2025-08-14T21:21:50.2609227Z * [new branch] gh/malfet/422/base -> origin/gh/malfet/422/base 2025-08-14T21:21:50.2609555Z * [new branch] gh/malfet/422/head -> origin/gh/malfet/422/head 2025-08-14T21:21:50.2609907Z * [new branch] gh/malfet/422/orig -> origin/gh/malfet/422/orig 2025-08-14T21:21:50.2610258Z * [new branch] gh/malfet/438/base -> origin/gh/malfet/438/base 2025-08-14T21:21:50.2610587Z * [new branch] gh/malfet/438/head -> origin/gh/malfet/438/head 2025-08-14T21:21:50.2611191Z * [new branch] gh/malfet/438/orig -> origin/gh/malfet/438/orig 2025-08-14T21:21:50.2617207Z * [new branch] gh/malfet/439/base -> origin/gh/malfet/439/base 2025-08-14T21:21:50.2617633Z * [new branch] gh/malfet/439/head -> origin/gh/malfet/439/head 2025-08-14T21:21:50.2617994Z * [new branch] gh/malfet/439/orig -> origin/gh/malfet/439/orig 2025-08-14T21:21:50.2618352Z * [new branch] gh/malfet/440/base -> origin/gh/malfet/440/base 2025-08-14T21:21:50.2618683Z * [new branch] gh/malfet/440/head -> origin/gh/malfet/440/head 2025-08-14T21:21:50.2618828Z * [new branch] gh/malfet/440/orig -> origin/gh/malfet/440/orig 2025-08-14T21:21:50.2618993Z * [new branch] gh/malfet/441/base -> origin/gh/malfet/441/base 2025-08-14T21:21:50.2619154Z * [new branch] gh/malfet/441/head -> origin/gh/malfet/441/head 2025-08-14T21:21:50.2619455Z * [new branch] gh/malfet/441/orig -> origin/gh/malfet/441/orig 2025-08-14T21:21:50.2619633Z * [new branch] gh/malfet/442/base -> origin/gh/malfet/442/base 2025-08-14T21:21:50.2619777Z * [new branch] gh/malfet/442/head -> origin/gh/malfet/442/head 2025-08-14T21:21:50.2620385Z * [new branch] gh/malfet/442/orig -> origin/gh/malfet/442/orig 2025-08-14T21:21:50.2621173Z * [new branch] gh/malfet/443/base -> origin/gh/malfet/443/base 2025-08-14T21:21:50.2623624Z * [new branch] gh/malfet/443/head -> origin/gh/malfet/443/head 2025-08-14T21:21:50.2623958Z * [new branch] gh/malfet/443/orig -> origin/gh/malfet/443/orig 2025-08-14T21:21:50.2624429Z * [new branch] gh/malfet/444/base -> origin/gh/malfet/444/base 2025-08-14T21:21:50.2624729Z * [new branch] gh/malfet/444/head -> origin/gh/malfet/444/head 2025-08-14T21:21:50.2624957Z * [new branch] gh/malfet/444/orig -> origin/gh/malfet/444/orig 2025-08-14T21:21:50.2632516Z * [new branch] gh/malfet/445/base -> origin/gh/malfet/445/base 2025-08-14T21:21:50.2638200Z * [new branch] gh/malfet/445/head -> origin/gh/malfet/445/head 
2025-08-14T21:21:50.2638698Z * [new branch] gh/malfet/445/orig -> origin/gh/malfet/445/orig 2025-08-14T21:21:50.2638868Z * [new branch] gh/malfet/446/base -> origin/gh/malfet/446/base 2025-08-14T21:21:50.2639019Z * [new branch] gh/malfet/446/head -> origin/gh/malfet/446/head 2025-08-14T21:21:50.2639165Z * [new branch] gh/malfet/446/orig -> origin/gh/malfet/446/orig 2025-08-14T21:21:50.2639483Z * [new branch] gh/malfet/447/base -> origin/gh/malfet/447/base 2025-08-14T21:21:50.2639659Z * [new branch] gh/malfet/447/head -> origin/gh/malfet/447/head 2025-08-14T21:21:50.2639808Z * [new branch] gh/malfet/448/base -> origin/gh/malfet/448/base 2025-08-14T21:21:50.2639957Z * [new branch] gh/malfet/448/head -> origin/gh/malfet/448/head 2025-08-14T21:21:50.2640092Z * [new branch] gh/malfet/449/base -> origin/gh/malfet/449/base 2025-08-14T21:21:50.2640241Z * [new branch] gh/malfet/449/head -> origin/gh/malfet/449/head 2025-08-14T21:21:50.2640390Z * [new branch] gh/malfet/450/base -> origin/gh/malfet/450/base 2025-08-14T21:21:50.2640533Z * [new branch] gh/malfet/450/head -> origin/gh/malfet/450/head 2025-08-14T21:21:50.2640672Z * [new branch] gh/malfet/451/base -> origin/gh/malfet/451/base 2025-08-14T21:21:50.2640847Z * [new branch] gh/malfet/451/head -> origin/gh/malfet/451/head 2025-08-14T21:21:50.2641002Z * [new branch] gh/malfet/452/base -> origin/gh/malfet/452/base 2025-08-14T21:21:50.2645664Z * [new branch] gh/malfet/452/head -> origin/gh/malfet/452/head 2025-08-14T21:21:50.2645842Z * [new branch] gh/malfet/452/orig -> origin/gh/malfet/452/orig 2025-08-14T21:21:50.2645995Z * [new branch] gh/malfet/453/base -> origin/gh/malfet/453/base 2025-08-14T21:21:50.2646160Z * [new branch] gh/malfet/453/head -> origin/gh/malfet/453/head 2025-08-14T21:21:50.2646334Z * [new branch] gh/malfet/453/orig -> origin/gh/malfet/453/orig 2025-08-14T21:21:50.2646484Z * [new branch] gh/malfet/454/base -> origin/gh/malfet/454/base 2025-08-14T21:21:50.2646633Z * [new branch] gh/malfet/454/head -> origin/gh/malfet/454/head 2025-08-14T21:21:50.2646778Z * [new branch] gh/malfet/454/orig -> origin/gh/malfet/454/orig 2025-08-14T21:21:50.2646911Z * [new branch] gh/malfet/455/base -> origin/gh/malfet/455/base 2025-08-14T21:21:50.2647051Z * [new branch] gh/malfet/455/head -> origin/gh/malfet/455/head 2025-08-14T21:21:50.2647199Z * [new branch] gh/malfet/455/orig -> origin/gh/malfet/455/orig 2025-08-14T21:21:50.2647372Z * [new branch] gh/malfet/456/base -> origin/gh/malfet/456/base 2025-08-14T21:21:50.2648278Z * [new branch] gh/malfet/456/head -> origin/gh/malfet/456/head 2025-08-14T21:21:50.2649058Z * [new branch] gh/malfet/456/orig -> origin/gh/malfet/456/orig 2025-08-14T21:21:50.2650129Z * [new branch] gh/malfet/457/base -> origin/gh/malfet/457/base 2025-08-14T21:21:50.2650557Z * [new branch] gh/malfet/457/head -> origin/gh/malfet/457/head 2025-08-14T21:21:50.2651422Z * [new branch] gh/malfet/457/orig -> origin/gh/malfet/457/orig 2025-08-14T21:21:50.2652449Z * [new branch] gh/malfet/458/base -> origin/gh/malfet/458/base 2025-08-14T21:21:50.2656973Z * [new branch] gh/malfet/458/head -> origin/gh/malfet/458/head 2025-08-14T21:21:50.2657294Z * [new branch] gh/malfet/458/orig -> origin/gh/malfet/458/orig 2025-08-14T21:21:50.2657455Z * [new branch] gh/malfet/459/base -> origin/gh/malfet/459/base 2025-08-14T21:21:50.2657596Z * [new branch] gh/malfet/459/head -> origin/gh/malfet/459/head 2025-08-14T21:21:50.2657728Z * [new branch] gh/malfet/459/orig -> origin/gh/malfet/459/orig 2025-08-14T21:21:50.2657871Z * [new branch] 
gh/malfet/460/base -> origin/gh/malfet/460/base 2025-08-14T21:21:50.2660791Z * [new branch] gh/malfet/460/head -> origin/gh/malfet/460/head 2025-08-14T21:21:50.2661042Z * [new branch] gh/malfet/460/orig -> origin/gh/malfet/460/orig 2025-08-14T21:21:50.2661173Z * [new branch] gh/malfet/461/base -> origin/gh/malfet/461/base 2025-08-14T21:21:50.2661313Z * [new branch] gh/malfet/461/head -> origin/gh/malfet/461/head 2025-08-14T21:21:50.2661446Z * [new branch] gh/malfet/461/orig -> origin/gh/malfet/461/orig 2025-08-14T21:21:50.2661596Z * [new branch] gh/malfet/462/base -> origin/gh/malfet/462/base 2025-08-14T21:21:50.2668609Z * [new branch] gh/malfet/462/head -> origin/gh/malfet/462/head 2025-08-14T21:21:50.2672976Z * [new branch] gh/malfet/462/orig -> origin/gh/malfet/462/orig 2025-08-14T21:21:50.2674904Z * [new branch] gh/malfet/463/base -> origin/gh/malfet/463/base 2025-08-14T21:21:50.2675090Z * [new branch] gh/malfet/463/head -> origin/gh/malfet/463/head 2025-08-14T21:21:50.2675248Z * [new branch] gh/malfet/463/orig -> origin/gh/malfet/463/orig 2025-08-14T21:21:50.2675386Z * [new branch] gh/malfet/464/base -> origin/gh/malfet/464/base 2025-08-14T21:21:50.2675531Z * [new branch] gh/malfet/464/head -> origin/gh/malfet/464/head 2025-08-14T21:21:50.2675668Z * [new branch] gh/malfet/464/orig -> origin/gh/malfet/464/orig 2025-08-14T21:21:50.2675804Z * [new branch] gh/malfet/465/base -> origin/gh/malfet/465/base 2025-08-14T21:21:50.2675946Z * [new branch] gh/malfet/465/head -> origin/gh/malfet/465/head 2025-08-14T21:21:50.2676079Z * [new branch] gh/malfet/465/orig -> origin/gh/malfet/465/orig 2025-08-14T21:21:50.2676222Z * [new branch] gh/malfet/466/base -> origin/gh/malfet/466/base 2025-08-14T21:21:50.2676368Z * [new branch] gh/malfet/466/head -> origin/gh/malfet/466/head 2025-08-14T21:21:50.2676506Z * [new branch] gh/malfet/466/orig -> origin/gh/malfet/466/orig 2025-08-14T21:21:50.2676647Z * [new branch] gh/malfet/467/base -> origin/gh/malfet/467/base 2025-08-14T21:21:50.2676782Z * [new branch] gh/malfet/467/head -> origin/gh/malfet/467/head 2025-08-14T21:21:50.2682480Z * [new branch] gh/malfet/467/orig -> origin/gh/malfet/467/orig 2025-08-14T21:21:50.2685388Z * [new branch] gh/malfet/468/base -> origin/gh/malfet/468/base 2025-08-14T21:21:50.2685558Z * [new branch] gh/malfet/468/head -> origin/gh/malfet/468/head 2025-08-14T21:21:50.2686023Z * [new branch] gh/malfet/468/orig -> origin/gh/malfet/468/orig 2025-08-14T21:21:50.2686196Z * [new branch] gh/malfet/469/base -> origin/gh/malfet/469/base 2025-08-14T21:21:50.2686357Z * [new branch] gh/malfet/469/head -> origin/gh/malfet/469/head 2025-08-14T21:21:50.2686671Z * [new branch] gh/malfet/469/orig -> origin/gh/malfet/469/orig 2025-08-14T21:21:50.2686837Z * [new branch] gh/malfet/470/base -> origin/gh/malfet/470/base 2025-08-14T21:21:50.2686983Z * [new branch] gh/malfet/470/head -> origin/gh/malfet/470/head 2025-08-14T21:21:50.2687139Z * [new branch] gh/malfet/470/orig -> origin/gh/malfet/470/orig 2025-08-14T21:21:50.2687283Z * [new branch] gh/malfet/471/base -> origin/gh/malfet/471/base 2025-08-14T21:21:50.2687434Z * [new branch] gh/malfet/471/head -> origin/gh/malfet/471/head 2025-08-14T21:21:50.2687581Z * [new branch] gh/malfet/471/orig -> origin/gh/malfet/471/orig 2025-08-14T21:21:50.2687730Z * [new branch] gh/malfet/472/base -> origin/gh/malfet/472/base 2025-08-14T21:21:50.2687960Z * [new branch] gh/malfet/472/head -> origin/gh/malfet/472/head 2025-08-14T21:21:50.2688110Z * [new branch] gh/malfet/472/orig -> origin/gh/malfet/472/orig 
2025-08-14T21:21:50.2688251Z * [new branch] gh/malfet/473/base -> origin/gh/malfet/473/base 2025-08-14T21:21:50.2688396Z * [new branch] gh/malfet/473/head -> origin/gh/malfet/473/head 2025-08-14T21:21:50.2688545Z * [new branch] gh/malfet/473/orig -> origin/gh/malfet/473/orig 2025-08-14T21:21:50.2688867Z * [new branch] gh/malfet/474/base -> origin/gh/malfet/474/base 2025-08-14T21:21:50.2694337Z * [new branch] gh/malfet/474/head -> origin/gh/malfet/474/head 2025-08-14T21:21:50.2696501Z * [new branch] gh/malfet/474/orig -> origin/gh/malfet/474/orig 2025-08-14T21:21:50.2696666Z * [new branch] gh/malfet/475/base -> origin/gh/malfet/475/base 2025-08-14T21:21:50.2697212Z * [new branch] gh/malfet/475/head -> origin/gh/malfet/475/head 2025-08-14T21:21:50.2697457Z * [new branch] gh/malfet/475/orig -> origin/gh/malfet/475/orig 2025-08-14T21:21:50.2697638Z * [new branch] gh/malfet/476/base -> origin/gh/malfet/476/base 2025-08-14T21:21:50.2697787Z * [new branch] gh/malfet/476/head -> origin/gh/malfet/476/head 2025-08-14T21:21:50.2697926Z * [new branch] gh/malfet/476/orig -> origin/gh/malfet/476/orig 2025-08-14T21:21:50.2698077Z * [new branch] gh/malfet/477/base -> origin/gh/malfet/477/base 2025-08-14T21:21:50.2701492Z * [new branch] gh/malfet/477/head -> origin/gh/malfet/477/head 2025-08-14T21:21:50.2701714Z * [new branch] gh/malfet/477/orig -> origin/gh/malfet/477/orig 2025-08-14T21:21:50.2702355Z * [new branch] gh/malfet/478/base -> origin/gh/malfet/478/base 2025-08-14T21:21:50.2708046Z * [new branch] gh/malfet/478/head -> origin/gh/malfet/478/head 2025-08-14T21:21:50.2715395Z * [new branch] gh/malfet/478/orig -> origin/gh/malfet/478/orig 2025-08-14T21:21:50.2715655Z * [new branch] gh/malfet/479/base -> origin/gh/malfet/479/base 2025-08-14T21:21:50.2718943Z * [new branch] gh/malfet/479/head -> origin/gh/malfet/479/head 2025-08-14T21:21:50.2719222Z * [new branch] gh/malfet/479/orig -> origin/gh/malfet/479/orig 2025-08-14T21:21:50.2719373Z * [new branch] gh/malfet/480/base -> origin/gh/malfet/480/base 2025-08-14T21:21:50.2719600Z * [new branch] gh/malfet/480/head -> origin/gh/malfet/480/head 2025-08-14T21:21:50.2719761Z * [new branch] gh/malfet/480/orig -> origin/gh/malfet/480/orig 2025-08-14T21:21:50.2719980Z * [new branch] gh/malfet/481/base -> origin/gh/malfet/481/base 2025-08-14T21:21:50.2720369Z * [new branch] gh/malfet/481/head -> origin/gh/malfet/481/head 2025-08-14T21:21:50.2720597Z * [new branch] gh/malfet/481/orig -> origin/gh/malfet/481/orig 2025-08-14T21:21:50.2725444Z * [new branch] gh/malfet/482/base -> origin/gh/malfet/482/base 2025-08-14T21:21:50.2725762Z * [new branch] gh/malfet/482/head -> origin/gh/malfet/482/head 2025-08-14T21:21:50.2725940Z * [new branch] gh/malfet/482/orig -> origin/gh/malfet/482/orig 2025-08-14T21:21:50.2726191Z * [new branch] gh/malfet/483/base -> origin/gh/malfet/483/base 2025-08-14T21:21:50.2726459Z * [new branch] gh/malfet/483/head -> origin/gh/malfet/483/head 2025-08-14T21:21:50.2726626Z * [new branch] gh/malfet/483/orig -> origin/gh/malfet/483/orig 2025-08-14T21:21:50.2726786Z * [new branch] gh/malfet/484/base -> origin/gh/malfet/484/base 2025-08-14T21:21:50.2727380Z * [new branch] gh/malfet/484/head -> origin/gh/malfet/484/head 2025-08-14T21:21:50.2727524Z * [new branch] gh/malfet/484/orig -> origin/gh/malfet/484/orig 2025-08-14T21:21:50.2727671Z * [new branch] gh/malfet/485/base -> origin/gh/malfet/485/base 2025-08-14T21:21:50.2727810Z * [new branch] gh/malfet/485/head -> origin/gh/malfet/485/head 2025-08-14T21:21:50.2727951Z * [new branch] 
gh/malfet/485/orig -> origin/gh/malfet/485/orig 2025-08-14T21:21:50.2728097Z * [new branch] gh/malfet/486/base -> origin/gh/malfet/486/base 2025-08-14T21:21:50.2728239Z * [new branch] gh/malfet/486/head -> origin/gh/malfet/486/head 2025-08-14T21:21:50.2728379Z * [new branch] gh/malfet/486/orig -> origin/gh/malfet/486/orig 2025-08-14T21:21:50.2728527Z * [new branch] gh/malfet/487/base -> origin/gh/malfet/487/base 2025-08-14T21:21:50.2728999Z * [new branch] gh/malfet/487/head -> origin/gh/malfet/487/head 2025-08-14T21:21:50.2729161Z * [new branch] gh/malfet/487/orig -> origin/gh/malfet/487/orig 2025-08-14T21:21:50.2729301Z * [new branch] gh/malfet/488/base -> origin/gh/malfet/488/base 2025-08-14T21:21:50.2729441Z * [new branch] gh/malfet/488/head -> origin/gh/malfet/488/head 2025-08-14T21:21:50.2729594Z * [new branch] gh/malfet/488/orig -> origin/gh/malfet/488/orig 2025-08-14T21:21:50.2729734Z * [new branch] gh/malfet/489/base -> origin/gh/malfet/489/base 2025-08-14T21:21:50.2729882Z * [new branch] gh/malfet/489/head -> origin/gh/malfet/489/head 2025-08-14T21:21:50.2730023Z * [new branch] gh/malfet/489/orig -> origin/gh/malfet/489/orig 2025-08-14T21:21:50.2730166Z * [new branch] gh/malfet/490/base -> origin/gh/malfet/490/base 2025-08-14T21:21:50.2730333Z * [new branch] gh/malfet/490/head -> origin/gh/malfet/490/head 2025-08-14T21:21:50.2730472Z * [new branch] gh/malfet/490/orig -> origin/gh/malfet/490/orig 2025-08-14T21:21:50.2730623Z * [new branch] gh/malfet/64/base -> origin/gh/malfet/64/base 2025-08-14T21:21:50.2730762Z * [new branch] gh/malfet/64/head -> origin/gh/malfet/64/head 2025-08-14T21:21:50.2730949Z * [new branch] gh/manuelcandales/10/base -> origin/gh/manuelcandales/10/base 2025-08-14T21:21:50.2731121Z * [new branch] gh/manuelcandales/10/head -> origin/gh/manuelcandales/10/head 2025-08-14T21:21:50.2731285Z * [new branch] gh/manuelcandales/10/orig -> origin/gh/manuelcandales/10/orig 2025-08-14T21:21:50.2731462Z * [new branch] gh/manuelcandales/9/base -> origin/gh/manuelcandales/9/base 2025-08-14T21:21:50.2735066Z * [new branch] gh/manuelcandales/9/head -> origin/gh/manuelcandales/9/head 2025-08-14T21:21:50.2739567Z * [new branch] gh/manuelcandales/9/orig -> origin/gh/manuelcandales/9/orig 2025-08-14T21:21:50.2744266Z * [new branch] gh/markkm/1/base -> origin/gh/markkm/1/base 2025-08-14T21:21:50.2749788Z * [new branch] gh/masnesral/204/base -> origin/gh/masnesral/204/base 2025-08-14T21:21:50.2754313Z * [new branch] gh/masnesral/204/head -> origin/gh/masnesral/204/head 2025-08-14T21:21:50.2754526Z * [new branch] gh/masnesral/204/orig -> origin/gh/masnesral/204/orig 2025-08-14T21:21:50.2754731Z * [new branch] gh/masnesral/223/base -> origin/gh/masnesral/223/base 2025-08-14T21:21:50.2755023Z * [new branch] gh/masnesral/223/head -> origin/gh/masnesral/223/head 2025-08-14T21:21:50.2755196Z * [new branch] gh/masnesral/223/orig -> origin/gh/masnesral/223/orig 2025-08-14T21:21:50.2755684Z * [new branch] gh/masnesral/224/base -> origin/gh/masnesral/224/base 2025-08-14T21:21:50.2755873Z * [new branch] gh/masnesral/224/head -> origin/gh/masnesral/224/head 2025-08-14T21:21:50.2756045Z * [new branch] gh/masnesral/224/orig -> origin/gh/masnesral/224/orig 2025-08-14T21:21:50.2756286Z * [new branch] gh/masnesral/225/base -> origin/gh/masnesral/225/base 2025-08-14T21:21:50.2756456Z * [new branch] gh/masnesral/225/head -> origin/gh/masnesral/225/head 2025-08-14T21:21:50.2756704Z * [new branch] gh/masnesral/225/orig -> origin/gh/masnesral/225/orig 2025-08-14T21:21:50.2756860Z * [new branch] 
gh/masnesral/226/base -> origin/gh/masnesral/226/base 2025-08-14T21:21:50.2757094Z * [new branch] gh/masnesral/226/head -> origin/gh/masnesral/226/head 2025-08-14T21:21:50.2757761Z * [new branch] gh/masnesral/226/orig -> origin/gh/masnesral/226/orig 2025-08-14T21:21:50.2757978Z * [new branch] gh/masnesral/227/base -> origin/gh/masnesral/227/base 2025-08-14T21:21:50.2758135Z * [new branch] gh/masnesral/227/head -> origin/gh/masnesral/227/head 2025-08-14T21:21:50.2758295Z * [new branch] gh/masnesral/227/orig -> origin/gh/masnesral/227/orig 2025-08-14T21:21:50.2758447Z * [new branch] gh/masnesral/228/base -> origin/gh/masnesral/228/base 2025-08-14T21:21:50.2758607Z * [new branch] gh/masnesral/228/head -> origin/gh/masnesral/228/head 2025-08-14T21:21:50.2758761Z * [new branch] gh/masnesral/228/orig -> origin/gh/masnesral/228/orig 2025-08-14T21:21:50.2758909Z * [new branch] gh/masnesral/229/base -> origin/gh/masnesral/229/base 2025-08-14T21:21:50.2759065Z * [new branch] gh/masnesral/229/head -> origin/gh/masnesral/229/head 2025-08-14T21:21:50.2759218Z * [new branch] gh/masnesral/229/orig -> origin/gh/masnesral/229/orig 2025-08-14T21:21:50.2759369Z * [new branch] gh/masnesral/230/base -> origin/gh/masnesral/230/base 2025-08-14T21:21:50.2759522Z * [new branch] gh/masnesral/230/head -> origin/gh/masnesral/230/head 2025-08-14T21:21:50.2759672Z * [new branch] gh/masnesral/230/orig -> origin/gh/masnesral/230/orig 2025-08-14T21:21:50.2759828Z * [new branch] gh/masnesral/231/base -> origin/gh/masnesral/231/base 2025-08-14T21:21:50.2759978Z * [new branch] gh/masnesral/231/head -> origin/gh/masnesral/231/head 2025-08-14T21:21:50.2760128Z * [new branch] gh/masnesral/231/orig -> origin/gh/masnesral/231/orig 2025-08-14T21:21:50.2760460Z * [new branch] gh/masnesral/232/base -> origin/gh/masnesral/232/base 2025-08-14T21:21:50.2760632Z * [new branch] gh/masnesral/232/head -> origin/gh/masnesral/232/head 2025-08-14T21:21:50.2761047Z * [new branch] gh/masnesral/232/orig -> origin/gh/masnesral/232/orig 2025-08-14T21:21:50.2761332Z * [new branch] gh/masnesral/233/base -> origin/gh/masnesral/233/base 2025-08-14T21:21:50.2762046Z * [new branch] gh/masnesral/233/head -> origin/gh/masnesral/233/head 2025-08-14T21:21:50.2762253Z * [new branch] gh/masnesral/233/orig -> origin/gh/masnesral/233/orig 2025-08-14T21:21:50.2766396Z * [new branch] gh/masnesral/234/base -> origin/gh/masnesral/234/base 2025-08-14T21:21:50.2766586Z * [new branch] gh/masnesral/234/head -> origin/gh/masnesral/234/head 2025-08-14T21:21:50.2766740Z * [new branch] gh/masnesral/234/orig -> origin/gh/masnesral/234/orig 2025-08-14T21:21:50.2766898Z * [new branch] gh/masnesral/235/base -> origin/gh/masnesral/235/base 2025-08-14T21:21:50.2767053Z * [new branch] gh/masnesral/235/head -> origin/gh/masnesral/235/head 2025-08-14T21:21:50.2767369Z * [new branch] gh/masnesral/235/orig -> origin/gh/masnesral/235/orig 2025-08-14T21:21:50.2767532Z * [new branch] gh/masnesral/236/base -> origin/gh/masnesral/236/base 2025-08-14T21:21:50.2767689Z * [new branch] gh/masnesral/236/head -> origin/gh/masnesral/236/head 2025-08-14T21:21:50.2767843Z * [new branch] gh/masnesral/236/orig -> origin/gh/masnesral/236/orig 2025-08-14T21:21:50.2768348Z * [new branch] gh/masnesral/34/base -> origin/gh/masnesral/34/base 2025-08-14T21:21:50.2779083Z * [new branch] gh/mhorowitz/0/base -> origin/gh/mhorowitz/0/base 2025-08-14T21:21:50.2779256Z * [new branch] gh/mhorowitz/0/head -> origin/gh/mhorowitz/0/head 2025-08-14T21:21:50.2779740Z * [new branch] gh/mhorowitz/1/base -> 
origin/gh/mhorowitz/1/base 2025-08-14T21:21:50.2779923Z * [new branch] gh/mhorowitz/1/head -> origin/gh/mhorowitz/1/head 2025-08-14T21:21:50.2780086Z * [new branch] gh/mhorowitz/2/base -> origin/gh/mhorowitz/2/base 2025-08-14T21:21:50.2780232Z * [new branch] gh/mhorowitz/2/head -> origin/gh/mhorowitz/2/head 2025-08-14T21:21:50.2780369Z * [new branch] gh/mhorowitz/3/base -> origin/gh/mhorowitz/3/base 2025-08-14T21:21:50.2780510Z * [new branch] gh/mhorowitz/3/head -> origin/gh/mhorowitz/3/head 2025-08-14T21:21:50.2780656Z * [new branch] gh/mhorowitz/4/base -> origin/gh/mhorowitz/4/base 2025-08-14T21:21:50.2780795Z * [new branch] gh/mhorowitz/4/head -> origin/gh/mhorowitz/4/head 2025-08-14T21:21:50.2780943Z * [new branch] gh/mhorowitz/5/base -> origin/gh/mhorowitz/5/base 2025-08-14T21:21:50.2781088Z * [new branch] gh/mhorowitz/5/head -> origin/gh/mhorowitz/5/head 2025-08-14T21:21:50.2781225Z * [new branch] gh/mhorowitz/6/base -> origin/gh/mhorowitz/6/base 2025-08-14T21:21:50.2782138Z * [new branch] gh/mhorowitz/6/head -> origin/gh/mhorowitz/6/head 2025-08-14T21:21:50.2782440Z * [new branch] gh/mikaylagawarecki/234/base -> origin/gh/mikaylagawarecki/234/base 2025-08-14T21:21:50.2782632Z * [new branch] gh/mikaylagawarecki/234/head -> origin/gh/mikaylagawarecki/234/head 2025-08-14T21:21:50.2783094Z * [new branch] gh/mikaylagawarecki/235/base -> origin/gh/mikaylagawarecki/235/base 2025-08-14T21:21:50.2783300Z * [new branch] gh/mikaylagawarecki/235/head -> origin/gh/mikaylagawarecki/235/head 2025-08-14T21:21:50.2787623Z * [new branch] gh/mikaylagawarecki/236/base -> origin/gh/mikaylagawarecki/236/base 2025-08-14T21:21:50.2788273Z * [new branch] gh/mikaylagawarecki/236/head -> origin/gh/mikaylagawarecki/236/head 2025-08-14T21:21:50.2788489Z * [new branch] gh/mikaylagawarecki/237/base -> origin/gh/mikaylagawarecki/237/base 2025-08-14T21:21:50.2788865Z * [new branch] gh/mikaylagawarecki/237/head -> origin/gh/mikaylagawarecki/237/head 2025-08-14T21:21:50.2789062Z * [new branch] gh/mikaylagawarecki/238/base -> origin/gh/mikaylagawarecki/238/base 2025-08-14T21:21:50.2789244Z * [new branch] gh/mikaylagawarecki/238/head -> origin/gh/mikaylagawarecki/238/head 2025-08-14T21:21:50.2789468Z * [new branch] gh/mikaylagawarecki/313/base -> origin/gh/mikaylagawarecki/313/base 2025-08-14T21:21:50.2796107Z * [new branch] gh/mikaylagawarecki/313/head -> origin/gh/mikaylagawarecki/313/head 2025-08-14T21:21:50.2796321Z * [new branch] gh/mikaylagawarecki/313/orig -> origin/gh/mikaylagawarecki/313/orig 2025-08-14T21:21:50.2796522Z * [new branch] gh/mikaylagawarecki/317/base -> origin/gh/mikaylagawarecki/317/base 2025-08-14T21:21:50.2796707Z * [new branch] gh/mikaylagawarecki/317/head -> origin/gh/mikaylagawarecki/317/head 2025-08-14T21:21:50.2797109Z * [new branch] gh/mikaylagawarecki/317/orig -> origin/gh/mikaylagawarecki/317/orig 2025-08-14T21:21:50.2797287Z * [new branch] gh/mikaylagawarecki/318/base -> origin/gh/mikaylagawarecki/318/base 2025-08-14T21:21:50.2797469Z * [new branch] gh/mikaylagawarecki/318/head -> origin/gh/mikaylagawarecki/318/head 2025-08-14T21:21:50.2797657Z * [new branch] gh/mikaylagawarecki/318/orig -> origin/gh/mikaylagawarecki/318/orig 2025-08-14T21:21:50.2798113Z * [new branch] gh/mikaylagawarecki/319/base -> origin/gh/mikaylagawarecki/319/base 2025-08-14T21:21:50.2798308Z * [new branch] gh/mikaylagawarecki/319/head -> origin/gh/mikaylagawarecki/319/head 2025-08-14T21:21:50.2798488Z * [new branch] gh/mikaylagawarecki/319/orig -> origin/gh/mikaylagawarecki/319/orig 2025-08-14T21:21:50.2798668Z 
* [new branch] gh/mikaylagawarecki/320/base -> origin/gh/mikaylagawarecki/320/base 2025-08-14T21:21:50.2798873Z * [new branch] gh/mikaylagawarecki/320/head -> origin/gh/mikaylagawarecki/320/head 2025-08-14T21:21:50.2799043Z * [new branch] gh/mikaylagawarecki/320/orig -> origin/gh/mikaylagawarecki/320/orig 2025-08-14T21:21:50.2805183Z * [new branch] gh/mikaylagawarecki/321/base -> origin/gh/mikaylagawarecki/321/base 2025-08-14T21:21:50.2805402Z * [new branch] gh/mikaylagawarecki/321/head -> origin/gh/mikaylagawarecki/321/head 2025-08-14T21:21:50.2805592Z * [new branch] gh/mikaylagawarecki/321/orig -> origin/gh/mikaylagawarecki/321/orig 2025-08-14T21:21:50.2805774Z * [new branch] gh/mikaylagawarecki/322/base -> origin/gh/mikaylagawarecki/322/base 2025-08-14T21:21:50.2805954Z * [new branch] gh/mikaylagawarecki/322/head -> origin/gh/mikaylagawarecki/322/head 2025-08-14T21:21:50.2806138Z * [new branch] gh/mikaylagawarecki/322/orig -> origin/gh/mikaylagawarecki/322/orig 2025-08-14T21:21:50.2806336Z * [new branch] gh/mikaylagawarecki/323/base -> origin/gh/mikaylagawarecki/323/base 2025-08-14T21:21:50.2806515Z * [new branch] gh/mikaylagawarecki/323/head -> origin/gh/mikaylagawarecki/323/head 2025-08-14T21:21:50.2806701Z * [new branch] gh/mikaylagawarecki/323/orig -> origin/gh/mikaylagawarecki/323/orig 2025-08-14T21:21:50.2806885Z * [new branch] gh/mikaylagawarecki/324/base -> origin/gh/mikaylagawarecki/324/base 2025-08-14T21:21:50.2807066Z * [new branch] gh/mikaylagawarecki/324/head -> origin/gh/mikaylagawarecki/324/head 2025-08-14T21:21:50.2807619Z * [new branch] gh/mikaylagawarecki/324/orig -> origin/gh/mikaylagawarecki/324/orig 2025-08-14T21:21:50.2808233Z * [new branch] gh/mikaylagawarecki/325/base -> origin/gh/mikaylagawarecki/325/base 2025-08-14T21:21:50.2809265Z * [new branch] gh/mikaylagawarecki/325/head -> origin/gh/mikaylagawarecki/325/head 2025-08-14T21:21:50.2809948Z * [new branch] gh/mikaylagawarecki/325/orig -> origin/gh/mikaylagawarecki/325/orig 2025-08-14T21:21:50.2810413Z * [new branch] gh/mikaylagawarecki/326/base -> origin/gh/mikaylagawarecki/326/base 2025-08-14T21:21:50.2813223Z * [new branch] gh/mikaylagawarecki/326/head -> origin/gh/mikaylagawarecki/326/head 2025-08-14T21:21:50.2813440Z * [new branch] gh/mikaylagawarecki/326/orig -> origin/gh/mikaylagawarecki/326/orig 2025-08-14T21:21:50.2818778Z * [new branch] gh/mikaylagawarecki/327/base -> origin/gh/mikaylagawarecki/327/base 2025-08-14T21:21:50.2819148Z * [new branch] gh/mikaylagawarecki/327/head -> origin/gh/mikaylagawarecki/327/head 2025-08-14T21:21:50.2819415Z * [new branch] gh/mikaylagawarecki/327/orig -> origin/gh/mikaylagawarecki/327/orig 2025-08-14T21:21:50.2819614Z * [new branch] gh/mikaylagawarecki/328/base -> origin/gh/mikaylagawarecki/328/base 2025-08-14T21:21:50.2820179Z * [new branch] gh/mikaylagawarecki/328/head -> origin/gh/mikaylagawarecki/328/head 2025-08-14T21:21:50.2820472Z * [new branch] gh/mikaylagawarecki/328/orig -> origin/gh/mikaylagawarecki/328/orig 2025-08-14T21:21:50.2820766Z * [new branch] gh/mikaylagawarecki/329/base -> origin/gh/mikaylagawarecki/329/base 2025-08-14T21:21:50.2820972Z * [new branch] gh/mikaylagawarecki/329/head -> origin/gh/mikaylagawarecki/329/head 2025-08-14T21:21:50.2821244Z * [new branch] gh/mikaylagawarecki/329/orig -> origin/gh/mikaylagawarecki/329/orig 2025-08-14T21:21:50.2821437Z * [new branch] gh/mikaylagawarecki/330/base -> origin/gh/mikaylagawarecki/330/base 2025-08-14T21:21:50.2822107Z * [new branch] gh/mikaylagawarecki/330/head -> origin/gh/mikaylagawarecki/330/head 
2025-08-14T21:21:50.2822353Z * [new branch] gh/mikaylagawarecki/330/orig -> origin/gh/mikaylagawarecki/330/orig 2025-08-14T21:21:50.2823850Z * [new branch] gh/mikaylagawarecki/331/base -> origin/gh/mikaylagawarecki/331/base 2025-08-14T21:21:50.2824195Z * [new branch] gh/mikaylagawarecki/331/head -> origin/gh/mikaylagawarecki/331/head 2025-08-14T21:21:50.2824519Z * [new branch] gh/mikaylagawarecki/331/orig -> origin/gh/mikaylagawarecki/331/orig 2025-08-14T21:21:50.2826525Z * [new branch] gh/mikaylagawarecki/332/base -> origin/gh/mikaylagawarecki/332/base 2025-08-14T21:21:50.2826726Z * [new branch] gh/mikaylagawarecki/332/head -> origin/gh/mikaylagawarecki/332/head 2025-08-14T21:21:50.2826955Z * [new branch] gh/mikaylagawarecki/332/orig -> origin/gh/mikaylagawarecki/332/orig 2025-08-14T21:21:50.2832278Z * [new branch] gh/mikaylagawarecki/333/base -> origin/gh/mikaylagawarecki/333/base 2025-08-14T21:21:50.2832478Z * [new branch] gh/mikaylagawarecki/333/head -> origin/gh/mikaylagawarecki/333/head 2025-08-14T21:21:50.2832668Z * [new branch] gh/mikaylagawarecki/333/orig -> origin/gh/mikaylagawarecki/333/orig 2025-08-14T21:21:50.2832842Z * [new branch] gh/mikaylagawarecki/334/base -> origin/gh/mikaylagawarecki/334/base 2025-08-14T21:21:50.2833016Z * [new branch] gh/mikaylagawarecki/334/head -> origin/gh/mikaylagawarecki/334/head 2025-08-14T21:21:50.2833179Z * [new branch] gh/mikaylagawarecki/334/orig -> origin/gh/mikaylagawarecki/334/orig 2025-08-14T21:21:50.2833331Z * [new branch] gh/mlazos/1/base -> origin/gh/mlazos/1/base 2025-08-14T21:21:50.2833483Z * [new branch] gh/mlazos/1/head -> origin/gh/mlazos/1/head 2025-08-14T21:21:50.2833964Z * [new branch] gh/mlazos/1/orig -> origin/gh/mlazos/1/orig 2025-08-14T21:21:50.2835828Z * [new branch] gh/mlazos/10/base -> origin/gh/mlazos/10/base 2025-08-14T21:21:50.2836142Z * [new branch] gh/mlazos/10/head -> origin/gh/mlazos/10/head 2025-08-14T21:21:50.2836726Z * [new branch] gh/mlazos/10/orig -> origin/gh/mlazos/10/orig 2025-08-14T21:21:50.2838685Z * [new branch] gh/mlazos/11/base -> origin/gh/mlazos/11/base 2025-08-14T21:21:50.2839018Z * [new branch] gh/mlazos/11/head -> origin/gh/mlazos/11/head 2025-08-14T21:21:50.2839193Z * [new branch] gh/mlazos/11/orig -> origin/gh/mlazos/11/orig 2025-08-14T21:21:50.2842070Z * [new branch] gh/mlazos/12/base -> origin/gh/mlazos/12/base 2025-08-14T21:21:50.2842399Z * [new branch] gh/mlazos/12/head -> origin/gh/mlazos/12/head 2025-08-14T21:21:50.2842569Z * [new branch] gh/mlazos/12/orig -> origin/gh/mlazos/12/orig 2025-08-14T21:21:50.2842799Z * [new branch] gh/mlazos/13/base -> origin/gh/mlazos/13/base 2025-08-14T21:21:50.2844237Z * [new branch] gh/mlazos/13/head -> origin/gh/mlazos/13/head 2025-08-14T21:21:50.2844568Z * [new branch] gh/mlazos/13/orig -> origin/gh/mlazos/13/orig 2025-08-14T21:21:50.2845150Z * [new branch] gh/mlazos/2/base -> origin/gh/mlazos/2/base 2025-08-14T21:21:50.2845292Z * [new branch] gh/mlazos/2/head -> origin/gh/mlazos/2/head 2025-08-14T21:21:50.2846303Z * [new branch] gh/mlazos/2/orig -> origin/gh/mlazos/2/orig 2025-08-14T21:21:50.2847089Z * [new branch] gh/mlazos/3/base -> origin/gh/mlazos/3/base 2025-08-14T21:21:50.2847658Z * [new branch] gh/mlazos/3/head -> origin/gh/mlazos/3/head 2025-08-14T21:21:50.2848593Z * [new branch] gh/mlazos/3/orig -> origin/gh/mlazos/3/orig 2025-08-14T21:21:50.2849812Z * [new branch] gh/mlazos/4/base -> origin/gh/mlazos/4/base 2025-08-14T21:21:50.2850115Z * [new branch] gh/mlazos/4/head -> origin/gh/mlazos/4/head 2025-08-14T21:21:50.2850997Z * [new branch] 
gh/mlazos/4/orig -> origin/gh/mlazos/4/orig 2025-08-14T21:21:50.2852046Z * [new branch] gh/mlazos/5/base -> origin/gh/mlazos/5/base 2025-08-14T21:21:50.2852459Z * [new branch] gh/mlazos/5/head -> origin/gh/mlazos/5/head 2025-08-14T21:21:50.2853391Z * [new branch] gh/mlazos/5/orig -> origin/gh/mlazos/5/orig 2025-08-14T21:21:50.2854276Z * [new branch] gh/mlazos/6/base -> origin/gh/mlazos/6/base 2025-08-14T21:21:50.2854724Z * [new branch] gh/mlazos/6/head -> origin/gh/mlazos/6/head 2025-08-14T21:21:50.2855779Z * [new branch] gh/mlazos/6/orig -> origin/gh/mlazos/6/orig 2025-08-14T21:21:50.2856823Z * [new branch] gh/mlazos/7/base -> origin/gh/mlazos/7/base 2025-08-14T21:21:50.2857144Z * [new branch] gh/mlazos/7/head -> origin/gh/mlazos/7/head 2025-08-14T21:21:50.2858175Z * [new branch] gh/mlazos/7/orig -> origin/gh/mlazos/7/orig 2025-08-14T21:21:50.2858899Z * [new branch] gh/mlazos/8/base -> origin/gh/mlazos/8/base 2025-08-14T21:21:50.2859638Z * [new branch] gh/mlazos/8/head -> origin/gh/mlazos/8/head 2025-08-14T21:21:50.2860141Z * [new branch] gh/mlazos/8/orig -> origin/gh/mlazos/8/orig 2025-08-14T21:21:50.2861383Z * [new branch] gh/mlazos/9/base -> origin/gh/mlazos/9/base 2025-08-14T21:21:50.2862144Z * [new branch] gh/mlazos/9/head -> origin/gh/mlazos/9/head 2025-08-14T21:21:50.2865845Z * [new branch] gh/mlazos/9/orig -> origin/gh/mlazos/9/orig 2025-08-14T21:21:50.2866022Z * [new branch] gh/mrmiywj/1/base -> origin/gh/mrmiywj/1/base 2025-08-14T21:21:50.2866167Z * [new branch] gh/mrmiywj/1/head -> origin/gh/mrmiywj/1/head 2025-08-14T21:21:50.2866362Z * [new branch] gh/muchulee8/62/base -> origin/gh/muchulee8/62/base 2025-08-14T21:21:50.2867098Z * [new branch] gh/muchulee8/62/head -> origin/gh/muchulee8/62/head 2025-08-14T21:21:50.2873208Z * [new branch] gh/muchulee8/62/orig -> origin/gh/muchulee8/62/orig 2025-08-14T21:21:50.2873384Z * [new branch] gh/muchulee8/63/base -> origin/gh/muchulee8/63/base 2025-08-14T21:21:50.2873537Z * [new branch] gh/muchulee8/63/head -> origin/gh/muchulee8/63/head 2025-08-14T21:21:50.2873690Z * [new branch] gh/muchulee8/63/orig -> origin/gh/muchulee8/63/orig 2025-08-14T21:21:50.2873869Z * [new branch] gh/muchulee8/64/base -> origin/gh/muchulee8/64/base 2025-08-14T21:21:50.2874021Z * [new branch] gh/muchulee8/64/head -> origin/gh/muchulee8/64/head 2025-08-14T21:21:50.2879270Z * [new branch] gh/muchulee8/64/orig -> origin/gh/muchulee8/64/orig 2025-08-14T21:21:50.2879634Z * [new branch] gh/muchulee8/65/base -> origin/gh/muchulee8/65/base 2025-08-14T21:21:50.2879782Z * [new branch] gh/muchulee8/65/head -> origin/gh/muchulee8/65/head 2025-08-14T21:21:50.2879928Z * [new branch] gh/muchulee8/65/orig -> origin/gh/muchulee8/65/orig 2025-08-14T21:21:50.2880109Z * [new branch] gh/oulgen/35/base -> origin/gh/oulgen/35/base 2025-08-14T21:21:50.2880251Z * [new branch] gh/oulgen/35/head -> origin/gh/oulgen/35/head 2025-08-14T21:21:50.2880566Z * [new branch] gh/oulgen/35/orig -> origin/gh/oulgen/35/orig 2025-08-14T21:21:50.2880725Z * [new branch] gh/oulgen/44/base -> origin/gh/oulgen/44/base 2025-08-14T21:21:50.2880861Z * [new branch] gh/oulgen/44/head -> origin/gh/oulgen/44/head 2025-08-14T21:21:50.2881139Z * [new branch] gh/oulgen/44/orig -> origin/gh/oulgen/44/orig 2025-08-14T21:21:50.2881287Z * [new branch] gh/oulgen/45/base -> origin/gh/oulgen/45/base 2025-08-14T21:21:50.2881810Z * [new branch] gh/oulgen/45/head -> origin/gh/oulgen/45/head 2025-08-14T21:21:50.2884841Z * [new branch] gh/oulgen/45/orig -> origin/gh/oulgen/45/orig 2025-08-14T21:21:50.2885003Z * [new branch] 
gh/oulgen/46/base -> origin/gh/oulgen/46/base 2025-08-14T21:21:50.2885157Z * [new branch] gh/oulgen/46/head -> origin/gh/oulgen/46/head 2025-08-14T21:21:50.2885296Z * [new branch] gh/oulgen/46/orig -> origin/gh/oulgen/46/orig 2025-08-14T21:21:50.2885428Z * [new branch] gh/oulgen/47/base -> origin/gh/oulgen/47/base 2025-08-14T21:21:50.2885569Z * [new branch] gh/oulgen/47/head -> origin/gh/oulgen/47/head 2025-08-14T21:21:50.2885964Z * [new branch] gh/oulgen/47/orig -> origin/gh/oulgen/47/orig 2025-08-14T21:21:50.2886837Z * [new branch] gh/pearu/108/base -> origin/gh/pearu/108/base 2025-08-14T21:21:50.2888360Z * [new branch] gh/pearu/108/head -> origin/gh/pearu/108/head 2025-08-14T21:21:50.2889095Z * [new branch] gh/pearu/108/orig -> origin/gh/pearu/108/orig 2025-08-14T21:21:50.2892951Z * [new branch] gh/pearu/56/base -> origin/gh/pearu/56/base 2025-08-14T21:21:50.2893114Z * [new branch] gh/pearu/56/head -> origin/gh/pearu/56/head 2025-08-14T21:21:50.2893263Z * [new branch] gh/pearu/56/orig -> origin/gh/pearu/56/orig 2025-08-14T21:21:50.2893410Z * [new branch] gh/pearu/97/base -> origin/gh/pearu/97/base 2025-08-14T21:21:50.2899514Z * [new branch] gh/pearu/97/head -> origin/gh/pearu/97/head 2025-08-14T21:21:50.2899853Z * [new branch] gh/pearu/97/orig -> origin/gh/pearu/97/orig 2025-08-14T21:21:50.2900177Z * [new branch] gh/qqaatw/29/base -> origin/gh/qqaatw/29/base 2025-08-14T21:21:50.2900318Z * [new branch] gh/qqaatw/29/head -> origin/gh/qqaatw/29/head 2025-08-14T21:21:50.2900458Z * [new branch] gh/qqaatw/29/orig -> origin/gh/qqaatw/29/orig 2025-08-14T21:21:50.2900680Z * [new branch] gh/raymo/cleanup-dynamo-logging -> origin/gh/raymo/cleanup-dynamo-logging 2025-08-14T21:21:50.2900848Z * [new branch] gh/raymo/refresh-script -> origin/gh/raymo/refresh-script 2025-08-14T21:21:50.2903943Z * [new branch] gh/rec/141/base -> origin/gh/rec/141/base 2025-08-14T21:21:50.2904193Z * [new branch] gh/rec/141/head -> origin/gh/rec/141/head 2025-08-14T21:21:50.2904323Z * [new branch] gh/rec/153/base -> origin/gh/rec/153/base 2025-08-14T21:21:50.2904441Z * [new branch] gh/rec/153/head -> origin/gh/rec/153/head 2025-08-14T21:21:50.2904795Z * [new branch] gh/rec/153/orig -> origin/gh/rec/153/orig 2025-08-14T21:21:50.2910556Z * [new branch] gh/rec/154/base -> origin/gh/rec/154/base 2025-08-14T21:21:50.2912506Z * [new branch] gh/rec/154/head -> origin/gh/rec/154/head 2025-08-14T21:21:50.2912666Z * [new branch] gh/rec/154/orig -> origin/gh/rec/154/orig 2025-08-14T21:21:50.2912790Z * [new branch] gh/rec/156/base -> origin/gh/rec/156/base 2025-08-14T21:21:50.2912922Z * [new branch] gh/rec/156/head -> origin/gh/rec/156/head 2025-08-14T21:21:50.2920164Z * [new branch] gh/rec/156/orig -> origin/gh/rec/156/orig 2025-08-14T21:21:50.2920327Z * [new branch] gh/rec/158/base -> origin/gh/rec/158/base 2025-08-14T21:21:50.2920469Z * [new branch] gh/rec/158/head -> origin/gh/rec/158/head 2025-08-14T21:21:50.2920645Z * [new branch] gh/rec/158/orig -> origin/gh/rec/158/orig 2025-08-14T21:21:50.2920784Z * [new branch] gh/rec/159/base -> origin/gh/rec/159/base 2025-08-14T21:21:50.2920914Z * [new branch] gh/rec/159/head -> origin/gh/rec/159/head 2025-08-14T21:21:50.2921045Z * [new branch] gh/rec/160/base -> origin/gh/rec/160/base 2025-08-14T21:21:50.2924057Z * [new branch] gh/rec/160/head -> origin/gh/rec/160/head 2025-08-14T21:21:50.2924207Z * [new branch] gh/rec/160/orig -> origin/gh/rec/160/orig 2025-08-14T21:21:50.2924440Z * [new branch] gh/rec/161/base -> origin/gh/rec/161/base 2025-08-14T21:21:50.2924579Z * [new branch] 
gh/rec/161/head -> origin/gh/rec/161/head 2025-08-14T21:21:50.2924712Z * [new branch] gh/rec/161/orig -> origin/gh/rec/161/orig 2025-08-14T21:21:50.2924861Z * [new branch] gh/rec/162/base -> origin/gh/rec/162/base 2025-08-14T21:21:50.2924994Z * [new branch] gh/rec/162/head -> origin/gh/rec/162/head 2025-08-14T21:21:50.2925127Z * [new branch] gh/rec/162/orig -> origin/gh/rec/162/orig 2025-08-14T21:21:50.2925253Z * [new branch] gh/rec/163/base -> origin/gh/rec/163/base 2025-08-14T21:21:50.2925378Z * [new branch] gh/rec/163/head -> origin/gh/rec/163/head 2025-08-14T21:21:50.2925524Z * [new branch] gh/rec/163/orig -> origin/gh/rec/163/orig 2025-08-14T21:21:50.2926553Z * [new branch] gh/rec/164/base -> origin/gh/rec/164/base 2025-08-14T21:21:50.2927013Z * [new branch] gh/rec/164/head -> origin/gh/rec/164/head 2025-08-14T21:21:50.2927171Z * [new branch] gh/rec/164/orig -> origin/gh/rec/164/orig 2025-08-14T21:21:50.2929320Z * [new branch] gh/robert-hardwick/1/base -> origin/gh/robert-hardwick/1/base 2025-08-14T21:21:50.2929510Z * [new branch] gh/robert-hardwick/1/head -> origin/gh/robert-hardwick/1/head 2025-08-14T21:21:50.2930059Z * [new branch] gh/robert-hardwick/1/orig -> origin/gh/robert-hardwick/1/orig 2025-08-14T21:21:50.2933775Z * [new branch] gh/robert-hardwick/2/base -> origin/gh/robert-hardwick/2/base 2025-08-14T21:21:50.2934128Z * [new branch] gh/robert-hardwick/2/head -> origin/gh/robert-hardwick/2/head 2025-08-14T21:21:50.2934504Z * [new branch] gh/robert-hardwick/2/orig -> origin/gh/robert-hardwick/2/orig 2025-08-14T21:21:50.2938466Z * [new branch] gh/robert-hardwick/3/base -> origin/gh/robert-hardwick/3/base 2025-08-14T21:21:50.2938802Z * [new branch] gh/robert-hardwick/3/head -> origin/gh/robert-hardwick/3/head 2025-08-14T21:21:50.2939063Z * [new branch] gh/robert-hardwick/3/orig -> origin/gh/robert-hardwick/3/orig 2025-08-14T21:21:50.2939518Z * [new branch] gh/robert-hardwick/4/base -> origin/gh/robert-hardwick/4/base 2025-08-14T21:21:50.2939728Z * [new branch] gh/robert-hardwick/4/head -> origin/gh/robert-hardwick/4/head 2025-08-14T21:21:50.2939892Z * [new branch] gh/robert-hardwick/4/orig -> origin/gh/robert-hardwick/4/orig 2025-08-14T21:21:50.2940166Z * [new branch] gh/rtimpe/1/base -> origin/gh/rtimpe/1/base 2025-08-14T21:21:50.2942810Z * [new branch] gh/rtimpe/1/head -> origin/gh/rtimpe/1/head 2025-08-14T21:21:50.2943176Z * [new branch] gh/rtimpe/10/base -> origin/gh/rtimpe/10/base 2025-08-14T21:21:50.2943330Z * [new branch] gh/rtimpe/10/head -> origin/gh/rtimpe/10/head 2025-08-14T21:21:50.2943469Z * [new branch] gh/rtimpe/10/orig -> origin/gh/rtimpe/10/orig 2025-08-14T21:21:50.2943622Z * [new branch] gh/rtimpe/11/base -> origin/gh/rtimpe/11/base 2025-08-14T21:21:50.2943897Z * [new branch] gh/rtimpe/11/head -> origin/gh/rtimpe/11/head 2025-08-14T21:21:50.2944598Z * [new branch] gh/rtimpe/11/orig -> origin/gh/rtimpe/11/orig 2025-08-14T21:21:50.2948318Z * [new branch] gh/rtimpe/12/base -> origin/gh/rtimpe/12/base 2025-08-14T21:21:50.2948490Z * [new branch] gh/rtimpe/12/head -> origin/gh/rtimpe/12/head 2025-08-14T21:21:50.2949100Z * [new branch] gh/rtimpe/12/orig -> origin/gh/rtimpe/12/orig 2025-08-14T21:21:50.2949285Z * [new branch] gh/rtimpe/2/base -> origin/gh/rtimpe/2/base 2025-08-14T21:21:50.2949423Z * [new branch] gh/rtimpe/2/head -> origin/gh/rtimpe/2/head 2025-08-14T21:21:50.2949577Z * [new branch] gh/rtimpe/3/base -> origin/gh/rtimpe/3/base 2025-08-14T21:21:50.2949770Z * [new branch] gh/rtimpe/3/head -> origin/gh/rtimpe/3/head 2025-08-14T21:21:50.2954757Z * [new 
branch] gh/rtimpe/4/base -> origin/gh/rtimpe/4/base 2025-08-14T21:21:50.2960252Z * [new branch] gh/rtimpe/4/head -> origin/gh/rtimpe/4/head 2025-08-14T21:21:50.2962493Z * [new branch] gh/rtimpe/5/base -> origin/gh/rtimpe/5/base 2025-08-14T21:21:50.2962651Z * [new branch] gh/rtimpe/5/head -> origin/gh/rtimpe/5/head 2025-08-14T21:21:50.2962828Z * [new branch] gh/rtimpe/5/orig -> origin/gh/rtimpe/5/orig 2025-08-14T21:21:50.2962962Z * [new branch] gh/rtimpe/6/base -> origin/gh/rtimpe/6/base 2025-08-14T21:21:50.2963103Z * [new branch] gh/rtimpe/6/head -> origin/gh/rtimpe/6/head 2025-08-14T21:21:50.2963235Z * [new branch] gh/rtimpe/6/orig -> origin/gh/rtimpe/6/orig 2025-08-14T21:21:50.2963388Z * [new branch] gh/rtimpe/7/base -> origin/gh/rtimpe/7/base 2025-08-14T21:21:50.2963685Z * [new branch] gh/rtimpe/7/head -> origin/gh/rtimpe/7/head 2025-08-14T21:21:50.2963821Z * [new branch] gh/rtimpe/7/orig -> origin/gh/rtimpe/7/orig 2025-08-14T21:21:50.2963967Z * [new branch] gh/rtimpe/8/base -> origin/gh/rtimpe/8/base 2025-08-14T21:21:50.2964109Z * [new branch] gh/rtimpe/8/head -> origin/gh/rtimpe/8/head 2025-08-14T21:21:50.2964259Z * [new branch] gh/rtimpe/8/orig -> origin/gh/rtimpe/8/orig 2025-08-14T21:21:50.2964400Z * [new branch] gh/rtimpe/9/base -> origin/gh/rtimpe/9/base 2025-08-14T21:21:50.2964540Z * [new branch] gh/rtimpe/9/head -> origin/gh/rtimpe/9/head 2025-08-14T21:21:50.2964687Z * [new branch] gh/rtimpe/9/orig -> origin/gh/rtimpe/9/orig 2025-08-14T21:21:50.2964927Z * [new branch] gh/ruisizhang123/1/base -> origin/gh/ruisizhang123/1/base 2025-08-14T21:21:50.2965093Z * [new branch] gh/ruisizhang123/1/head -> origin/gh/ruisizhang123/1/head 2025-08-14T21:21:50.2965252Z * [new branch] gh/ruisizhang123/1/orig -> origin/gh/ruisizhang123/1/orig 2025-08-14T21:21:50.2966460Z * [new branch] gh/ruisizhang123/4/base -> origin/gh/ruisizhang123/4/base 2025-08-14T21:21:50.2966792Z * [new branch] gh/ruisizhang123/4/head -> origin/gh/ruisizhang123/4/head 2025-08-14T21:21:50.2966977Z * [new branch] gh/ruisizhang123/4/orig -> origin/gh/ruisizhang123/4/orig 2025-08-14T21:21:50.2967148Z * [new branch] gh/ruisizhang123/5/base -> origin/gh/ruisizhang123/5/base 2025-08-14T21:21:50.2974292Z * [new branch] gh/ruisizhang123/5/head -> origin/gh/ruisizhang123/5/head 2025-08-14T21:21:50.2976766Z * [new branch] gh/ruisizhang123/5/orig -> origin/gh/ruisizhang123/5/orig 2025-08-14T21:21:50.2977148Z * [new branch] gh/ruisizhang123/6/base -> origin/gh/ruisizhang123/6/base 2025-08-14T21:21:50.2977332Z * [new branch] gh/ruisizhang123/6/head -> origin/gh/ruisizhang123/6/head 2025-08-14T21:21:50.2977495Z * [new branch] gh/ruisizhang123/6/orig -> origin/gh/ruisizhang123/6/orig 2025-08-14T21:21:50.2977649Z * [new branch] gh/ruisizhang123/7/base -> origin/gh/ruisizhang123/7/base 2025-08-14T21:21:50.2977800Z * [new branch] gh/ruisizhang123/7/head -> origin/gh/ruisizhang123/7/head 2025-08-14T21:21:50.2977957Z * [new branch] gh/ruisizhang123/7/orig -> origin/gh/ruisizhang123/7/orig 2025-08-14T21:21:50.2978106Z * [new branch] gh/ruisizhang123/8/base -> origin/gh/ruisizhang123/8/base 2025-08-14T21:21:50.2978266Z * [new branch] gh/ruisizhang123/8/head -> origin/gh/ruisizhang123/8/head 2025-08-14T21:21:50.2978432Z * [new branch] gh/ruisizhang123/8/orig -> origin/gh/ruisizhang123/8/orig 2025-08-14T21:21:50.2981732Z * [new branch] gh/sarckk/2/base -> origin/gh/sarckk/2/base 2025-08-14T21:21:50.2984947Z * [new branch] gh/sarckk/2/head -> origin/gh/sarckk/2/head 2025-08-14T21:21:50.2985521Z * [new branch] gh/sarckk/2/orig -> 
origin/gh/sarckk/2/orig 2025-08-14T21:21:50.2985709Z * [new branch] gh/seemethere/23/head -> origin/gh/seemethere/23/head 2025-08-14T21:21:50.2985873Z * [new branch] gh/seemethere/24/base -> origin/gh/seemethere/24/base 2025-08-14T21:21:50.2986032Z * [new branch] gh/seemethere/24/head -> origin/gh/seemethere/24/head 2025-08-14T21:21:50.2986186Z * [new branch] gh/seemethere/24/orig -> origin/gh/seemethere/24/orig 2025-08-14T21:21:50.2986342Z * [new branch] gh/seemethere/30/base -> origin/gh/seemethere/30/base 2025-08-14T21:21:50.2986529Z * [new branch] gh/seemethere/30/head -> origin/gh/seemethere/30/head 2025-08-14T21:21:50.2986835Z * [new branch] gh/seemethere/30/orig -> origin/gh/seemethere/30/orig 2025-08-14T21:21:50.2986996Z * [new branch] gh/seemethere/32/base -> origin/gh/seemethere/32/base 2025-08-14T21:21:50.2991957Z * [new branch] gh/seemethere/32/head -> origin/gh/seemethere/32/head 2025-08-14T21:21:50.2996401Z * [new branch] gh/seemethere/32/orig -> origin/gh/seemethere/32/orig 2025-08-14T21:21:50.3001639Z * [new branch] gh/seemethere/33/base -> origin/gh/seemethere/33/base 2025-08-14T21:21:50.3003826Z * [new branch] gh/seemethere/33/head -> origin/gh/seemethere/33/head 2025-08-14T21:21:50.3004009Z * [new branch] gh/seemethere/33/orig -> origin/gh/seemethere/33/orig 2025-08-14T21:21:50.3004155Z * [new branch] gh/seemethere/34/base -> origin/gh/seemethere/34/base 2025-08-14T21:21:50.3004620Z * [new branch] gh/seemethere/34/head -> origin/gh/seemethere/34/head 2025-08-14T21:21:50.3004788Z * [new branch] gh/seemethere/34/orig -> origin/gh/seemethere/34/orig 2025-08-14T21:21:50.3004935Z * [new branch] gh/seemethere/35/base -> origin/gh/seemethere/35/base 2025-08-14T21:21:50.3005096Z * [new branch] gh/seemethere/35/head -> origin/gh/seemethere/35/head 2025-08-14T21:21:50.3005242Z * [new branch] gh/seemethere/35/orig -> origin/gh/seemethere/35/orig 2025-08-14T21:21:50.3005399Z * [new branch] gh/seemethere/37/base -> origin/gh/seemethere/37/base 2025-08-14T21:21:50.3005553Z * [new branch] gh/seemethere/37/head -> origin/gh/seemethere/37/head 2025-08-14T21:21:50.3005705Z * [new branch] gh/seemethere/37/orig -> origin/gh/seemethere/37/orig 2025-08-14T21:21:50.3005856Z * [new branch] gh/seemethere/39/base -> origin/gh/seemethere/39/base 2025-08-14T21:21:50.3006019Z * [new branch] gh/seemethere/39/head -> origin/gh/seemethere/39/head 2025-08-14T21:21:50.3006164Z * [new branch] gh/seemethere/39/orig -> origin/gh/seemethere/39/orig 2025-08-14T21:21:50.3006320Z * [new branch] gh/seemethere/40/base -> origin/gh/seemethere/40/base 2025-08-14T21:21:50.3006463Z * [new branch] gh/seemethere/40/head -> origin/gh/seemethere/40/head 2025-08-14T21:21:50.3006623Z * [new branch] gh/seemethere/40/orig -> origin/gh/seemethere/40/orig 2025-08-14T21:21:50.3006767Z * [new branch] gh/seemethere/41/base -> origin/gh/seemethere/41/base 2025-08-14T21:21:50.3006911Z * [new branch] gh/seemethere/41/head -> origin/gh/seemethere/41/head 2025-08-14T21:21:50.3007066Z * [new branch] gh/seemethere/41/orig -> origin/gh/seemethere/41/orig 2025-08-14T21:21:50.3007213Z * [new branch] gh/seemethere/42/base -> origin/gh/seemethere/42/base 2025-08-14T21:21:50.3007374Z * [new branch] gh/seemethere/42/head -> origin/gh/seemethere/42/head 2025-08-14T21:21:50.3007522Z * [new branch] gh/seemethere/42/orig -> origin/gh/seemethere/42/orig 2025-08-14T21:21:50.3007669Z * [new branch] gh/seemethere/43/base -> origin/gh/seemethere/43/base 2025-08-14T21:21:50.3007819Z * [new branch] gh/seemethere/43/head -> origin/gh/seemethere/43/head 
2025-08-14T21:21:50.3007972Z * [new branch] gh/seemethere/43/orig -> origin/gh/seemethere/43/orig 2025-08-14T21:21:50.3008114Z * [new branch] gh/seemethere/44/base -> origin/gh/seemethere/44/base 2025-08-14T21:21:50.3014410Z * [new branch] gh/seemethere/44/head -> origin/gh/seemethere/44/head 2025-08-14T21:21:50.3016528Z * [new branch] gh/seemethere/44/orig -> origin/gh/seemethere/44/orig 2025-08-14T21:21:50.3017059Z * [new branch] gh/seemethere/45/base -> origin/gh/seemethere/45/base 2025-08-14T21:21:50.3024000Z * [new branch] gh/seemethere/45/head -> origin/gh/seemethere/45/head 2025-08-14T21:21:50.3029643Z * [new branch] gh/seemethere/45/orig -> origin/gh/seemethere/45/orig 2025-08-14T21:21:50.3035902Z * [new branch] gh/seemethere/46/base -> origin/gh/seemethere/46/base 2025-08-14T21:21:50.3040386Z * [new branch] gh/seemethere/46/head -> origin/gh/seemethere/46/head 2025-08-14T21:21:50.3040562Z * [new branch] gh/seemethere/46/orig -> origin/gh/seemethere/46/orig 2025-08-14T21:21:50.3041039Z * [new branch] gh/seemethere/47/base -> origin/gh/seemethere/47/base 2025-08-14T21:21:50.3041217Z * [new branch] gh/seemethere/47/head -> origin/gh/seemethere/47/head 2025-08-14T21:21:50.3041383Z * [new branch] gh/seemethere/47/orig -> origin/gh/seemethere/47/orig 2025-08-14T21:21:50.3041709Z * [new branch] gh/seemethere/48/base -> origin/gh/seemethere/48/base 2025-08-14T21:21:50.3041947Z * [new branch] gh/seemethere/48/head -> origin/gh/seemethere/48/head 2025-08-14T21:21:50.3042103Z * [new branch] gh/seemethere/48/orig -> origin/gh/seemethere/48/orig 2025-08-14T21:21:50.3042248Z * [new branch] gh/seemethere/49/base -> origin/gh/seemethere/49/base 2025-08-14T21:21:50.3042401Z * [new branch] gh/seemethere/49/head -> origin/gh/seemethere/49/head 2025-08-14T21:21:50.3042547Z * [new branch] gh/seemethere/49/orig -> origin/gh/seemethere/49/orig 2025-08-14T21:21:50.3042703Z * [new branch] gh/seemethere/50/base -> origin/gh/seemethere/50/base 2025-08-14T21:21:50.3042843Z * [new branch] gh/seemethere/50/head -> origin/gh/seemethere/50/head 2025-08-14T21:21:50.3042994Z * [new branch] gh/seemethere/50/orig -> origin/gh/seemethere/50/orig 2025-08-14T21:21:50.3043147Z * [new branch] gh/seemethere/51/base -> origin/gh/seemethere/51/base 2025-08-14T21:21:50.3043294Z * [new branch] gh/seemethere/51/head -> origin/gh/seemethere/51/head 2025-08-14T21:21:50.3043506Z * [new branch] gh/seemethere/51/orig -> origin/gh/seemethere/51/orig 2025-08-14T21:21:50.3043692Z * [new branch] gh/seemethere/52/base -> origin/gh/seemethere/52/base 2025-08-14T21:21:50.3043849Z * [new branch] gh/seemethere/52/head -> origin/gh/seemethere/52/head 2025-08-14T21:21:50.3044009Z * [new branch] gh/seemethere/52/orig -> origin/gh/seemethere/52/orig 2025-08-14T21:21:50.3044160Z * [new branch] gh/seemethere/53/base -> origin/gh/seemethere/53/base 2025-08-14T21:21:50.3044314Z * [new branch] gh/seemethere/53/head -> origin/gh/seemethere/53/head 2025-08-14T21:21:50.3044464Z * [new branch] gh/seemethere/53/orig -> origin/gh/seemethere/53/orig 2025-08-14T21:21:50.3044609Z * [new branch] gh/seemethere/54/base -> origin/gh/seemethere/54/base 2025-08-14T21:21:50.3044765Z * [new branch] gh/seemethere/54/head -> origin/gh/seemethere/54/head 2025-08-14T21:21:50.3044917Z * [new branch] gh/seemethere/54/orig -> origin/gh/seemethere/54/orig 2025-08-14T21:21:50.3045067Z * [new branch] gh/seemethere/55/base -> origin/gh/seemethere/55/base 2025-08-14T21:21:50.3045250Z * [new branch] gh/seemethere/55/head -> origin/gh/seemethere/55/head 
2025-08-14T21:21:50.3045410Z  * [new branch]          gh/seemethere/55/orig -> origin/gh/seemethere/55/orig
[... several hundred additional "* [new branch] gh/<user>/<n>/{base,head,orig} -> origin/gh/<user>/<n>/{base,head,orig}" fetch entries (plus a few ".../next" refs), covering users from seemethere through zklaus, timestamps 2025-08-14T21:21:50.3045Z to 21:21:50.3648Z, elided from this excerpt; the final entry is truncated in the source log ...]
[new branch] gh/zklaus/14/orig -> origin/gh/zklaus/14/orig 2025-08-14T21:21:50.3650189Z * [new branch] gh/zklaus/15/base -> origin/gh/zklaus/15/base 2025-08-14T21:21:50.3650333Z * [new branch] gh/zklaus/15/head -> origin/gh/zklaus/15/head 2025-08-14T21:21:50.3650880Z * [new branch] gh/zklaus/15/orig -> origin/gh/zklaus/15/orig 2025-08-14T21:21:50.3652324Z * [new branch] gh/zklaus/16/base -> origin/gh/zklaus/16/base 2025-08-14T21:21:50.3652625Z * [new branch] gh/zklaus/16/head -> origin/gh/zklaus/16/head 2025-08-14T21:21:50.3652971Z * [new branch] gh/zklaus/16/orig -> origin/gh/zklaus/16/orig 2025-08-14T21:21:50.3659704Z * [new branch] gh/zklaus/17/base -> origin/gh/zklaus/17/base 2025-08-14T21:21:50.3659906Z * [new branch] gh/zklaus/17/head -> origin/gh/zklaus/17/head 2025-08-14T21:21:50.3660053Z * [new branch] gh/zklaus/17/orig -> origin/gh/zklaus/17/orig 2025-08-14T21:21:50.3660198Z * [new branch] gh/zklaus/18/base -> origin/gh/zklaus/18/base 2025-08-14T21:21:50.3660348Z * [new branch] gh/zklaus/18/head -> origin/gh/zklaus/18/head 2025-08-14T21:21:50.3660504Z * [new branch] gh/zklaus/18/orig -> origin/gh/zklaus/18/orig 2025-08-14T21:21:50.3660650Z * [new branch] gh/zklaus/19/base -> origin/gh/zklaus/19/base 2025-08-14T21:21:50.3660802Z * [new branch] gh/zklaus/19/head -> origin/gh/zklaus/19/head 2025-08-14T21:21:50.3660944Z * [new branch] gh/zklaus/19/orig -> origin/gh/zklaus/19/orig 2025-08-14T21:21:50.3661951Z * [new branch] gh/zklaus/7/base -> origin/gh/zklaus/7/base 2025-08-14T21:21:50.3662104Z * [new branch] gh/zklaus/7/head -> origin/gh/zklaus/7/head 2025-08-14T21:21:50.3662246Z * [new branch] gh/zklaus/7/orig -> origin/gh/zklaus/7/orig 2025-08-14T21:21:50.3670086Z * [new branch] gh/zklaus/9/base -> origin/gh/zklaus/9/base 2025-08-14T21:21:50.3672015Z * [new branch] gh/zklaus/9/head -> origin/gh/zklaus/9/head 2025-08-14T21:21:50.3672320Z * [new branch] gh/zklaus/9/orig -> origin/gh/zklaus/9/orig 2025-08-14T21:21:50.3676010Z * [new branch] gh/zou3519/1175/base -> origin/gh/zou3519/1175/base 2025-08-14T21:21:50.3676269Z * [new branch] gh/zou3519/1175/head -> origin/gh/zou3519/1175/head 2025-08-14T21:21:50.3681632Z * [new branch] gh/zou3519/1175/orig -> origin/gh/zou3519/1175/orig 2025-08-14T21:21:50.3684260Z * [new branch] gh/zou3519/1177/base -> origin/gh/zou3519/1177/base 2025-08-14T21:21:50.3684452Z * [new branch] gh/zou3519/1177/head -> origin/gh/zou3519/1177/head 2025-08-14T21:21:50.3684599Z * [new branch] gh/zou3519/1177/orig -> origin/gh/zou3519/1177/orig 2025-08-14T21:21:50.3684839Z * [new branch] gh/zou3519/1187/base -> origin/gh/zou3519/1187/base 2025-08-14T21:21:50.3685025Z * [new branch] gh/zou3519/1187/head -> origin/gh/zou3519/1187/head 2025-08-14T21:21:50.3685316Z * [new branch] gh/zou3519/1187/orig -> origin/gh/zou3519/1187/orig 2025-08-14T21:21:50.3685467Z * [new branch] gh/zou3519/1188/base -> origin/gh/zou3519/1188/base 2025-08-14T21:21:50.3685642Z * [new branch] gh/zou3519/1188/head -> origin/gh/zou3519/1188/head 2025-08-14T21:21:50.3685793Z * [new branch] gh/zou3519/1188/orig -> origin/gh/zou3519/1188/orig 2025-08-14T21:21:50.3685939Z * [new branch] gh/zou3519/1189/base -> origin/gh/zou3519/1189/base 2025-08-14T21:21:50.3686079Z * [new branch] gh/zou3519/1189/head -> origin/gh/zou3519/1189/head 2025-08-14T21:21:50.3686223Z * [new branch] gh/zou3519/1189/orig -> origin/gh/zou3519/1189/orig 2025-08-14T21:21:50.3686371Z * [new branch] gh/zou3519/1190/base -> origin/gh/zou3519/1190/base 2025-08-14T21:21:50.3686577Z * [new branch] gh/zou3519/1190/head -> 
origin/gh/zou3519/1190/head 2025-08-14T21:21:50.3686765Z * [new branch] gh/zou3519/1190/orig -> origin/gh/zou3519/1190/orig 2025-08-14T21:21:50.3686918Z * [new branch] gh/zou3519/1191/base -> origin/gh/zou3519/1191/base 2025-08-14T21:21:50.3687060Z * [new branch] gh/zou3519/1191/head -> origin/gh/zou3519/1191/head 2025-08-14T21:21:50.3687231Z * [new branch] gh/zou3519/1191/orig -> origin/gh/zou3519/1191/orig 2025-08-14T21:21:50.3687386Z * [new branch] gh/zpcore/1/base -> origin/gh/zpcore/1/base 2025-08-14T21:21:50.3687535Z * [new branch] gh/zpcore/1/head -> origin/gh/zpcore/1/head 2025-08-14T21:21:50.3687678Z * [new branch] gh/zpcore/10/base -> origin/gh/zpcore/10/base 2025-08-14T21:21:50.3687817Z * [new branch] gh/zpcore/10/head -> origin/gh/zpcore/10/head 2025-08-14T21:21:50.3687962Z * [new branch] gh/zpcore/10/orig -> origin/gh/zpcore/10/orig 2025-08-14T21:21:50.3688109Z * [new branch] gh/zpcore/11/base -> origin/gh/zpcore/11/base 2025-08-14T21:21:50.3688254Z * [new branch] gh/zpcore/11/head -> origin/gh/zpcore/11/head 2025-08-14T21:21:50.3688392Z * [new branch] gh/zpcore/11/orig -> origin/gh/zpcore/11/orig 2025-08-14T21:21:50.3688532Z * [new branch] gh/zpcore/12/base -> origin/gh/zpcore/12/base 2025-08-14T21:21:50.3692535Z * [new branch] gh/zpcore/12/head -> origin/gh/zpcore/12/head 2025-08-14T21:21:50.3692677Z * [new branch] gh/zpcore/12/orig -> origin/gh/zpcore/12/orig 2025-08-14T21:21:50.3692822Z * [new branch] gh/zpcore/2/base -> origin/gh/zpcore/2/base 2025-08-14T21:21:50.3692957Z * [new branch] gh/zpcore/2/head -> origin/gh/zpcore/2/head 2025-08-14T21:21:50.3693094Z * [new branch] gh/zpcore/3/base -> origin/gh/zpcore/3/base 2025-08-14T21:21:50.3697247Z * [new branch] gh/zpcore/3/head -> origin/gh/zpcore/3/head 2025-08-14T21:21:50.3697591Z * [new branch] gh/zpcore/4/base -> origin/gh/zpcore/4/base 2025-08-14T21:21:50.3697748Z * [new branch] gh/zpcore/4/head -> origin/gh/zpcore/4/head 2025-08-14T21:21:50.3697895Z * [new branch] gh/zpcore/5/base -> origin/gh/zpcore/5/base 2025-08-14T21:21:50.3698043Z * [new branch] gh/zpcore/5/head -> origin/gh/zpcore/5/head 2025-08-14T21:21:50.3698185Z * [new branch] gh/zpcore/6/base -> origin/gh/zpcore/6/base 2025-08-14T21:21:50.3698330Z * [new branch] gh/zpcore/6/head -> origin/gh/zpcore/6/head 2025-08-14T21:21:50.3701009Z * [new branch] gh/zpcore/7/base -> origin/gh/zpcore/7/base 2025-08-14T21:21:50.3701177Z * [new branch] gh/zpcore/7/head -> origin/gh/zpcore/7/head 2025-08-14T21:21:50.3701471Z * [new branch] gh/zpcore/8/base -> origin/gh/zpcore/8/base 2025-08-14T21:21:50.3701616Z * [new branch] gh/zpcore/8/head -> origin/gh/zpcore/8/head 2025-08-14T21:21:50.3701762Z * [new branch] gh/zpcore/9/head -> origin/gh/zpcore/9/head 2025-08-14T21:21:50.3701902Z * [new branch] gh/zpcore/9/orig -> origin/gh/zpcore/9/orig 2025-08-14T21:21:50.3705373Z * [new branch] google-main -> origin/google-main 2025-08-14T21:21:50.3705872Z * [new branch] guangyey/external_stream -> origin/guangyey/external_stream 2025-08-14T21:21:50.3706061Z * [new branch] guangyey/host_alloc -> origin/guangyey/host_alloc 2025-08-14T21:21:50.3711661Z * [new branch] guangyey/test_2025 -> origin/guangyey/test_2025 2025-08-14T21:21:50.3712168Z * [new branch] guilhermeleobas/cherry-pick-55d87d9dfd9 -> origin/guilhermeleobas/cherry-pick-55d87d9dfd9 2025-08-14T21:21:50.3712347Z * [new branch] haozhe/bf16-dynamic-shape -> origin/haozhe/bf16-dynamic-shape 2025-08-14T21:21:50.3712473Z * [new branch] hc_baseline -> origin/hc_baseline 2025-08-14T21:21:50.3716341Z * [new branch] 
headeronlyScalarType -> origin/headeronlyScalarType 2025-08-14T21:21:50.3716554Z * [new branch] hf_update -> origin/hf_update 2025-08-14T21:21:50.3716722Z * [new branch] hhh_decomp_mul -> origin/hhh_decomp_mul 2025-08-14T21:21:50.3716860Z * [new branch] hhh_rand -> origin/hhh_rand 2025-08-14T21:21:50.3716996Z * [new branch] hoy/mmsplitk -> origin/hoy/mmsplitk 2025-08-14T21:21:50.3717287Z * [new branch] hoy/triton-PR3973 -> origin/hoy/triton-PR3973 2025-08-14T21:21:50.3717507Z * [new branch] hoy/triton-coalescing-baseline -> origin/hoy/triton-coalescing-baseline 2025-08-14T21:21:50.3717706Z * [new branch] hoy/triton-coalescing-min -> origin/hoy/triton-coalescing-min 2025-08-14T21:21:50.3717890Z * [new branch] hoy/triton-coalescing-new -> origin/hoy/triton-coalescing-new 2025-08-14T21:21:50.3718042Z * [new branch] hoy/triton-coalescing-vec -> origin/hoy/triton-coalescing-vec 2025-08-14T21:21:50.3718189Z * [new branch] inductordecompfix -> origin/inductordecompfix 2025-08-14T21:21:50.3718479Z * [new branch] inline -> origin/inline 2025-08-14T21:21:50.3718665Z * [new branch] inlining -> origin/inlining 2025-08-14T21:21:50.3724386Z * [new branch] inlining-ezyang -> origin/inlining-ezyang 2025-08-14T21:21:50.3724554Z * [new branch] int8_sdpa -> origin/int8_sdpa 2025-08-14T21:21:50.3724742Z * [new branch] invoke-subgraph -> origin/invoke-subgraph 2025-08-14T21:21:50.3724887Z * [new branch] issue#58739 -> origin/issue#58739 2025-08-14T21:21:50.3725017Z * [new branch] issue-154849 -> origin/issue-154849 2025-08-14T21:21:50.3725233Z * [new branch] ivanov/cherry-pick-ckpt-fixes -> origin/ivanov/cherry-pick-ckpt-fixes 2025-08-14T21:21:50.3725462Z * [new branch] jcaip/test-cusparselt-version-0.6.2 -> origin/jcaip/test-cusparselt-version-0.6.2 2025-08-14T21:21:50.3725659Z * [new branch] jcaip/update-cusparselt-0.6.2 -> origin/jcaip/update-cusparselt-0.6.2 2025-08-14T21:21:50.3725825Z * [new branch] jithunnair-amd-patch-1 -> origin/jithunnair-amd-patch-1 2025-08-14T21:21:50.3726247Z * [new branch] justinchu/attention-tests -> origin/justinchu/attention-tests 2025-08-14T21:21:50.3726780Z * [new branch] justinchu/native-qdq -> origin/justinchu/native-qdq 2025-08-14T21:21:50.3728377Z * [new branch] justinchuby/JitScalarType -> origin/justinchuby/JitScalarType 2025-08-14T21:21:50.3729079Z * [new branch] justinchuby/dynamo-true -> origin/justinchuby/dynamo-true 2025-08-14T21:21:50.3729442Z * [new branch] justinchuby/opset-20 -> origin/justinchuby/opset-20 2025-08-14T21:21:50.3733367Z * [new branch] kainan666/xlf_debug -> origin/kainan666/xlf_debug 2025-08-14T21:21:50.3733523Z * [new branch] kainan_test -> origin/kainan_test 2025-08-14T21:21:50.3733739Z * [new branch] leslie/enable_poc_reduction_fusion -> origin/leslie/enable_poc_reduction_fusion 2025-08-14T21:21:50.3739695Z * [new branch] leslie/test_group_gemm_epilogues -> origin/leslie/test_group_gemm_epilogues 2025-08-14T21:21:50.3739910Z * [new branch] lessw2020/fix_cutlass_cache_error -> origin/lessw2020/fix_cutlass_cache_error 2025-08-14T21:21:50.3740288Z * [new branch] liaoxuan/shm_all_reduce -> origin/liaoxuan/shm_all_reduce 2025-08-14T21:21:50.3740440Z * [new branch] liaoxuan/tags_issue -> origin/liaoxuan/tags_issue 2025-08-14T21:21:50.3740632Z * [new branch] liaoxuan/test_fa_disable_softmax -> origin/liaoxuan/test_fa_disable_softmax 2025-08-14T21:21:50.3740790Z * [new branch] liaoxuan/test_int8_sdpa -> origin/liaoxuan/test_int8_sdpa 2025-08-14T21:21:50.3740934Z * [new branch] lintbuilddocker -> origin/lintbuilddocker 2025-08-14T21:21:50.3741082Z * 
[new branch] llama4-stable -> origin/llama4-stable 2025-08-14T21:21:50.3741211Z * [new branch] logdetfix -> origin/logdetfix 2025-08-14T21:21:50.3745502Z * [new branch] lts/release/1.8 -> origin/lts/release/1.8 2025-08-14T21:21:50.3745672Z * [new branch] lucaskabela/#94773 -> origin/lucaskabela/#94773 2025-08-14T21:21:50.3745854Z * [new branch] lucaskabela/fix_157452 -> origin/lucaskabela/fix_157452 2025-08-14T21:21:50.3746082Z * [new branch] lucaskabela/fix_circular_import_158120 -> origin/lucaskabela/fix_circular_import_158120 2025-08-14T21:21:50.3746263Z * [new branch] lucaskabela/func_under_decomp -> origin/lucaskabela/func_under_decomp 2025-08-14T21:21:50.3746471Z * [new branch] lucaskabela/functional_in_dynamo -> origin/lucaskabela/functional_in_dynamo 2025-08-14T21:21:50.3747023Z * [new branch] lucaskabela/install_params_as_graph_attr -> origin/lucaskabela/install_params_as_graph_attr 2025-08-14T21:21:50.3747205Z * [new branch] lucaskabela/issue_120648 -> origin/lucaskabela/issue_120648 2025-08-14T21:21:50.3747417Z * [new branch] lucaskabela/parameters_as_graph_attr -> origin/lucaskabela/parameters_as_graph_attr 2025-08-14T21:21:50.3747583Z * [new branch] lucaskabela/registry_fix -> origin/lucaskabela/registry_fix 2025-08-14T21:21:50.3747830Z * [new branch] lucaskabela/remove_aot_dispatcher_metadata -> origin/lucaskabela/remove_aot_dispatcher_metadata 2025-08-14T21:21:50.3747991Z * [new branch] lucaskabela/type_guards -> origin/lucaskabela/type_guards 2025-08-14T21:21:50.3748162Z * [new branch] lucaskabela/typing-misc -> origin/lucaskabela/typing-misc 2025-08-14T21:21:50.3754219Z * [new branch] lucaskabela/typing_backends -> origin/lucaskabela/typing_backends 2025-08-14T21:21:50.3754921Z * [new branch] lucaskabela/typing_bytecode_analysis_transform -> origin/lucaskabela/typing_bytecode_analysis_transform 2025-08-14T21:21:50.3755300Z * [new branch] lucaskabela/typing_cache_files -> origin/lucaskabela/typing_cache_files 2025-08-14T21:21:50.3755539Z * [new branch] lucaskabela/typing_compile_autograd -> origin/lucaskabela/typing_compile_autograd 2025-08-14T21:21:50.3755896Z * [new branch] lucaskabela/typing_debug_utils.py -> origin/lucaskabela/typing_debug_utils.py 2025-08-14T21:21:50.3756081Z * [new branch] lucaskabela/typing_decorators -> origin/lucaskabela/typing_decorators 2025-08-14T21:21:50.3756253Z * [new branch] lucaskabela/typing_eval_frame -> origin/lucaskabela/typing_eval_frame 2025-08-14T21:21:50.3756579Z * [new branch] lucaskabela/typing_for_codegen -> origin/lucaskabela/typing_for_codegen 2025-08-14T21:21:50.3756779Z * [new branch] lucaskabela/typing_output_graph -> origin/lucaskabela/typing_output_graph 2025-08-14T21:21:50.3757068Z * [new branch] lucaskabela/typing_side_effects -> origin/lucaskabela/typing_side_effects 2025-08-14T21:21:50.3757361Z * [new branch] lucaskabela/typing_source_guard -> origin/lucaskabela/typing_source_guard 2025-08-14T21:21:50.3757688Z * [new branch] lucaskabela/typing_trace_rules -> origin/lucaskabela/typing_trace_rules 2025-08-14T21:21:50.3758081Z * [new branch] lucaskabela/typing_utils.py -> origin/lucaskabela/typing_utils.py 2025-08-14T21:21:50.3758422Z * [new branch] lucaskabela/typing_utils_improvements -> origin/lucaskabela/typing_utils_improvements 2025-08-14T21:21:50.3762693Z * [new branch] main -> origin/main 2025-08-14T21:21:50.3763335Z * [new branch] main-enable-b200-distributed-tests -> origin/main-enable-b200-distributed-tests 2025-08-14T21:21:50.3763517Z * [new branch] malfet-patch-1 -> origin/malfet-patch-1 
2025-08-14T21:21:50.3763676Z * [new branch] malfet-patch-10 -> origin/malfet-patch-10 2025-08-14T21:21:50.3763822Z * [new branch] malfet-patch-11 -> origin/malfet-patch-11 2025-08-14T21:21:50.3763974Z * [new branch] malfet-patch-13 -> origin/malfet-patch-13 2025-08-14T21:21:50.3764446Z * [new branch] malfet-patch-14 -> origin/malfet-patch-14 2025-08-14T21:21:50.3764632Z * [new branch] malfet-patch-2 -> origin/malfet-patch-2 2025-08-14T21:21:50.3764783Z * [new branch] malfet-patch-3 -> origin/malfet-patch-3 2025-08-14T21:21:50.3764916Z * [new branch] malfet-patch-4 -> origin/malfet-patch-4 2025-08-14T21:21:50.3765046Z * [new branch] malfet-patch-5 -> origin/malfet-patch-5 2025-08-14T21:21:50.3765193Z * [new branch] malfet-patch-6 -> origin/malfet-patch-6 2025-08-14T21:21:50.3766061Z * [new branch] malfet-patch-7 -> origin/malfet-patch-7 2025-08-14T21:21:50.3768054Z * [new branch] malfet-patch-8 -> origin/malfet-patch-8 2025-08-14T21:21:50.3768230Z * [new branch] malfet-patch-9 -> origin/malfet-patch-9 2025-08-14T21:21:50.3769297Z * [new branch] malfet/delete-upsteam-cuda -> origin/malfet/delete-upsteam-cuda 2025-08-14T21:21:50.3769722Z * [new branch] malfet/mps-implement-col2im -> origin/malfet/mps-implement-col2im 2025-08-14T21:21:50.3778039Z * [new branch] manuel/fix_multidim_boolean_indexing -> origin/manuel/fix_multidim_boolean_indexing 2025-08-14T21:21:50.3782998Z * [new branch] manuel/np_empty_ellipsis -> origin/manuel/np_empty_ellipsis 2025-08-14T21:21:50.3787424Z * [new branch] manuel/test-ops-common-allow-mps -> origin/manuel/test-ops-common-allow-mps 2025-08-14T21:21:50.3792570Z * [new branch] metascroy-patch-1 -> origin/metascroy-patch-1 2025-08-14T21:21:50.3794397Z * [new branch] mlazos/S429861-debug -> origin/mlazos/S429861-debug 2025-08-14T21:21:50.3794540Z * [new branch] mlazos/aa -> origin/mlazos/aa 2025-08-14T21:21:50.3794712Z * [new branch] mlazos/arg-renames -> origin/mlazos/arg-renames 2025-08-14T21:21:50.3795107Z * [new branch] mlazos/backup-test-branch -> origin/mlazos/backup-test-branch 2025-08-14T21:21:50.3795271Z * [new branch] mlazos/bad-cudagraphs -> origin/mlazos/bad-cudagraphs 2025-08-14T21:21:50.3795423Z * [new branch] mlazos/baseline -> origin/mlazos/baseline 2025-08-14T21:21:50.3795613Z * [new branch] mlazos/baseline-graph-breaks -> origin/mlazos/baseline-graph-breaks 2025-08-14T21:21:50.3795756Z * [new branch] mlazos/beta-tensor -> origin/mlazos/beta-tensor 2025-08-14T21:21:50.3795882Z * [new branch] mlazos/buffers -> origin/mlazos/buffers 2025-08-14T21:21:50.3796008Z * [new branch] mlazos/buffers2 -> origin/mlazos/buffers2 2025-08-14T21:21:50.3796142Z * [new branch] mlazos/buffers3 -> origin/mlazos/buffers3 2025-08-14T21:21:50.3796264Z * [new branch] mlazos/ck2 -> origin/mlazos/ck2 2025-08-14T21:21:50.3796474Z * [new branch] mlazos/combokernels -> origin/mlazos/combokernels 2025-08-14T21:21:50.3796612Z * [new branch] mlazos/ctx-cleanup -> origin/mlazos/ctx-cleanup 2025-08-14T21:21:50.3796769Z * [new branch] mlazos/cudagraph-tests -> origin/mlazos/cudagraph-tests 2025-08-14T21:21:50.3796961Z * [new branch] mlazos/cudagraphs-measurement -> origin/mlazos/cudagraphs-measurement 2025-08-14T21:21:50.3797102Z * [new branch] mlazos/cutlass-test -> origin/mlazos/cutlass-test 2025-08-14T21:21:50.3797262Z * [new branch] mlazos/cutlass-topo-bug -> origin/mlazos/cutlass-topo-bug 2025-08-14T21:21:50.3797403Z * [new branch] mlazos/data-gather -> origin/mlazos/data-gather 2025-08-14T21:21:50.3797551Z * [new branch] mlazos/data-ptrs2 -> origin/mlazos/data-ptrs2 
2025-08-14T21:21:50.3797694Z * [new branch] mlazos/data-ptrs3 -> origin/mlazos/data-ptrs3 2025-08-14T21:21:50.3797856Z * [new branch] mlazos/dataclass-proxy -> origin/mlazos/dataclass-proxy 2025-08-14T21:21:50.3797985Z * [new branch] mlazos/dc-attrs -> origin/mlazos/dc-attrs 2025-08-14T21:21:50.3798128Z * [new branch] mlazos/dc-helion -> origin/mlazos/dc-helion 2025-08-14T21:21:50.3798259Z * [new branch] mlazos/dict-fix -> origin/mlazos/dict-fix 2025-08-14T21:21:50.3798422Z * [new branch] mlazos/disable-closures -> origin/mlazos/disable-closures 2025-08-14T21:21:50.3798558Z * [new branch] mlazos/disable-tf -> origin/mlazos/disable-tf 2025-08-14T21:21:50.3798689Z * [new branch] mlazos/dupe-fix -> origin/mlazos/dupe-fix 2025-08-14T21:21:50.3798830Z * [new branch] mlazos/dyn-batch -> origin/mlazos/dyn-batch 2025-08-14T21:21:50.3798954Z * [new branch] mlazos/evt -> origin/mlazos/evt 2025-08-14T21:21:50.3799107Z * [new branch] mlazos/exp_disable -> origin/mlazos/exp_disable 2025-08-14T21:21:50.3799260Z * [new branch] mlazos/extract-examples -> origin/mlazos/extract-examples 2025-08-14T21:21:50.3799396Z * [new branch] mlazos/foreach-op -> origin/mlazos/foreach-op 2025-08-14T21:21:50.3799525Z * [new branch] mlazos/fp8 -> origin/mlazos/fp8 2025-08-14T21:21:50.3799682Z * [new branch] mlazos/fp8-bias -> origin/mlazos/fp8-bias 2025-08-14T21:21:50.3800258Z * [new branch] mlazos/fp8-bias-fusion -> origin/mlazos/fp8-bias-fusion 2025-08-14T21:21:50.3800428Z * [new branch] mlazos/freezing -> origin/mlazos/freezing 2025-08-14T21:21:50.3800574Z * [new branch] mlazos/h-comp -> origin/mlazos/h-comp 2025-08-14T21:21:50.3800708Z * [new branch] mlazos/h-comp2 -> origin/mlazos/h-comp2 2025-08-14T21:21:50.3801062Z * [new branch] mlazos/hash-hop -> origin/mlazos/hash-hop 2025-08-14T21:21:50.3805161Z * [new branch] mlazos/hc -> origin/mlazos/hc 2025-08-14T21:21:50.3805328Z * [new branch] mlazos/hc-cycles -> origin/mlazos/hc-cycles 2025-08-14T21:21:50.3805493Z * [new branch] mlazos/hc-fixes -> origin/mlazos/hc-fixes 2025-08-14T21:21:50.3805633Z * [new branch] mlazos/hc-fixes3 -> origin/mlazos/hc-fixes3 2025-08-14T21:21:50.3805776Z * [new branch] mlazos/hc-fixes4 -> origin/mlazos/hc-fixes4 2025-08-14T21:21:50.3805908Z * [new branch] mlazos/hc-hf -> origin/mlazos/hc-hf 2025-08-14T21:21:50.3806041Z * [new branch] mlazos/hc-mut -> origin/mlazos/hc-mut 2025-08-14T21:21:50.3806181Z * [new branch] mlazos/hc10 -> origin/mlazos/hc10 2025-08-14T21:21:50.3806436Z * [new branch] mlazos/hc11 -> origin/mlazos/hc11 2025-08-14T21:21:50.3806570Z * [new branch] mlazos/hc12 -> origin/mlazos/hc12 2025-08-14T21:21:50.3806691Z * [new branch] mlazos/hc13 -> origin/mlazos/hc13 2025-08-14T21:21:50.3813740Z * [new branch] mlazos/hc14 -> origin/mlazos/hc14 2025-08-14T21:21:50.3813911Z * [new branch] mlazos/hc15 -> origin/mlazos/hc15 2025-08-14T21:21:50.3814040Z * [new branch] mlazos/hc2 -> origin/mlazos/hc2 2025-08-14T21:21:50.3814169Z * [new branch] mlazos/hc4 -> origin/mlazos/hc4 2025-08-14T21:21:50.3814290Z * [new branch] mlazos/hc5 -> origin/mlazos/hc5 2025-08-14T21:21:50.3814409Z * [new branch] mlazos/hc6 -> origin/mlazos/hc6 2025-08-14T21:21:50.3814556Z * [new branch] mlazos/hc7 -> origin/mlazos/hc7 2025-08-14T21:21:50.3814682Z * [new branch] mlazos/hc8 -> origin/mlazos/hc8 2025-08-14T21:21:50.3814808Z * [new branch] mlazos/hc9 -> origin/mlazos/hc9 2025-08-14T21:21:50.3814960Z * [new branch] mlazos/hc_baseline2 -> origin/mlazos/hc_baseline2 2025-08-14T21:21:50.3815110Z * [new branch] mlazos/hop-modes -> origin/mlazos/hop-modes 
2025-08-14T21:21:50.3820019Z * [new branch] mlazos/init-per-param -> origin/mlazos/init-per-param 2025-08-14T21:21:50.3820584Z * [new branch] mlazos/init_per_param -> origin/mlazos/init_per_param 2025-08-14T21:21:50.3820769Z * [new branch] mlazos/less-guards -> origin/mlazos/less-guards 2025-08-14T21:21:50.3820952Z * [new branch] mlazos/lr-composibility -> origin/mlazos/lr-composibility 2025-08-14T21:21:50.3821109Z * [new branch] mlazos/main -> origin/mlazos/main 2025-08-14T21:21:50.3821296Z * [new branch] mlazos/main-test-enablement -> origin/mlazos/main-test-enablement 2025-08-14T21:21:50.3821427Z * [new branch] mlazos/main2 -> origin/mlazos/main2 2025-08-14T21:21:50.3821561Z * [new branch] mlazos/mcg -> origin/mlazos/mcg 2025-08-14T21:21:50.3821686Z * [new branch] mlazos/mcg2 -> origin/mlazos/mcg2 2025-08-14T21:21:50.3821852Z * [new branch] mlazos/meta-guards -> origin/mlazos/meta-guards 2025-08-14T21:21:50.3827071Z * [new branch] mlazos/mlazos/ck2 -> origin/mlazos/mlazos/ck2 2025-08-14T21:21:50.3827325Z * [new branch] mlazos/mlazos/foreach-map-adam -> origin/mlazos/mlazos/foreach-map-adam 2025-08-14T21:21:50.3827511Z * [new branch] mlazos/mlazos/tf-mode-backup -> origin/mlazos/mlazos/tf-mode-backup 2025-08-14T21:21:50.3827675Z * [new branch] mlazos/mod-fix -> origin/mlazos/mod-fix 2025-08-14T21:21:50.3828015Z * [new branch] mlazos/mode-fix -> origin/mlazos/mode-fix 2025-08-14T21:21:50.3828163Z * [new branch] mlazos/more-tests -> origin/mlazos/more-tests 2025-08-14T21:21:50.3828356Z * [new branch] mlazos/nested-dc -> origin/mlazos/nested-dc 2025-08-14T21:21:50.3828496Z * [new branch] mlazos/no-cpp -> origin/mlazos/no-cpp 2025-08-14T21:21:50.3828728Z * [new branch] mlazos/no-init-group-handling -> origin/mlazos/no-init-group-handling 2025-08-14T21:21:50.3831181Z * [new branch] mlazos/offsets -> origin/mlazos/offsets 2025-08-14T21:21:50.3831382Z * [new branch] mlazos/opt-bench-exp2 -> origin/mlazos/opt-bench-exp2 2025-08-14T21:21:50.3832865Z * [new branch] mlazos/opt-incr -> origin/mlazos/opt-incr 2025-08-14T21:21:50.3833227Z * [new branch] mlazos/proxy-ctors -> origin/mlazos/proxy-ctors 2025-08-14T21:21:50.3833415Z * [new branch] mlazos/proxy-opt -> origin/mlazos/proxy-opt 2025-08-14T21:21:50.3833554Z * [new branch] mlazos/quant-fix -> origin/mlazos/quant-fix 2025-08-14T21:21:50.3833707Z * [new branch] mlazos/rm-buf-names -> origin/mlazos/rm-buf-names 2025-08-14T21:21:50.3833855Z * [new branch] mlazos/rm-spam -> origin/mlazos/rm-spam 2025-08-14T21:21:50.3835889Z * [new branch] mlazos/rtp -> origin/mlazos/rtp 2025-08-14T21:21:50.3836224Z * [new branch] mlazos/static-idx-dbg -> origin/mlazos/static-idx-dbg 2025-08-14T21:21:50.3836406Z * [new branch] mlazos/static-inputs-log -> origin/mlazos/static-inputs-log 2025-08-14T21:21:50.3836592Z * [new branch] mlazos/sub-param-fix -> origin/mlazos/sub-param-fix 2025-08-14T21:21:50.3836765Z * [new branch] mlazos/td-fix2 -> origin/mlazos/td-fix2 2025-08-14T21:21:50.3836963Z * [new branch] mlazos/tensor-hasattr2 -> origin/mlazos/tensor-hasattr2 2025-08-14T21:21:50.3837091Z * [new branch] mlazos/test -> origin/mlazos/test 2025-08-14T21:21:50.3837227Z * [new branch] mlazos/tf-mode -> origin/mlazos/tf-mode 2025-08-14T21:21:50.3837386Z * [new branch] mlazos/tf-mode-backup2 -> origin/mlazos/tf-mode-backup2 2025-08-14T21:21:50.3841576Z * [new branch] mlazos/tf-mode-reland -> origin/mlazos/tf-mode-reland 2025-08-14T21:21:50.3841913Z * [new branch] mlazos/tf-mode-reland2 -> origin/mlazos/tf-mode-reland2 2025-08-14T21:21:50.3842119Z * [new branch] 
mlazos/tf-mode-reland3 -> origin/mlazos/tf-mode-reland3 2025-08-14T21:21:50.3842270Z * [new branch] mlazos/topo-fix -> origin/mlazos/topo-fix 2025-08-14T21:21:50.3842473Z * [new branch] mlazos/triton-no-epi -> origin/mlazos/triton-no-epi 2025-08-14T21:21:50.3842629Z * [new branch] mlazos/tune-proto -> origin/mlazos/tune-proto 2025-08-14T21:21:50.3842782Z * [new branch] mlazos/tuple-fixes -> origin/mlazos/tuple-fixes 2025-08-14T21:21:50.3842929Z * [new branch] mlazos/tuple-fixes2 -> origin/mlazos/tuple-fixes2 2025-08-14T21:21:50.3843082Z * [new branch] mlazos/tuple-handling -> origin/mlazos/tuple-handling 2025-08-14T21:21:50.3843236Z * [new branch] mlazos/user-streams -> origin/mlazos/user-streams 2025-08-14T21:21:50.3843507Z * [new branch] mlazos/vary-beta -> origin/mlazos/vary-beta 2025-08-14T21:21:50.3846131Z * [new branch] mlazos/vary-beta2 -> origin/mlazos/vary-beta2 2025-08-14T21:21:50.3846318Z * [new branch] mlazos/weird-perf1 -> origin/mlazos/weird-perf1 2025-08-14T21:21:50.3846978Z * [new branch] mm_out_dtype_compile -> origin/mm_out_dtype_compile 2025-08-14T21:21:50.3847167Z * [new branch] modify-setupvllm -> origin/modify-setupvllm 2025-08-14T21:21:50.3848468Z * [new branch] move-theme-out-docker -> origin/move-theme-out-docker 2025-08-14T21:21:50.3848922Z * [new branch] mps-linear-1d -> origin/mps-linear-1d 2025-08-14T21:21:50.3849402Z * [new branch] msaroufim/be1 -> origin/msaroufim/be1 2025-08-14T21:21:50.3850476Z * [new branch] msaroufim/cn_path -> origin/msaroufim/cn_path 2025-08-14T21:21:50.3850739Z * [new branch] msaroufim/dtensorfusedadam -> origin/msaroufim/dtensorfusedadam 2025-08-14T21:21:50.3852091Z * [new branch] msaroufim/reduce -> origin/msaroufim/reduce 2025-08-14T21:21:50.3852968Z * [new branch] mtia/basic-cmake -> origin/mtia/basic-cmake 2025-08-14T21:21:50.3853518Z * [new branch] muon_dev -> origin/muon_dev 2025-08-14T21:21:50.3855055Z * [new branch] new-modifiy-setupvllm -> origin/new-modifiy-setupvllm 2025-08-14T21:21:50.3855683Z * [new branch] new-setupvllm -> origin/new-setupvllm 2025-08-14T21:21:50.3855847Z * [new branch] newtest-base -> origin/newtest-base 2025-08-14T21:21:50.3856908Z * [new branch] ngimel/cat_perf -> origin/ngimel/cat_perf 2025-08-14T21:21:50.3857190Z * [new branch] ngimel/cudamoduleload -> origin/ngimel/cudamoduleload 2025-08-14T21:21:50.3858633Z * [new branch] ngimel/fabric_driver_version -> origin/ngimel/fabric_driver_version 2025-08-14T21:21:50.3858928Z * [new branch] ngimel/fabric_symm -> origin/ngimel/fabric_symm 2025-08-14T21:21:50.3860101Z * [new branch] ngimel/gg_new -> origin/ngimel/gg_new 2025-08-14T21:21:50.3860323Z * [new branch] ngimel/grouped_mm_checks -> origin/ngimel/grouped_mm_checks 2025-08-14T21:21:50.3861755Z * [new branch] ngimel/guardfabric -> origin/ngimel/guardfabric 2025-08-14T21:21:50.3862023Z * [new branch] ngimel/index_None -> origin/ngimel/index_None 2025-08-14T21:21:50.3862339Z * [new branch] ngimel/modeguard -> origin/ngimel/modeguard 2025-08-14T21:21:50.3863471Z * [new branch] ngimel/multicast_fix -> origin/ngimel/multicast_fix 2025-08-14T21:21:50.3865084Z * [new branch] ngimel/unbind_multimem -> origin/ngimel/unbind_multimem 2025-08-14T21:21:50.3865616Z * [new branch] nightly -> origin/nightly 2025-08-14T21:21:50.3865787Z * [new branch] nmacchioni-patch-10 -> origin/nmacchioni-patch-10 2025-08-14T21:21:50.3870200Z * [new branch] nmacchioni-patch-7 -> origin/nmacchioni-patch-7 2025-08-14T21:21:50.3870412Z * [new branch] nmacchioni-patch-8 -> origin/nmacchioni-patch-8 2025-08-14T21:21:50.3870569Z * [new 
branch] nmacchioni-patch-9 -> origin/nmacchioni-patch-9 2025-08-14T21:21:50.3870727Z * [new branch] nullplay_fuse_matmul -> origin/nullplay_fuse_matmul 2025-08-14T21:21:50.3870990Z * [new branch] nweidia/enable-B200-inductor-nightly-ci -> origin/nweidia/enable-B200-inductor-nightly-ci 2025-08-14T21:21:50.3871119Z * [new branch] one-off -> origin/one-off 2025-08-14T21:21:50.3877304Z * [new branch] orig/release/1.10 -> origin/orig/release/1.10 2025-08-14T21:21:50.3877985Z * [new branch] orig/release/1.11 -> origin/orig/release/1.11 2025-08-14T21:21:50.3878166Z * [new branch] orig/release/1.12 -> origin/orig/release/1.12 2025-08-14T21:21:50.3878517Z * [new branch] orig/release/1.13 -> origin/orig/release/1.13 2025-08-14T21:21:50.3878839Z * [new branch] orig/release/1.6 -> origin/orig/release/1.6 2025-08-14T21:21:50.3879851Z * [new branch] orig/release/1.7 -> origin/orig/release/1.7 2025-08-14T21:21:50.3879997Z * [new branch] orig/release/1.8 -> origin/orig/release/1.8 2025-08-14T21:21:50.3880124Z * [new branch] orig/release/1.9 -> origin/orig/release/1.9 2025-08-14T21:21:50.3880252Z * [new branch] orig/release/2.0 -> origin/orig/release/2.0 2025-08-14T21:21:50.3880391Z * [new branch] orig/release/2.1 -> origin/orig/release/2.1 2025-08-14T21:21:50.3880518Z * [new branch] orig/release/2.2 -> origin/orig/release/2.2 2025-08-14T21:21:50.3880652Z * [new branch] orig/release/2.3 -> origin/orig/release/2.3 2025-08-14T21:21:50.3880780Z * [new branch] orig/release/2.4 -> origin/orig/release/2.4 2025-08-14T21:21:50.3884234Z * [new branch] orig/release/2.5 -> origin/orig/release/2.5 2025-08-14T21:21:50.3884413Z * [new branch] orig/release/2.6 -> origin/orig/release/2.6 2025-08-14T21:21:50.3884551Z * [new branch] orig/release/2.7 -> origin/orig/release/2.7 2025-08-14T21:21:50.3884689Z * [new branch] orig/release/2.8 -> origin/orig/release/2.8 2025-08-14T21:21:50.3884838Z * [new branch] oulgen/fx_graph -> origin/oulgen/fx_graph 2025-08-14T21:21:50.3884990Z * [new branch] padded-tensor -> origin/padded-tensor 2025-08-14T21:21:50.3886397Z * [new branch] parallel_cat -> origin/parallel_cat 2025-08-14T21:21:50.3886760Z * [new branch] pca2 -> origin/pca2 2025-08-14T21:21:50.3887103Z * [new branch] pianpwk-patch-1 -> origin/pianpwk-patch-1 2025-08-14T21:21:50.3889263Z * [new branch] pianpwk/backed_size_oblivious_export -> origin/pianpwk/backed_size_oblivious_export 2025-08-14T21:21:50.3889813Z * [new branch] pianpwk/dde_repeat_cat -> origin/pianpwk/dde_repeat_cat 2025-08-14T21:21:50.3890035Z * [new branch] pianpwk/draft_export_normalize -> origin/pianpwk/draft_export_normalize 2025-08-14T21:21:50.3890241Z * [new branch] pianpwk/dynamic_source_dim -> origin/pianpwk/dynamic_source_dim 2025-08-14T21:21:50.3892823Z * [new branch] pianpwk/invalidate_fake_memo -> origin/pianpwk/invalidate_fake_memo 2025-08-14T21:21:50.3893197Z * [new branch] pianpwk/lru_cache_bound_sympy -> origin/pianpwk/lru_cache_bound_sympy 2025-08-14T21:21:50.3893448Z * [new branch] pianpwk/max_1_strides -> origin/pianpwk/max_1_strides 2025-08-14T21:21:50.3893623Z * [new branch] pianpwk/nonzero_memo -> origin/pianpwk/nonzero_memo 2025-08-14T21:21:50.3894390Z * [new branch] pianpwk/oblivious_reshape_view_better -> origin/pianpwk/oblivious_reshape_view_better 2025-08-14T21:21:50.3898380Z * [new branch] pianpwk/oblivious_should_swap -> origin/pianpwk/oblivious_should_swap 2025-08-14T21:21:50.3898768Z * [new branch] pianpwk/oblivious_slice_forward -> origin/pianpwk/oblivious_slice_forward 2025-08-14T21:21:50.3899059Z * [new branch] 
pianpwk/oblivious_where -> origin/pianpwk/oblivious_where 2025-08-14T21:21:50.3899261Z * [new branch] pianpwk/param_static_pgo -> origin/pianpwk/param_static_pgo 2025-08-14T21:21:50.3899561Z * [new branch] pianpwk/pre_forward_hook -> origin/pianpwk/pre_forward_hook 2025-08-14T21:21:50.3899771Z * [new branch] pianpwk/remove_guard_fail_break -> origin/pianpwk/remove_guard_fail_break 2025-08-14T21:21:50.3899950Z * [new branch] pianpwk/slice_fresh_symbols -> origin/pianpwk/slice_fresh_symbols 2025-08-14T21:21:50.3900259Z * [new branch] pianpwk/sym_sym -> origin/pianpwk/sym_sym 2025-08-14T21:21:50.3901179Z * [new branch] pianpwk/test_slice_fake_impl -> origin/pianpwk/test_slice_fake_impl 2025-08-14T21:21:50.3901563Z * [new branch] pianpwk/unbacked_channels_last -> origin/pianpwk/unbacked_channels_last 2025-08-14T21:21:50.3905346Z * [new branch] pianpwk/unbacked_safe_conv1d -> origin/pianpwk/unbacked_safe_conv1d 2025-08-14T21:21:50.3905685Z * [new branch] pianpwk/unbacked_sdpa_flash -> origin/pianpwk/unbacked_sdpa_flash 2025-08-14T21:21:50.3905894Z * [new branch] pianpwk/unbacked_should_swap -> origin/pianpwk/unbacked_should_swap 2025-08-14T21:21:50.3906087Z * [new branch] pianpwk/unbacked_should_swap_2 -> origin/pianpwk/unbacked_should_swap_2 2025-08-14T21:21:50.3910458Z * [new branch] pianpwk/unbacked_slice_binding -> origin/pianpwk/unbacked_slice_binding 2025-08-14T21:21:50.3910880Z * [new branch] pianpwk/unbacked_slice_forward -> origin/pianpwk/unbacked_slice_forward 2025-08-14T21:21:50.3911068Z * [new branch] pianpwk/verbose_tensor_guards -> origin/pianpwk/verbose_tensor_guards 2025-08-14T21:21:50.3911239Z * [new branch] pianpwk/wan21_reshape -> origin/pianpwk/wan21_reshape 2025-08-14T21:21:50.3911417Z * [new branch] pianpwk/whitelist_optimizer -> origin/pianpwk/whitelist_optimizer 2025-08-14T21:21:50.3911551Z * [new branch] pin-torchao -> origin/pin-torchao 2025-08-14T21:21:50.3911723Z * [new branch] piz/fall_back_missing_0705 -> origin/piz/fall_back_missing_0705 2025-08-14T21:21:50.3911876Z * [new branch] piz/fall_back_missing_0716 -> origin/piz/fall_back_missing_0716 2025-08-14T21:21:50.3912165Z * [new branch] piz/fill_dist_cost_0702-3 -> origin/piz/fill_dist_cost_0702-3 2025-08-14T21:21:50.3918034Z * [new branch] piz/fill_dist_cost_0702-4 -> origin/piz/fill_dist_cost_0702-4 2025-08-14T21:21:50.3920374Z * [new branch] piz/fill_dist_cost_0702-5 -> origin/piz/fill_dist_cost_0702-5 2025-08-14T21:21:50.3920646Z * [new branch] piz/fix_sort_ -> origin/piz/fix_sort_ 2025-08-14T21:21:50.3925384Z * [new branch] piz/improve_scatter_0808 -> origin/piz/improve_scatter_0808 2025-08-14T21:21:50.3925570Z * [new branch] pool-separate -> origin/pool-separate 2025-08-14T21:21:50.3925701Z * [new branch] pr-156087 -> origin/pr-156087 2025-08-14T21:21:50.3925831Z * [new branch] pr/131860 -> origin/pr/131860 2025-08-14T21:21:50.3925977Z * [new branch] predispatch_to -> origin/predispatch_to 2025-08-14T21:21:50.3926114Z * [new branch] pt-opt-cuda3 -> origin/pt-opt-cuda3 2025-08-14T21:21:50.3926296Z * [new branch] pt2e-cache-model-device -> origin/pt2e-cache-model-device 2025-08-14T21:21:50.3926467Z * [new branch] pull-latest-theme -> origin/pull-latest-theme 2025-08-14T21:21:50.3926607Z * [new branch] pyobjectslot -> origin/pyobjectslot 2025-08-14T21:21:50.3926770Z * [new branch] python_compiled_autograd -> origin/python_compiled_autograd 2025-08-14T21:21:50.3926932Z * [new branch] qchip/export-D54134695 -> origin/qchip/export-D54134695 2025-08-14T21:21:50.3927068Z * [new branch] quint-bits -> 
origin/quint-bits 2025-08-14T21:21:50.3927195Z * [new branch] release/1.10 -> origin/release/1.10 2025-08-14T21:21:50.3927337Z * [new branch] release/1.11 -> origin/release/1.11 2025-08-14T21:21:50.3927459Z * [new branch] release/1.12 -> origin/release/1.12 2025-08-14T21:21:50.3927855Z * [new branch] release/1.13 -> origin/release/1.13 2025-08-14T21:21:50.3929115Z * [new branch] release/1.4 -> origin/release/1.4 2025-08-14T21:21:50.3929273Z * [new branch] release/1.4.1 -> origin/release/1.4.1 2025-08-14T21:21:50.3929982Z * [new branch] release/1.5 -> origin/release/1.5 2025-08-14T21:21:50.3933748Z * [new branch] release/1.6 -> origin/release/1.6 2025-08-14T21:21:50.3934061Z * [new branch] release/1.7 -> origin/release/1.7 2025-08-14T21:21:50.3934237Z * [new branch] release/1.8 -> origin/release/1.8 2025-08-14T21:21:50.3934417Z * [new branch] release/1.9 -> origin/release/1.9 2025-08-14T21:21:50.3934563Z * [new branch] release/2.0 -> origin/release/2.0 2025-08-14T21:21:50.3934698Z * [new branch] release/2.1 -> origin/release/2.1 2025-08-14T21:21:50.3942527Z * [new branch] release/2.2 -> origin/release/2.2 2025-08-14T21:21:50.3942780Z * [new branch] release/2.3 -> origin/release/2.3 2025-08-14T21:21:50.3942978Z * [new branch] release/2.4 -> origin/release/2.4 2025-08-14T21:21:50.3943116Z * [new branch] release/2.5 -> origin/release/2.5 2025-08-14T21:21:50.3943279Z * [new branch] release/2.6 -> origin/release/2.6 2025-08-14T21:21:50.3943463Z * [new branch] release/2.7 -> origin/release/2.7 2025-08-14T21:21:50.3945858Z * [new branch] release/2.8 -> origin/release/2.8 2025-08-14T21:21:50.3946191Z * [new branch] release_notes -> origin/release_notes 2025-08-14T21:21:50.3946391Z * [new branch] remove-actionable-label -> origin/remove-actionable-label 2025-08-14T21:21:50.3946647Z * [new branch] remove-ao -> origin/remove-ao 2025-08-14T21:21:50.3946927Z * [new branch] replace-pytorch-labs-20250812-195836 -> origin/replace-pytorch-labs-20250812-195836 2025-08-14T21:21:50.3947153Z * [new branch] replace-pytorch-labs-20250812-200248 -> origin/replace-pytorch-labs-20250812-200248 2025-08-14T21:21:50.3947498Z * [new branch] replace-pytorch-labs-20250812-200324 -> origin/replace-pytorch-labs-20250812-200324 2025-08-14T21:21:50.3947715Z * [new branch] replace-pytorch-labs-20250812-204020 -> origin/replace-pytorch-labs-20250812-204020 2025-08-14T21:21:50.3947929Z * [new branch] replace-pytorch-labs-20250812-204125 -> origin/replace-pytorch-labs-20250812-204125 2025-08-14T21:21:50.3952808Z * [new branch] replace-pytorch-labs-20250812-205624 -> origin/replace-pytorch-labs-20250812-205624 2025-08-14T21:21:50.3953245Z * [new branch] revert-131069-gh/krzysztofjordan/1/head -> origin/revert-131069-gh/krzysztofjordan/1/head 2025-08-14T21:21:50.3953611Z * [new branch] revert-131469-gh/andrewor14/51/head -> origin/revert-131469-gh/andrewor14/51/head 2025-08-14T21:21:50.3953886Z * [new branch] revert-156870-gh/skarjala/3/head -> origin/revert-156870-gh/skarjala/3/head 2025-08-14T21:21:50.3958140Z * [new branch] revert-157914-cherry-pick-157503-by-pytorch_bot_bot_ -> origin/revert-157914-cherry-pick-157503-by-pytorch_bot_bot_ 2025-08-14T21:21:50.3958339Z * [new branch] revert-direct-updates -> origin/revert-direct-updates 2025-08-14T21:21:50.3958497Z * [new branch] rocm-monitoring -> origin/rocm-monitoring 2025-08-14T21:21:50.3958766Z * [new branch] ryanguo99/cleanup-dynamo-expected-failures -> origin/ryanguo99/cleanup-dynamo-expected-failures 2025-08-14T21:21:50.3958942Z * [new branch] ryanguo99/fix-closure-var -> 
origin/ryanguo99/fix-closure-var 2025-08-14T21:21:50.3959110Z * [new branch] rzou/faketensor_bench -> origin/rzou/faketensor_bench 2025-08-14T21:21:50.3959446Z * [new branch] rzou/njt -> origin/rzou/njt 2025-08-14T21:21:50.3959593Z * [new branch] rzou/operator -> origin/rzou/operator 2025-08-14T21:21:50.3959866Z * [new branch] rzou/pca -> origin/rzou/pca 2025-08-14T21:21:50.3963302Z * [new branch] rzou/pipe_split -> origin/rzou/pipe_split 2025-08-14T21:21:50.3963484Z * [new branch] rzou/realprop -> origin/rzou/realprop 2025-08-14T21:21:50.3963928Z * [new branch] rzou/setup_context -> origin/rzou/setup_context 2025-08-14T21:21:50.3964203Z * [new branch] sanchitintel/refactor_aten_int8_woq_gemm -> origin/sanchitintel/refactor_aten_int8_woq_gemm 2025-08-14T21:21:50.3964526Z * [new branch] sanchitintel/weird_thing_with_test_cpu_select_algorithm -> origin/sanchitintel/weird_thing_with_test_cpu_select_algorithm 2025-08-14T21:21:50.3964892Z * [new branch] sapling-pr-archive-SS-JIA -> origin/sapling-pr-archive-SS-JIA 2025-08-14T21:21:50.3965019Z * [new branch] save -> origin/save 2025-08-14T21:21:50.3965161Z * [new branch] sdym/2.5.1 -> origin/sdym/2.5.1 2025-08-14T21:21:50.3965324Z * [new branch] seemethere-patch-1 -> origin/seemethere-patch-1 2025-08-14T21:21:50.3965470Z * [new branch] setup-torchci -> origin/setup-torchci 2025-08-14T21:21:50.3965611Z * [new branch] setupvllm -> origin/setupvllm 2025-08-14T21:21:50.3966240Z * [new branch] share_and_pin_fork -> origin/share_and_pin_fork 2025-08-14T21:21:50.3967420Z * [new branch] shengf/fx-xform-perf -> origin/shengf/fx-xform-perf 2025-08-14T21:21:50.3968341Z * [new branch] shikaili_fp8_allgather -> origin/shikaili_fp8_allgather 2025-08-14T21:21:50.3969111Z * [new branch] shoumikhin-patch-12 -> origin/shoumikhin-patch-12 2025-08-14T21:21:50.3973945Z * [new branch] simplify-fq-per-channel -> origin/simplify-fq-per-channel 2025-08-14T21:21:50.3974122Z * [new branch] solve-accuracy-fix -> origin/solve-accuracy-fix 2025-08-14T21:21:50.3974277Z * [new branch] sqzhang/flight4 -> origin/sqzhang/flight4 2025-08-14T21:21:50.3974437Z * [new branch] sqzhang/flight4plus -> origin/sqzhang/flight4plus 2025-08-14T21:21:50.3974615Z * [new branch] sraikund/record_funct_test -> origin/sraikund/record_funct_test 2025-08-14T21:21:50.3974755Z * [new branch] sraikund16/test -> origin/sraikund16/test 2025-08-14T21:21:50.3975079Z * [new branch] stablize-compilation-time -> origin/stablize-compilation-time 2025-08-14T21:21:50.3976768Z * [new branch] standalone-templates -> origin/standalone-templates 2025-08-14T21:21:50.3977016Z * [new branch] standalone_package_weights -> origin/standalone_package_weights 2025-08-14T21:21:50.3977487Z * [new branch] starterTaskUpdate -> origin/starterTaskUpdate 2025-08-14T21:21:50.3982189Z * [new branch] step2vllmsetup -> origin/step2vllmsetup 2025-08-14T21:21:50.3982506Z * [new branch] subgraph_fuse -> origin/subgraph_fuse 2025-08-14T21:21:50.3982756Z * [new branch] support-uv-in-collect_env -> origin/support-uv-in-collect_env 2025-08-14T21:21:50.3983030Z * [new branch] suryasub/fix-nccl-hang -> origin/suryasub/fix-nccl-hang 2025-08-14T21:21:50.3983174Z * [new branch] sve-poc -> origin/sve-poc 2025-08-14T21:21:50.3987857Z * [new branch] svekars-patch-1 -> origin/svekars-patch-1 2025-08-14T21:21:50.3988202Z * [new branch] svekars-patch-2 -> origin/svekars-patch-2 2025-08-14T21:21:50.3988597Z * [new branch] switch-bn -> origin/switch-bn 2025-08-14T21:21:50.3988909Z * [new branch] sympy-bottleneck-repro -> origin/sympy-bottleneck-repro 
2025-08-14T21:21:50.3989127Z * [new branch] tenpercent/ck_inductor_gfx950 -> origin/tenpercent/ck_inductor_gfx950 2025-08-14T21:21:50.3989294Z * [new branch] tensordict_integration -> origin/tensordict_integration 2025-08-14T21:21:50.3989510Z * [new branch] test-half-migration-internally -> origin/test-half-migration-internally 2025-08-14T21:21:50.3989671Z * [new branch] test-internal-et -> origin/test-internal-et 2025-08-14T21:21:50.3989827Z * [new branch] test-move-conda-builds -> origin/test-move-conda-builds 2025-08-14T21:21:50.3990024Z * [new branch] test-myst-markdown-docstring -> origin/test-myst-markdown-docstring 2025-08-14T21:21:50.3990294Z * [new branch] test-old -> origin/test-old 2025-08-14T21:21:50.3990489Z * [new branch] test-vec-migration-internally -> origin/test-vec-migration-internally 2025-08-14T21:21:50.3995561Z * [new branch] test/bmm_heur -> origin/test/bmm_heur 2025-08-14T21:21:50.3995945Z * [new branch] test/inductor -> origin/test/inductor 2025-08-14T21:21:50.3996112Z * [new branch] tidy_performance_cyy -> origin/tidy_performance_cyy 2025-08-14T21:21:50.3996238Z * [new branch] torchtitan_ep -> origin/torchtitan_ep 2025-08-14T21:21:50.3996397Z * [new branch] trace_fsdp_torchtune_lora -> origin/trace_fsdp_torchtune_lora 2025-08-14T21:21:50.3996561Z * [new branch] traceable_fsdp_unit_tests -> origin/traceable_fsdp_unit_tests 2025-08-14T21:21:50.3996695Z * [new branch] trackMonitor -> origin/trackMonitor 2025-08-14T21:21:50.3996854Z * [new branch] tree_loop_vec_base -> origin/tree_loop_vec_base 2025-08-14T21:21:50.3996995Z * [new branch] tree_vec_base -> origin/tree_vec_base 2025-08-14T21:21:50.3999456Z * [new branch] triton-update -> origin/triton-update 2025-08-14T21:21:50.3999765Z * [new branch] triton_kernel -> origin/triton_kernel 2025-08-14T21:21:50.3999945Z * [new branch] triton_kernel_perf -> origin/triton_kernel_perf 2025-08-14T21:21:50.4000169Z * [new branch] try-runllm -> origin/try-runllm 2025-08-14T21:21:50.4000728Z * [new branch] type_dec -> origin/type_dec 2025-08-14T21:21:50.4004259Z * [new branch] udate-sphinx-dependancies -> origin/udate-sphinx-dependancies 2025-08-14T21:21:50.4004552Z * [new branch] update-audio-commit-hash/16307312222-1661-1 -> origin/update-audio-commit-hash/16307312222-1661-1 2025-08-14T21:21:50.4004830Z * [new branch] update-audio-commit-hash/16431348808-1673-1 -> origin/update-audio-commit-hash/16431348808-1673-1 2025-08-14T21:21:50.4005075Z * [new branch] update-audio-commit-hash/16510774365-1683-1 -> origin/update-audio-commit-hash/16510774365-1683-1 2025-08-14T21:21:50.4005850Z * [new branch] update-audio-commit-hash/16583472358-1693-1 -> origin/update-audio-commit-hash/16583472358-1693-1 2025-08-14T21:21:50.4006089Z * [new branch] update-audio-commit-hash/16663082088-1700-1 -> origin/update-audio-commit-hash/16663082088-1700-1 2025-08-14T21:21:50.4006338Z * [new branch] update-audio-commit-hash/16737365217-1704-1 -> origin/update-audio-commit-hash/16737365217-1704-1 2025-08-14T21:21:50.4007602Z * [new branch] update-audio-commit-hash/16791960928-1711-1 -> origin/update-audio-commit-hash/16791960928-1711-1 2025-08-14T21:21:50.4008245Z * [new branch] update-audio-commit-hash/16818882925-1712-1 -> origin/update-audio-commit-hash/16818882925-1712-1 2025-08-14T21:21:50.4009471Z * [new branch] update-audio-commit-hash/16895560422-1720-1 -> origin/update-audio-commit-hash/16895560422-1720-1 2025-08-14T21:21:50.4009713Z * [new branch] update-audio-commit-hash/16924174496-1738-1 -> origin/update-audio-commit-hash/16924174496-1738-1 
2025-08-14T21:21:50.4010567Z * [new branch] update-dynamic-shapes-doc -> origin/update-dynamic-shapes-doc 2025-08-14T21:21:50.4015585Z * [new branch] update-executorch-commit-hash/15694981040-1626-1 -> origin/update-executorch-commit-hash/15694981040-1626-1 2025-08-14T21:21:50.4015862Z * [new branch] update-triton-commit-hash/13663274526-1487-2 -> origin/update-triton-commit-hash/13663274526-1487-2 2025-08-14T21:21:50.4016110Z * [new branch] update-vision-commit-hash/15336342773-1607-1 -> origin/update-vision-commit-hash/15336342773-1607-1 2025-08-14T21:21:50.4016527Z * [new branch] update-vllm-commit-hash/16431348808-1673-1 -> origin/update-vllm-commit-hash/16431348808-1673-1 2025-08-14T21:21:50.4016745Z * [new branch] update-vllm-commit-hash/16484773233-1682-1 -> origin/update-vllm-commit-hash/16484773233-1682-1 2025-08-14T21:21:50.4016955Z * [new branch] update-vllm-commit-hash/16510774365-1683-1 -> origin/update-vllm-commit-hash/16510774365-1683-1 2025-08-14T21:21:50.4017179Z * [new branch] update-vllm-commit-hash/16534031105-1684-1 -> origin/update-vllm-commit-hash/16534031105-1684-1 2025-08-14T21:21:50.4017387Z * [new branch] update-vllm-commit-hash/16545403308-1687-1 -> origin/update-vllm-commit-hash/16545403308-1687-1 2025-08-14T21:21:50.4017951Z * [new branch] update-vllm-commit-hash/16557202787-1688-1 -> origin/update-vllm-commit-hash/16557202787-1688-1 2025-08-14T21:21:50.4018716Z * [new branch] update-vllm-commit-hash/16583472358-1693-1 -> origin/update-vllm-commit-hash/16583472358-1693-1 2025-08-14T21:21:50.4019508Z * [new branch] update-vllm-commit-hash/16663082088-1700-1 -> origin/update-vllm-commit-hash/16663082088-1700-1 2025-08-14T21:21:50.4020113Z * [new branch] update-vllm-commit-hash/16737365217-1704-1 -> origin/update-vllm-commit-hash/16737365217-1704-1 2025-08-14T21:21:50.4021183Z * [new branch] update-vllm-commit-hash/16843157111-1713-1 -> origin/update-vllm-commit-hash/16843157111-1713-1 2025-08-14T21:21:50.4026202Z * [new branch] update-vllm-commit-hash/16855312394-1714-1 -> origin/update-vllm-commit-hash/16855312394-1714-1 2025-08-14T21:21:50.4031294Z * [new branch] update-vllm-commit-hash/16924174496-1738-1 -> origin/update-vllm-commit-hash/16924174496-1738-1 2025-08-14T21:21:50.4036530Z * [new branch] update-vllm-commit-hash/16952608705-1745-1 -> origin/update-vllm-commit-hash/16952608705-1745-1 2025-08-14T21:21:50.4042309Z * [new branch] update-xla-commit-hash/16260974441-194-1 -> origin/update-xla-commit-hash/16260974441-194-1 2025-08-14T21:21:50.4044194Z * [new branch] update-xla-commit-hash/16717126778-197-1 -> origin/update-xla-commit-hash/16717126778-197-1 2025-08-14T21:21:50.4044422Z * [new branch] update-xla-commit-hash/16873912760-198-1 -> origin/update-xla-commit-hash/16873912760-198-1 2025-08-14T21:21:50.4044678Z * [new branch] update_docs_torch_multinomial_issue#125388 -> origin/update_docs_torch_multinomial_issue#125388 2025-08-14T21:21:50.4044840Z * [new branch] update_executorch_pin -> origin/update_executorch_pin 2025-08-14T21:21:50.4045004Z * [new branch] update_slow_tests_1722488736 -> origin/update_slow_tests_1722488736 2025-08-14T21:21:50.4045165Z * [new branch] update_slow_tests_1722879173 -> origin/update_slow_tests_1722879173 2025-08-14T21:21:50.4045317Z * [new branch] update_slow_tests_1752478971 -> origin/update_slow_tests_1752478971 2025-08-14T21:21:50.4045640Z * [new branch] update_submodule_FBGEMM -> origin/update_submodule_FBGEMM 2025-08-14T21:21:50.4045801Z * [new branch] update_submodule_kineto -> origin/update_submodule_kineto 
2025-08-14T21:21:50.4045975Z * [new branch] update_submodule_tensorpipe -> origin/update_submodule_tensorpipe 2025-08-14T21:21:50.4046108Z * [new branch] v0.1.2 -> origin/v0.1.2 2025-08-14T21:21:50.4046237Z * [new branch] v1.0.1 -> origin/v1.0.1 2025-08-14T21:21:50.4046353Z * [new branch] v1.0.3 -> origin/v1.0.3 2025-08-14T21:21:50.4046472Z * [new branch] v1.1.0 -> origin/v1.1.0 2025-08-14T21:21:50.4046582Z * [new branch] v1.2.0 -> origin/v1.2.0 2025-08-14T21:21:50.4046699Z * [new branch] v1.3.0 -> origin/v1.3.0 2025-08-14T21:21:50.4046856Z * [new branch] v1.3.1 -> origin/v1.3.1 2025-08-14T21:21:50.4046989Z * [new branch] validate_fn -> origin/validate_fn 2025-08-14T21:21:50.4047135Z * [new branch] validations_2.6 -> origin/validations_2.6 2025-08-14T21:21:50.4047272Z * [new branch] validations_2.8 -> origin/validations_2.8 2025-08-14T21:21:50.4047406Z * [new branch] viable/strict -> origin/viable/strict 2025-08-14T21:21:50.4047544Z * [new branch] vllmbuildci -> origin/vllmbuildci 2025-08-14T21:21:50.4047677Z * [new branch] vllmpin -> origin/vllmpin 2025-08-14T21:21:50.4047812Z * [new branch] vllmpintest -> origin/vllmpintest 2025-08-14T21:21:50.4047956Z * [new branch] wdvr-patch-1 -> origin/wdvr-patch-1 2025-08-14T21:21:50.4048087Z * [new branch] wdvr-patch-2 -> origin/wdvr-patch-2 2025-08-14T21:21:50.4048263Z * [new branch] wdvr/conda_devcontainer -> origin/wdvr/conda_devcontainer 2025-08-14T21:21:50.4048418Z * [new branch] wdvr/fix_logging_test -> origin/wdvr/fix_logging_test 2025-08-14T21:21:50.4048558Z * [new branch] wdvr/iss_145259 -> origin/wdvr/iss_145259 2025-08-14T21:21:50.4048907Z * [new branch] weight_sharing_cpp -> origin/weight_sharing_cpp 2025-08-14T21:21:50.4049430Z * [new branch] whc/flight -> origin/whc/flight 2025-08-14T21:21:50.4049571Z * [new branch] whc/flight4 -> origin/whc/flight4 2025-08-14T21:21:50.4057557Z * [new branch] whc/flight51 -> origin/whc/flight51 2025-08-14T21:21:50.4057807Z * [new branch] whc/flight53 -> origin/whc/flight53 2025-08-14T21:21:50.4058019Z * [new branch] whc/p2phang -> origin/whc/p2phang 2025-08-14T21:21:50.4058168Z * [new branch] whc/stage2 -> origin/whc/stage2 2025-08-14T21:21:50.4058372Z * [new branch] whc/uneven -> origin/whc/uneven 2025-08-14T21:21:50.4063064Z * [new branch] whc/uneven-merge -> origin/whc/uneven-merge 2025-08-14T21:21:50.4063692Z * [new branch] win_warnings -> origin/win_warnings 2025-08-14T21:21:50.4063985Z * [new branch] workonoldcommit -> origin/workonoldcommit 2025-08-14T21:21:50.4070932Z * [new branch] wwen/programming-model-2.8 -> origin/wwen/programming-model-2.8 2025-08-14T21:21:50.4075940Z * [new branch] xmfan/ca_0516 -> origin/xmfan/ca_0516 2025-08-14T21:21:50.4080355Z * [new branch] xmfan/ca_1051b93192 -> origin/xmfan/ca_1051b93192 2025-08-14T21:21:50.4082329Z * [new branch] xmfan/ca_1a722f62c248391fc4a542e8851a5559aa356ae8 -> origin/xmfan/ca_1a722f62c248391fc4a542e8851a5559aa356ae8 2025-08-14T21:21:50.4082702Z * [new branch] xmfan/ca_5a2be192d1 -> origin/xmfan/ca_5a2be192d1 2025-08-14T21:21:50.4082859Z * [new branch] xmfan/ca_9d59b516e9 -> origin/xmfan/ca_9d59b516e9 2025-08-14T21:21:50.4082996Z * [new branch] xmfan/ca_api -> origin/xmfan/ca_api 2025-08-14T21:21:50.4083130Z * [new branch] xmfan/ca_apr8 -> origin/xmfan/ca_apr8 2025-08-14T21:21:50.4083262Z * [new branch] xmfan/ca_base -> origin/xmfan/ca_base 2025-08-14T21:21:50.4083408Z * [new branch] xmfan/ca_cudagraphs -> origin/xmfan/ca_cudagraphs 2025-08-14T21:21:50.4083548Z * [new branch] xmfan/ca_dynamic -> origin/xmfan/ca_dynamic 
2025-08-14T21:21:50.4083689Z * [new branch] xmfan/ca_fix_dyn -> origin/xmfan/ca_fix_dyn 2025-08-14T21:21:50.4083835Z * [new branch] xmfan/ca_fix_lowering -> origin/xmfan/ca_fix_lowering 2025-08-14T21:21:50.4084042Z * [new branch] xmfan/ca_fix_polyfills -> origin/xmfan/ca_fix_polyfills 2025-08-14T21:21:50.4084166Z * [new branch] xmfan/ca_jan3 -> origin/xmfan/ca_jan3 2025-08-14T21:21:50.4084294Z * [new branch] xmfan/ca_jun18 -> origin/xmfan/ca_jun18 2025-08-14T21:21:50.4084424Z * [new branch] xmfan/ca_jun24 -> origin/xmfan/ca_jun24 2025-08-14T21:21:50.4084555Z * [new branch] xmfan/ca_mem_base -> origin/xmfan/ca_mem_base 2025-08-14T21:21:50.4084694Z * [new branch] xmfan/ca_mem_fix -> origin/xmfan/ca_mem_fix 2025-08-14T21:21:50.4084831Z * [new branch] xmfan/ca_memory_fix -> origin/xmfan/ca_memory_fix 2025-08-14T21:21:50.4084991Z * [new branch] xmfan/ca_memory_fix_rebased -> origin/xmfan/ca_memory_fix_rebased 2025-08-14T21:21:50.4085173Z * [new branch] xmfan/ca_memory_fix_rebased2 -> origin/xmfan/ca_memory_fix_rebased2 2025-08-14T21:21:50.4085315Z * [new branch] xmfan/ca_move_to_cuda -> origin/xmfan/ca_move_to_cuda 2025-08-14T21:21:50.4085456Z * [new branch] xmfan/ca_nested -> origin/xmfan/ca_nested 2025-08-14T21:21:50.4085621Z * [new branch] xmfan/ca_overhead -> origin/xmfan/ca_overhead 2025-08-14T21:21:50.4085783Z * [new branch] xmfan/ca_overhead_0eba7e5451 -> origin/xmfan/ca_overhead_0eba7e5451 2025-08-14T21:21:50.4085945Z * [new branch] xmfan/ca_scalar -> origin/xmfan/ca_scalar 2025-08-14T21:21:50.4086099Z * [new branch] xmfan/ca_subclass_mem_fix -> origin/xmfan/ca_subclass_mem_fix 2025-08-14T21:21:50.4086239Z * [new branch] xmfan/ca_warm_mem -> origin/xmfan/ca_warm_mem 2025-08-14T21:21:50.4086380Z * [new branch] xmfan/ca_warm_mem_base -> origin/xmfan/ca_warm_mem_base 2025-08-14T21:21:50.4086517Z * [new branch] xmfan/cacu_jun18 -> origin/xmfan/cacu_jun18 2025-08-14T21:21:50.4086654Z * [new branch] xmfan/cacu_jun19 -> origin/xmfan/cacu_jun19 2025-08-14T21:21:50.4086782Z * [new branch] xmfan/cacu_jun4 -> origin/xmfan/cacu_jun4 2025-08-14T21:21:50.4086914Z * [new branch] xmfan/cacu_may27 -> origin/xmfan/cacu_may27 2025-08-14T21:21:50.4087055Z * [new branch] xmfan/circular_dep -> origin/xmfan/circular_dep 2025-08-14T21:21:50.4087232Z * [new branch] xmfan/compiled_autograd_feb_29 -> origin/xmfan/compiled_autograd_feb_29 2025-08-14T21:21:50.4087445Z * [new branch] xmfan/compiled_autograd_graph_breaks -> origin/xmfan/compiled_autograd_graph_breaks 2025-08-14T21:21:50.4087602Z * [new branch] xmfan/disable_duck_shape -> origin/xmfan/disable_duck_shape 2025-08-14T21:21:50.4087794Z * [new branch] xmfan/fca_cpp_node_passthrough -> origin/xmfan/fca_cpp_node_passthrough 2025-08-14T21:21:50.4087972Z * [new branch] xmfan/issue_123374 -> origin/xmfan/issue_123374 2025-08-14T21:21:50.4088247Z * [new branch] xmfan/post_3945954741e2d37023c5d6954f9483008e0892f9 -> origin/xmfan/post_3945954741e2d37023c5d6954f9483008e0892f9 2025-08-14T21:21:50.4088524Z * [new branch] xmfan/pre_3945954741e2d37023c5d6954f9483008e0892f9 -> origin/xmfan/pre_3945954741e2d37023c5d6954f9483008e0892f9 2025-08-14T21:21:50.4088977Z * [new branch] xmfan/segfault_test -> origin/xmfan/segfault_test 2025-08-14T21:21:50.4089143Z * [new branch] xmfan/single_step -> origin/xmfan/single_step 2025-08-14T21:21:50.4089271Z * [new branch] xmfan/sth_0829 -> origin/xmfan/sth_0829 2025-08-14T21:21:50.4089399Z * [new branch] xmfan/test -> origin/xmfan/test 2025-08-14T21:21:50.4089602Z * [new branch] y-do-we-have-7-build-systems -> 
origin/y-do-we-have-7-build-systems 2025-08-14T21:21:50.4089821Z * [new branch] yguo/debug-0226-constexpr -> origin/yguo/debug-0226-constexpr 2025-08-14T21:21:50.4089977Z * [new branch] yguo/new_latest_changes -> origin/yguo/new_latest_changes 2025-08-14T21:21:50.4090177Z * [new branch] yguo/patch_constexpr_changes -> origin/yguo/patch_constexpr_changes 2025-08-14T21:21:50.4090331Z * [new branch] yihan_quantization -> origin/yihan_quantization 2025-08-14T21:21:50.4090520Z * [new branch] yiming/add_nativert_benchmark -> origin/yiming/add_nativert_benchmark 2025-08-14T21:21:50.4090654Z * [new branch] yiming/bootcamp -> origin/yiming/bootcamp 2025-08-14T21:21:50.4090800Z * [new branch] zainr/canary-test -> origin/zainr/canary-test 2025-08-14T21:21:50.4091151Z * [new branch] zainr/cleanup-gh-runners -> origin/zainr/cleanup-gh-runners 2025-08-14T21:21:50.4091355Z * [new branch] zainr/fixlint -> origin/zainr/fixlint 2025-08-14T21:21:50.4094280Z * [new branch] zainr/git-push-v2 -> origin/zainr/git-push-v2 2025-08-14T21:21:50.4094468Z * [new branch] zainr/lint-py3.9 -> origin/zainr/lint-py3.9 2025-08-14T21:21:50.4094639Z * [new branch] zainr/mypy15-claude -> origin/zainr/mypy15-claude 2025-08-14T21:21:50.4094803Z * [new branch] zainr/pre-push-hooks -> origin/zainr/pre-push-hooks 2025-08-14T21:21:50.4095125Z * [new branch] zainr/pull-migration-c -> origin/zainr/pull-migration-c 2025-08-14T21:21:50.4095573Z * [new branch] zainr/test2 -> origin/zainr/test2 2025-08-14T21:21:50.4097118Z * [new branch] zainr/unstable -> origin/zainr/unstable 2025-08-14T21:21:50.4097513Z * [new branch] zainr/unstable-xla -> origin/zainr/unstable-xla 2025-08-14T21:21:50.4098025Z * [new branch] zainr/uv-pip-fix -> origin/zainr/uv-pip-fix 2025-08-14T21:21:50.4098841Z * [new branch] zainr/vs-aarch64 -> origin/zainr/vs-aarch64 2025-08-14T21:21:50.4101952Z * [new branch] zasdfgbnm-patch-3 -> origin/zasdfgbnm-patch-3 2025-08-14T21:21:50.4102113Z * [new branch] zb2p -> origin/zb2p 2025-08-14T21:21:50.4102268Z * [new branch] zdevito-patch-1 -> origin/zdevito-patch-1 2025-08-14T21:21:50.4102450Z * [new branch] zeros-and-scatter-part2 -> origin/zeros-and-scatter-part2 2025-08-14T21:21:50.4111347Z * [new branch] zhxchen17/nativert/0 -> origin/zhxchen17/nativert/0 2025-08-14T21:21:50.4111539Z * [new branch] zhxchen17/scratch/0 -> origin/zhxchen17/scratch/0 2025-08-14T21:21:50.4111751Z * [new branch] zhxhcen17/moodycamel -> origin/zhxhcen17/moodycamel 2025-08-14T21:21:50.4111923Z * [new branch] zxiiro/bazel -> origin/zxiiro/bazel 2025-08-14T21:21:50.4116733Z * [new branch] zxiiro/get-hardware -> origin/zxiiro/get-hardware 2025-08-14T21:21:50.4116910Z * [new branch] zxiiro/main -> origin/zxiiro/main 2025-08-14T21:21:50.4117042Z * [new branch] zxiiro/test -> origin/zxiiro/test 2025-08-14T21:21:50.4117369Z * [new tag] bc2caa7fdf006894eff7af936babde69ab5a40f8-huydhn-debug -> bc2caa7fdf006894eff7af936babde69ab5a40f8-huydhn-debug 2025-08-14T21:21:50.4117492Z * [new tag] ci/binaries/77164 -> ci/binaries/77164 2025-08-14T21:21:50.4119031Z * [new tag] ciflow/binaries/138996 -> ciflow/binaries/138996 2025-08-14T21:21:50.4119177Z * [new tag] ciflow/binaries/143959 -> ciflow/binaries/143959 2025-08-14T21:21:50.4119303Z * [new tag] ciflow/binaries/154595 -> ciflow/binaries/154595 2025-08-14T21:21:50.4119575Z * [new tag] ciflow/binaries/156049 -> ciflow/binaries/156049 2025-08-14T21:21:50.4119701Z * [new tag] ciflow/binaries/156712 -> ciflow/binaries/156712 2025-08-14T21:21:50.4119831Z * [new tag] ciflow/binaries/157432 -> ciflow/binaries/157432 
2025-08-14T21:21:50.4119955Z * [new tag] ciflow/binaries/157685 -> ciflow/binaries/157685 2025-08-14T21:21:50.4120073Z * [new tag] ciflow/binaries/157689 -> ciflow/binaries/157689 2025-08-14T21:21:50.4120201Z * [new tag] ciflow/binaries/158104 -> ciflow/binaries/158104 2025-08-14T21:21:50.4122037Z * [new tag] ciflow/binaries/158623 -> ciflow/binaries/158623 2025-08-14T21:21:50.4122179Z * [new tag] ciflow/binaries/159827 -> ciflow/binaries/159827 2025-08-14T21:21:50.4122306Z * [new tag] ciflow/binaries/159869 -> ciflow/binaries/159869 2025-08-14T21:21:50.4122442Z * [new tag] ciflow/binaries/160593 -> ciflow/binaries/160593 2025-08-14T21:21:50.4122617Z * [new tag] ciflow/binaries_libtorch/143959 -> ciflow/binaries_libtorch/143959 2025-08-14T21:21:50.4122775Z * [new tag] ciflow/binaries_libtorch/156049 -> ciflow/binaries_libtorch/156049 2025-08-14T21:21:50.4122942Z * [new tag] ciflow/binaries_libtorch/157432 -> ciflow/binaries_libtorch/157432 2025-08-14T21:21:50.4123109Z * [new tag] ciflow/binaries_wheel/143959 -> ciflow/binaries_wheel/143959 2025-08-14T21:21:50.4123477Z * [new tag] ciflow/binaries_wheel/156049 -> ciflow/binaries_wheel/156049 2025-08-14T21:21:50.4124173Z * [new tag] ciflow/binaries_wheel/157432 -> ciflow/binaries_wheel/157432 2025-08-14T21:21:50.4124491Z * [new tag] ciflow/binaries_wheel/158733 -> ciflow/binaries_wheel/158733 2025-08-14T21:21:50.4125086Z * [new tag] ciflow/binaries_wheel/160301 -> ciflow/binaries_wheel/160301 2025-08-14T21:21:50.4126367Z * [new tag] ciflow/binaries_wheel/160496 -> ciflow/binaries_wheel/160496 2025-08-14T21:21:50.4126973Z * [new tag] ciflow/h100-distributed/156703 -> ciflow/h100-distributed/156703 2025-08-14T21:21:50.4127161Z * [new tag] ciflow/h100-symm-mem/151845 -> ciflow/h100-symm-mem/151845 2025-08-14T21:21:50.4127340Z * [new tag] ciflow/h100-symm-mem/155923 -> ciflow/h100-symm-mem/155923 2025-08-14T21:21:50.4127861Z * [new tag] ciflow/h100-symm-mem/157635 -> ciflow/h100-symm-mem/157635 2025-08-14T21:21:50.4134360Z * [new tag] ciflow/h100-symm-mem/159118 -> ciflow/h100-symm-mem/159118 2025-08-14T21:21:50.4134556Z * [new tag] ciflow/h100-symm-mem/159562 -> ciflow/h100-symm-mem/159562 2025-08-14T21:21:50.4134714Z * [new tag] ciflow/h100-symm-mem/159889 -> ciflow/h100-symm-mem/159889 2025-08-14T21:21:50.4134946Z * [new tag] ciflow/h100/159158 -> ciflow/h100/159158 2025-08-14T21:21:50.4135228Z * [new tag] ciflow/h100/160450 -> ciflow/h100/160450 2025-08-14T21:21:50.4135341Z * [new tag] ciflow/h100/160480 -> ciflow/h100/160480 2025-08-14T21:21:50.4135447Z * [new tag] ciflow/h100/160614 -> ciflow/h100/160614 2025-08-14T21:21:50.4135861Z * [new tag] ciflow/inductor-perf-test-nightly-rocm/151845 -> ciflow/inductor-perf-test-nightly-rocm/151845 2025-08-14T21:21:50.4136106Z * [new tag] ciflow/inductor-perf-test-nightly-rocm/160538 -> ciflow/inductor-perf-test-nightly-rocm/160538 2025-08-14T21:21:50.4136385Z * [new tag] ciflow/inductor-perf-test-nightly-x86-zen/156599 -> ciflow/inductor-perf-test-nightly-x86-zen/156599 2025-08-14T21:21:50.4136557Z * [new tag] ciflow/inductor-periodic/160406 -> ciflow/inductor-periodic/160406 2025-08-14T21:21:50.4136775Z * [new tag] ciflow/inductor-periodic/160538 -> ciflow/inductor-periodic/160538 2025-08-14T21:21:50.4136919Z * [new tag] ciflow/inductor-rocm/151845 -> ciflow/inductor-rocm/151845 2025-08-14T21:21:50.4137058Z * [new tag] ciflow/inductor-rocm/159158 -> ciflow/inductor-rocm/159158 2025-08-14T21:21:50.4137200Z * [new tag] ciflow/inductor-rocm/160073 -> ciflow/inductor-rocm/160073 
2025-08-14T21:21:50.4137794Z * [new tag] ciflow/inductor-rocm/160538 -> ciflow/inductor-rocm/160538 2025-08-14T21:21:50.4138326Z * [new tag] ciflow/inductor/134881 -> ciflow/inductor/134881 2025-08-14T21:21:50.4138705Z * [new tag] ciflow/inductor/137400 -> ciflow/inductor/137400 2025-08-14T21:21:50.4139837Z * [new tag] ciflow/inductor/144516 -> ciflow/inductor/144516 2025-08-14T21:21:50.4140057Z * [new tag] ciflow/inductor/146506 -> ciflow/inductor/146506 2025-08-14T21:21:50.4140276Z * [new tag] ciflow/inductor/147360 -> ciflow/inductor/147360 2025-08-14T21:21:50.4140581Z * [new tag] ciflow/inductor/147990 -> ciflow/inductor/147990 2025-08-14T21:21:50.4141020Z * [new tag] ciflow/inductor/148180 -> ciflow/inductor/148180 2025-08-14T21:21:50.4141487Z * [new tag] ciflow/inductor/148328 -> ciflow/inductor/148328 2025-08-14T21:21:50.4143288Z * [new tag] ciflow/inductor/148484 -> ciflow/inductor/148484 2025-08-14T21:21:50.4143438Z * [new tag] ciflow/inductor/148492 -> ciflow/inductor/148492 2025-08-14T21:21:50.4143560Z * [new tag] ciflow/inductor/150302 -> ciflow/inductor/150302 2025-08-14T21:21:50.4147084Z * [new tag] ciflow/inductor/151845 -> ciflow/inductor/151845 2025-08-14T21:21:50.4147246Z * [new tag] ciflow/inductor/152198 -> ciflow/inductor/152198 2025-08-14T21:21:50.4147420Z * [new tag] ciflow/inductor/152624 -> ciflow/inductor/152624 2025-08-14T21:21:50.4147638Z * [new tag] ciflow/inductor/153966 -> ciflow/inductor/153966 2025-08-14T21:21:50.4147775Z * [new tag] ciflow/inductor/154193 -> ciflow/inductor/154193 2025-08-14T21:21:50.4147909Z * [new tag] ciflow/inductor/154650 -> ciflow/inductor/154650 2025-08-14T21:21:50.4148034Z * [new tag] ciflow/inductor/154694 -> ciflow/inductor/154694 2025-08-14T21:21:50.4153478Z * [new tag] ciflow/inductor/155072 -> ciflow/inductor/155072 2025-08-14T21:21:50.4153656Z * [new tag] ciflow/inductor/155152 -> ciflow/inductor/155152 2025-08-14T21:21:50.4153789Z * [new tag] ciflow/inductor/155153 -> ciflow/inductor/155153 2025-08-14T21:21:50.4153937Z * [new tag] ciflow/inductor/155154 -> ciflow/inductor/155154 2025-08-14T21:21:50.4154102Z * [new tag] ciflow/inductor/155501 -> ciflow/inductor/155501 2025-08-14T21:21:50.4154427Z * [new tag] ciflow/inductor/155502 -> ciflow/inductor/155502 2025-08-14T21:21:50.4154583Z * [new tag] ciflow/inductor/155503 -> ciflow/inductor/155503 2025-08-14T21:21:50.4154718Z * [new tag] ciflow/inductor/155504 -> ciflow/inductor/155504 2025-08-14T21:21:50.4154857Z * [new tag] ciflow/inductor/155557 -> ciflow/inductor/155557 2025-08-14T21:21:50.4154990Z * [new tag] ciflow/inductor/155608 -> ciflow/inductor/155608 2025-08-14T21:21:50.4155178Z * [new tag] ciflow/inductor/155923 -> ciflow/inductor/155923 2025-08-14T21:21:50.4155434Z * [new tag] ciflow/inductor/155928 -> ciflow/inductor/155928 2025-08-14T21:21:50.4155679Z * [new tag] ciflow/inductor/155958 -> ciflow/inductor/155958 2025-08-14T21:21:50.4156126Z * [new tag] ciflow/inductor/156049 -> ciflow/inductor/156049 2025-08-14T21:21:50.4156303Z * [new tag] ciflow/inductor/156851 -> ciflow/inductor/156851 2025-08-14T21:21:50.4156571Z * [new tag] ciflow/inductor/156967 -> ciflow/inductor/156967 2025-08-14T21:21:50.4156735Z * [new tag] ciflow/inductor/157148 -> ciflow/inductor/157148 2025-08-14T21:21:50.4157009Z * [new tag] ciflow/inductor/157149 -> ciflow/inductor/157149 2025-08-14T21:21:50.4157185Z * [new tag] ciflow/inductor/157152 -> ciflow/inductor/157152 2025-08-14T21:21:50.4157444Z * [new tag] ciflow/inductor/157542 -> ciflow/inductor/157542 2025-08-14T21:21:50.4160559Z * 
[new tag] ciflow/inductor/157572 -> ciflow/inductor/157572 2025-08-14T21:21:50.4160866Z * [new tag] ciflow/inductor/157635 -> ciflow/inductor/157635 2025-08-14T21:21:50.4161014Z * [new tag] ciflow/inductor/157685 -> ciflow/inductor/157685 2025-08-14T21:21:50.4161166Z * [new tag] ciflow/inductor/157686 -> ciflow/inductor/157686 2025-08-14T21:21:50.4161444Z * [new tag] ciflow/inductor/157689 -> ciflow/inductor/157689 2025-08-14T21:21:50.4161561Z * [new tag] ciflow/inductor/157699 -> ciflow/inductor/157699 2025-08-14T21:21:50.4162221Z * [new tag] ciflow/inductor/157743 -> ciflow/inductor/157743 2025-08-14T21:21:50.4162386Z * [new tag] ciflow/inductor/157944 -> ciflow/inductor/157944 2025-08-14T21:21:50.4162513Z * [new tag] ciflow/inductor/157971 -> ciflow/inductor/157971 2025-08-14T21:21:50.4162642Z * [new tag] ciflow/inductor/157994 -> ciflow/inductor/157994 2025-08-14T21:21:50.4162769Z * [new tag] ciflow/inductor/158061 -> ciflow/inductor/158061 2025-08-14T21:21:50.4162901Z * [new tag] ciflow/inductor/158091 -> ciflow/inductor/158091 2025-08-14T21:21:50.4163072Z * [new tag] ciflow/inductor/158097 -> ciflow/inductor/158097 2025-08-14T21:21:50.4166550Z * [new tag] ciflow/inductor/158098 -> ciflow/inductor/158098 2025-08-14T21:21:50.4166857Z * [new tag] ciflow/inductor/158104 -> ciflow/inductor/158104 2025-08-14T21:21:50.4167051Z * [new tag] ciflow/inductor/158168 -> ciflow/inductor/158168 2025-08-14T21:21:50.4167180Z * [new tag] ciflow/inductor/158250 -> ciflow/inductor/158250 2025-08-14T21:21:50.4167309Z * [new tag] ciflow/inductor/158321 -> ciflow/inductor/158321 2025-08-14T21:21:50.4167434Z * [new tag] ciflow/inductor/158609 -> ciflow/inductor/158609 2025-08-14T21:21:50.4167563Z * [new tag] ciflow/inductor/158647 -> ciflow/inductor/158647 2025-08-14T21:21:50.4167699Z * [new tag] ciflow/inductor/158914 -> ciflow/inductor/158914 2025-08-14T21:21:50.4167986Z * [new tag] ciflow/inductor/158932 -> ciflow/inductor/158932 2025-08-14T21:21:50.4168130Z * [new tag] ciflow/inductor/158987 -> ciflow/inductor/158987 2025-08-14T21:21:50.4168259Z * [new tag] ciflow/inductor/159009 -> ciflow/inductor/159009 2025-08-14T21:21:50.4168384Z * [new tag] ciflow/inductor/159010 -> ciflow/inductor/159010 2025-08-14T21:21:50.4168530Z * [new tag] ciflow/inductor/159093 -> ciflow/inductor/159093 2025-08-14T21:21:50.4168895Z * [new tag] ciflow/inductor/159158 -> ciflow/inductor/159158 2025-08-14T21:21:50.4169982Z * [new tag] ciflow/inductor/159197 -> ciflow/inductor/159197 2025-08-14T21:21:50.4170135Z * [new tag] ciflow/inductor/159274 -> ciflow/inductor/159274 2025-08-14T21:21:50.4170377Z * [new tag] ciflow/inductor/159281 -> ciflow/inductor/159281 2025-08-14T21:21:50.4170667Z * [new tag] ciflow/inductor/159329 -> ciflow/inductor/159329 2025-08-14T21:21:50.4170823Z * [new tag] ciflow/inductor/159361 -> ciflow/inductor/159361 2025-08-14T21:21:50.4170985Z * [new tag] ciflow/inductor/159365 -> ciflow/inductor/159365 2025-08-14T21:21:50.4171248Z * [new tag] ciflow/inductor/159366 -> ciflow/inductor/159366 2025-08-14T21:21:50.4171397Z * [new tag] ciflow/inductor/159367 -> ciflow/inductor/159367 2025-08-14T21:21:50.4171534Z * [new tag] ciflow/inductor/159368 -> ciflow/inductor/159368 2025-08-14T21:21:50.4174252Z * [new tag] ciflow/inductor/159473 -> ciflow/inductor/159473 2025-08-14T21:21:50.4174454Z * [new tag] ciflow/inductor/159483 -> ciflow/inductor/159483 2025-08-14T21:21:50.4174592Z * [new tag] ciflow/inductor/159508 -> ciflow/inductor/159508 2025-08-14T21:21:50.4174813Z * [new tag] ciflow/inductor/159523 -> 
ciflow/inductor/159523 2025-08-14T21:21:50.4174954Z * [new tag] ciflow/inductor/159678 -> ciflow/inductor/159678 2025-08-14T21:21:50.4175150Z * [new tag] ciflow/inductor/159691 -> ciflow/inductor/159691 2025-08-14T21:21:50.4175310Z * [new tag] ciflow/inductor/159778 -> ciflow/inductor/159778 2025-08-14T21:21:50.4175526Z * [new tag] ciflow/inductor/159786 -> ciflow/inductor/159786 2025-08-14T21:21:50.4175648Z * [new tag] ciflow/inductor/159817 -> ciflow/inductor/159817 2025-08-14T21:21:50.4175886Z * [new tag] ciflow/inductor/159842 -> ciflow/inductor/159842 2025-08-14T21:21:50.4178997Z * [new tag] ciflow/inductor/159864 -> ciflow/inductor/159864 2025-08-14T21:21:50.4179132Z * [new tag] ciflow/inductor/159865 -> ciflow/inductor/159865 2025-08-14T21:21:50.4179339Z * [new tag] ciflow/inductor/159869 -> ciflow/inductor/159869 2025-08-14T21:21:50.4179487Z * [new tag] ciflow/inductor/159875 -> ciflow/inductor/159875 2025-08-14T21:21:50.4179695Z * [new tag] ciflow/inductor/159889 -> ciflow/inductor/159889 2025-08-14T21:21:50.4179845Z * [new tag] ciflow/inductor/159902 -> ciflow/inductor/159902 2025-08-14T21:21:50.4180054Z * [new tag] ciflow/inductor/159923 -> ciflow/inductor/159923 2025-08-14T21:21:50.4180202Z * [new tag] ciflow/inductor/159944 -> ciflow/inductor/159944 2025-08-14T21:21:50.4180321Z * [new tag] ciflow/inductor/160004 -> ciflow/inductor/160004 2025-08-14T21:21:50.4180439Z * [new tag] ciflow/inductor/160080 -> ciflow/inductor/160080 2025-08-14T21:21:50.4180703Z * [new tag] ciflow/inductor/160108 -> ciflow/inductor/160108 2025-08-14T21:21:50.4181346Z * [new tag] ciflow/inductor/160109 -> ciflow/inductor/160109 2025-08-14T21:21:50.4181645Z * [new tag] ciflow/inductor/160111 -> ciflow/inductor/160111 2025-08-14T21:21:50.4184848Z * [new tag] ciflow/inductor/160113 -> ciflow/inductor/160113 2025-08-14T21:21:50.4184995Z * [new tag] ciflow/inductor/160127 -> ciflow/inductor/160127 2025-08-14T21:21:50.4185112Z * [new tag] ciflow/inductor/160131 -> ciflow/inductor/160131 2025-08-14T21:21:50.4185226Z * [new tag] ciflow/inductor/160132 -> ciflow/inductor/160132 2025-08-14T21:21:50.4185349Z * [new tag] ciflow/inductor/160136 -> ciflow/inductor/160136 2025-08-14T21:21:50.4185461Z * [new tag] ciflow/inductor/160138 -> ciflow/inductor/160138 2025-08-14T21:21:50.4185585Z * [new tag] ciflow/inductor/160151 -> ciflow/inductor/160151 2025-08-14T21:21:50.4185698Z * [new tag] ciflow/inductor/160152 -> ciflow/inductor/160152 2025-08-14T21:21:50.4185876Z * [new tag] ciflow/inductor/160154 -> ciflow/inductor/160154 2025-08-14T21:21:50.4186000Z * [new tag] ciflow/inductor/160156 -> ciflow/inductor/160156 2025-08-14T21:21:50.4186113Z * [new tag] ciflow/inductor/160161 -> ciflow/inductor/160161 2025-08-14T21:21:50.4186237Z * [new tag] ciflow/inductor/160166 -> ciflow/inductor/160166 2025-08-14T21:21:50.4186352Z * [new tag] ciflow/inductor/160168 -> ciflow/inductor/160168 2025-08-14T21:21:50.4192045Z * [new tag] ciflow/inductor/160174 -> ciflow/inductor/160174 2025-08-14T21:21:50.4192209Z * [new tag] ciflow/inductor/160181 -> ciflow/inductor/160181 2025-08-14T21:21:50.4192344Z * [new tag] ciflow/inductor/160183 -> ciflow/inductor/160183 2025-08-14T21:21:50.4192470Z * [new tag] ciflow/inductor/160190 -> ciflow/inductor/160190 2025-08-14T21:21:50.4192643Z * [new tag] ciflow/inductor/160198 -> ciflow/inductor/160198 2025-08-14T21:21:50.4192775Z * [new tag] ciflow/inductor/160201 -> ciflow/inductor/160201 2025-08-14T21:21:50.4192897Z * [new tag] ciflow/inductor/160209 -> ciflow/inductor/160209 
2025-08-14T21:21:50.4193029Z * [new tag] ciflow/inductor/160218 -> ciflow/inductor/160218 2025-08-14T21:21:50.4193150Z * [new tag] ciflow/inductor/160239 -> ciflow/inductor/160239 2025-08-14T21:21:50.4193273Z * [new tag] ciflow/inductor/160250 -> ciflow/inductor/160250 2025-08-14T21:21:50.4193403Z * [new tag] ciflow/inductor/160253 -> ciflow/inductor/160253 2025-08-14T21:21:50.4193524Z * [new tag] ciflow/inductor/160266 -> ciflow/inductor/160266 2025-08-14T21:21:50.4194794Z * [new tag] ciflow/inductor/160282 -> ciflow/inductor/160282 2025-08-14T21:21:50.4195414Z * [new tag] ciflow/inductor/160298 -> ciflow/inductor/160298 2025-08-14T21:21:50.4195593Z * [new tag] ciflow/inductor/160301 -> ciflow/inductor/160301 2025-08-14T21:21:50.4195724Z * [new tag] ciflow/inductor/160310 -> ciflow/inductor/160310 2025-08-14T21:21:50.4195841Z * [new tag] ciflow/inductor/160323 -> ciflow/inductor/160323 2025-08-14T21:21:50.4195957Z * [new tag] ciflow/inductor/160324 -> ciflow/inductor/160324 2025-08-14T21:21:50.4196084Z * [new tag] ciflow/inductor/160325 -> ciflow/inductor/160325 2025-08-14T21:21:50.4196203Z * [new tag] ciflow/inductor/160326 -> ciflow/inductor/160326 2025-08-14T21:21:50.4196330Z * [new tag] ciflow/inductor/160327 -> ciflow/inductor/160327 2025-08-14T21:21:50.4202223Z * [new tag] ciflow/inductor/160328 -> ciflow/inductor/160328 2025-08-14T21:21:50.4202510Z * [new tag] ciflow/inductor/160329 -> ciflow/inductor/160329 2025-08-14T21:21:50.4202879Z * [new tag] ciflow/inductor/160351 -> ciflow/inductor/160351 2025-08-14T21:21:50.4203028Z * [new tag] ciflow/inductor/160353 -> ciflow/inductor/160353 2025-08-14T21:21:50.4203163Z * [new tag] ciflow/inductor/160362 -> ciflow/inductor/160362 2025-08-14T21:21:50.4203291Z * [new tag] ciflow/inductor/160363 -> ciflow/inductor/160363 2025-08-14T21:21:50.4203420Z * [new tag] ciflow/inductor/160364 -> ciflow/inductor/160364 2025-08-14T21:21:50.4203555Z * [new tag] ciflow/inductor/160365 -> ciflow/inductor/160365 2025-08-14T21:21:50.4203678Z * [new tag] ciflow/inductor/160366 -> ciflow/inductor/160366 2025-08-14T21:21:50.4203802Z * [new tag] ciflow/inductor/160367 -> ciflow/inductor/160367 2025-08-14T21:21:50.4204057Z * [new tag] ciflow/inductor/160368 -> ciflow/inductor/160368 2025-08-14T21:21:50.4204496Z * [new tag] ciflow/inductor/160369 -> ciflow/inductor/160369 2025-08-14T21:21:50.4204635Z * [new tag] ciflow/inductor/160371 -> ciflow/inductor/160371 2025-08-14T21:21:50.4204762Z * [new tag] ciflow/inductor/160374 -> ciflow/inductor/160374 2025-08-14T21:21:50.4204893Z * [new tag] ciflow/inductor/160375 -> ciflow/inductor/160375 2025-08-14T21:21:50.4205041Z * [new tag] ciflow/inductor/160377 -> ciflow/inductor/160377 2025-08-14T21:21:50.4205168Z * [new tag] ciflow/inductor/160380 -> ciflow/inductor/160380 2025-08-14T21:21:50.4205322Z * [new tag] ciflow/inductor/160381 -> ciflow/inductor/160381 2025-08-14T21:21:50.4205448Z * [new tag] ciflow/inductor/160383 -> ciflow/inductor/160383 2025-08-14T21:21:50.4205582Z * [new tag] ciflow/inductor/160394 -> ciflow/inductor/160394 2025-08-14T21:21:50.4205730Z * [new tag] ciflow/inductor/160401 -> ciflow/inductor/160401 2025-08-14T21:21:50.4205862Z * [new tag] ciflow/inductor/160402 -> ciflow/inductor/160402 2025-08-14T21:21:50.4205997Z * [new tag] ciflow/inductor/160403 -> ciflow/inductor/160403 2025-08-14T21:21:50.4206414Z * [new tag] ciflow/inductor/160424 -> ciflow/inductor/160424 2025-08-14T21:21:50.4206597Z * [new tag] ciflow/inductor/160426 -> ciflow/inductor/160426 2025-08-14T21:21:50.4207619Z * [new tag] 
ciflow/inductor/160431 -> ciflow/inductor/160431 2025-08-14T21:21:50.4208139Z * [new tag] ciflow/inductor/160448 -> ciflow/inductor/160448 2025-08-14T21:21:50.4208378Z * [new tag] ciflow/inductor/160450 -> ciflow/inductor/160450 2025-08-14T21:21:50.4208891Z * [new tag] ciflow/inductor/160455 -> ciflow/inductor/160455 2025-08-14T21:21:50.4214884Z * [new tag] ciflow/inductor/160456 -> ciflow/inductor/160456 2025-08-14T21:21:50.4224182Z * [new tag] ciflow/inductor/160461 -> ciflow/inductor/160461 2025-08-14T21:21:50.4231165Z * [new tag] ciflow/inductor/160462 -> ciflow/inductor/160462 2025-08-14T21:21:50.4236203Z * [new tag] ciflow/inductor/160467 -> ciflow/inductor/160467 2025-08-14T21:21:50.4241522Z * [new tag] ciflow/inductor/160470 -> ciflow/inductor/160470 2025-08-14T21:21:50.4243201Z * [new tag] ciflow/inductor/160473 -> ciflow/inductor/160473 2025-08-14T21:21:50.4243737Z * [new tag] ciflow/inductor/160476 -> ciflow/inductor/160476 2025-08-14T21:21:50.4243928Z * [new tag] ciflow/inductor/160480 -> ciflow/inductor/160480 2025-08-14T21:21:50.4244059Z * [new tag] ciflow/inductor/160481 -> ciflow/inductor/160481 2025-08-14T21:21:50.4244402Z * [new tag] ciflow/inductor/160482 -> ciflow/inductor/160482 2025-08-14T21:21:50.4244545Z * [new tag] ciflow/inductor/160483 -> ciflow/inductor/160483 2025-08-14T21:21:50.4244675Z * [new tag] ciflow/inductor/160485 -> ciflow/inductor/160485 2025-08-14T21:21:50.4244803Z * [new tag] ciflow/inductor/160486 -> ciflow/inductor/160486 2025-08-14T21:21:50.4244939Z * [new tag] ciflow/inductor/160503 -> ciflow/inductor/160503 2025-08-14T21:21:50.4245069Z * [new tag] ciflow/inductor/160510 -> ciflow/inductor/160510 2025-08-14T21:21:50.4245202Z * [new tag] ciflow/inductor/160527 -> ciflow/inductor/160527 2025-08-14T21:21:50.4245329Z * [new tag] ciflow/inductor/160530 -> ciflow/inductor/160530 2025-08-14T21:21:50.4245455Z * [new tag] ciflow/inductor/160531 -> ciflow/inductor/160531 2025-08-14T21:21:50.4245637Z * [new tag] ciflow/inductor/160538 -> ciflow/inductor/160538 2025-08-14T21:21:50.4245763Z * [new tag] ciflow/inductor/160539 -> ciflow/inductor/160539 2025-08-14T21:21:50.4245894Z * [new tag] ciflow/inductor/160540 -> ciflow/inductor/160540 2025-08-14T21:21:50.4246021Z * [new tag] ciflow/inductor/160548 -> ciflow/inductor/160548 2025-08-14T21:21:50.4246146Z * [new tag] ciflow/inductor/160561 -> ciflow/inductor/160561 2025-08-14T21:21:50.4246278Z * [new tag] ciflow/inductor/160576 -> ciflow/inductor/160576 2025-08-14T21:21:50.4246405Z * [new tag] ciflow/inductor/160578 -> ciflow/inductor/160578 2025-08-14T21:21:50.4246539Z * [new tag] ciflow/inductor/160580 -> ciflow/inductor/160580 2025-08-14T21:21:50.4246665Z * [new tag] ciflow/inductor/160583 -> ciflow/inductor/160583 2025-08-14T21:21:50.4246804Z * [new tag] ciflow/inductor/160589 -> ciflow/inductor/160589 2025-08-14T21:21:50.4246937Z * [new tag] ciflow/inductor/160590 -> ciflow/inductor/160590 2025-08-14T21:21:50.4247062Z * [new tag] ciflow/inductor/160592 -> ciflow/inductor/160592 2025-08-14T21:21:50.4247187Z * [new tag] ciflow/inductor/160596 -> ciflow/inductor/160596 2025-08-14T21:21:50.4247321Z * [new tag] ciflow/inductor/160601 -> ciflow/inductor/160601 2025-08-14T21:21:50.4247446Z * [new tag] ciflow/inductor/160607 -> ciflow/inductor/160607 2025-08-14T21:21:50.4247579Z * [new tag] ciflow/inductor/160608 -> ciflow/inductor/160608 2025-08-14T21:21:50.4247705Z * [new tag] ciflow/inductor/160611 -> ciflow/inductor/160611 2025-08-14T21:21:50.4247830Z * [new tag] ciflow/inductor/160614 -> 
ciflow/inductor/160614 2025-08-14T21:21:50.4247973Z * [new tag] ciflow/inductor/160616 -> ciflow/inductor/160616 2025-08-14T21:21:50.4248102Z * [new tag] ciflow/inductor/160619 -> ciflow/inductor/160619 2025-08-14T21:21:50.4248232Z * [new tag] ciflow/inductor/160625 -> ciflow/inductor/160625 2025-08-14T21:21:50.4248357Z * [new tag] ciflow/inductor/160635 -> ciflow/inductor/160635 2025-08-14T21:21:50.4248482Z * [new tag] ciflow/inductor/160649 -> ciflow/inductor/160649 2025-08-14T21:21:50.4248614Z * [new tag] ciflow/inductor/160658 -> ciflow/inductor/160658 2025-08-14T21:21:50.4248822Z * [new tag] ciflow/inductor/160662 -> ciflow/inductor/160662 2025-08-14T21:21:50.4248948Z * [new tag] ciflow/inductor/160668 -> ciflow/inductor/160668 2025-08-14T21:21:50.4249079Z * [new tag] ciflow/inductor/160669 -> ciflow/inductor/160669 2025-08-14T21:21:50.4249207Z * [new tag] ciflow/inductor/160670 -> ciflow/inductor/160670 2025-08-14T21:21:50.4249400Z * [new tag] ciflow/inductor/160671 -> ciflow/inductor/160671 2025-08-14T21:21:50.4249526Z * [new tag] ciflow/inductor/160677 -> ciflow/inductor/160677 2025-08-14T21:21:50.4249653Z * [new tag] ciflow/inductor/160679 -> ciflow/inductor/160679 2025-08-14T21:21:50.4249801Z * [new tag] ciflow/inductor/3b9a386 -> ciflow/inductor/3b9a386 2025-08-14T21:21:50.4249936Z * [new tag] ciflow/inductor/3d4b92b -> ciflow/inductor/3d4b92b 2025-08-14T21:21:50.4250075Z * [new tag] ciflow/inductor/d224ac7 -> ciflow/inductor/d224ac7 2025-08-14T21:21:50.4250227Z * [new tag] ciflow/linux-aarch64/147855 -> ciflow/linux-aarch64/147855 2025-08-14T21:21:50.4250368Z * [new tag] ciflow/linux-aarch64/157994 -> ciflow/linux-aarch64/157994 2025-08-14T21:21:50.4250551Z * [new tag] ciflow/linux-aarch64/159737 -> ciflow/linux-aarch64/159737 2025-08-14T21:21:50.4250691Z * [new tag] ciflow/linux-aarch64/160078 -> ciflow/linux-aarch64/160078 2025-08-14T21:21:50.4250828Z * [new tag] ciflow/linux-aarch64/160299 -> ciflow/linux-aarch64/160299 2025-08-14T21:21:50.4250979Z * [new tag] ciflow/linux-aarch64/160301 -> ciflow/linux-aarch64/160301 2025-08-14T21:21:50.4251104Z * [new tag] ciflow/mps/155923 -> ciflow/mps/155923 2025-08-14T21:21:50.4251232Z * [new tag] ciflow/mps/157553 -> ciflow/mps/157553 2025-08-14T21:21:50.4251346Z * [new tag] ciflow/mps/157635 -> ciflow/mps/157635 2025-08-14T21:21:50.4251460Z * [new tag] ciflow/mps/160541 -> ciflow/mps/160541 2025-08-14T21:21:50.4251602Z * [new tag] ciflow/nightly/156049 -> ciflow/nightly/156049 2025-08-14T21:21:50.4251733Z * [new tag] ciflow/nightly/158104 -> ciflow/nightly/158104 2025-08-14T21:21:50.4251896Z * [new tag] ciflow/op-benchmark/157994 -> ciflow/op-benchmark/157994 2025-08-14T21:21:50.4252090Z * [new tag] ciflow/periodic-rocm-mi300/139971 -> ciflow/periodic-rocm-mi300/139971 2025-08-14T21:21:50.4252277Z * [new tag] ciflow/periodic-rocm-mi300/160073 -> ciflow/periodic-rocm-mi300/160073 2025-08-14T21:21:50.4252453Z * [new tag] ciflow/periodic-rocm-mi300/160538 -> ciflow/periodic-rocm-mi300/160538 2025-08-14T21:21:50.4252601Z * [new tag] ciflow/periodic/054a2fd -> ciflow/periodic/054a2fd 2025-08-14T21:21:50.4252754Z * [new tag] ciflow/periodic/131296 -> ciflow/periodic/131296 2025-08-14T21:21:50.4252883Z * [new tag] ciflow/periodic/139971 -> ciflow/periodic/139971 2025-08-14T21:21:50.4253008Z * [new tag] ciflow/periodic/143959 -> ciflow/periodic/143959 2025-08-14T21:21:50.4253148Z * [new tag] ciflow/periodic/154595 -> ciflow/periodic/154595 2025-08-14T21:21:50.4253275Z * [new tag] ciflow/periodic/156703 -> ciflow/periodic/156703 
2025-08-14T21:21:50.4253407Z * [new tag] ciflow/periodic/160201 -> ciflow/periodic/160201 2025-08-14T21:21:50.4253548Z * [new tag] ciflow/periodic/160424 -> ciflow/periodic/160424 2025-08-14T21:21:50.4253673Z * [new tag] ciflow/periodic/160538 -> ciflow/periodic/160538 2025-08-14T21:21:50.4254004Z * [new tag] ciflow/periodic/1febab2a89302464f6c7d69cfbef7a24c421ea65 -> ciflow/periodic/1febab2a89302464f6c7d69cfbef7a24c421ea65 2025-08-14T21:21:50.4254140Z * [new tag] ciflow/periodic/2a6d37d -> ciflow/periodic/2a6d37d 2025-08-14T21:21:50.4254450Z * [new tag] ciflow/periodic/2ee22e435131369a7e4f8cc4732579acc29a941b -> ciflow/periodic/2ee22e435131369a7e4f8cc4732579acc29a941b 2025-08-14T21:21:50.4254622Z * [new tag] ciflow/periodic/317eeb8 -> ciflow/periodic/317eeb8 2025-08-14T21:21:50.4254753Z * [new tag] ciflow/periodic/3c32 -> ciflow/periodic/3c32 2025-08-14T21:21:50.4254896Z * [new tag] ciflow/periodic/3e98831 -> ciflow/periodic/3e98831 2025-08-14T21:21:50.4255318Z * [new tag] ciflow/periodic/4a773e1e867f28a8ff0b15203e5cd9548f74fcee -> ciflow/periodic/4a773e1e867f28a8ff0b15203e5cd9548f74fcee 2025-08-14T21:21:50.4255625Z * [new tag] ciflow/periodic/5f5f508aa836a46dfe88857fb223049616b94e93 -> ciflow/periodic/5f5f508aa836a46dfe88857fb223049616b94e93 2025-08-14T21:21:50.4255776Z * [new tag] ciflow/periodic/94512-point -> ciflow/periodic/94512-point 2025-08-14T21:21:50.4256059Z * [new tag] ciflow/periodic/csl/test87519 -> ciflow/periodic/csl/test87519 2025-08-14T21:21:50.4256371Z * [new tag] ciflow/periodic/csltest88275 -> ciflow/periodic/csltest88275 2025-08-14T21:21:50.4257254Z * [new tag] ciflow/periodic/csltest88761 -> ciflow/periodic/csltest88761 2025-08-14T21:21:50.4257956Z * [new tag] ciflow/periodic/d7114f05b10de8e6de81ffc567d63944c3117d51 -> ciflow/periodic/d7114f05b10de8e6de81ffc567d63944c3117d51 2025-08-14T21:21:50.4258423Z * [new tag] ciflow/periodic/release_1.12 -> ciflow/periodic/release_1.12 2025-08-14T21:21:50.4259535Z * [new tag] ciflow/periodic/release_1.12.0 -> ciflow/periodic/release_1.12.0 2025-08-14T21:21:50.4260775Z * [new tag] ciflow/periodic/sha-ec5b83 -> ciflow/periodic/sha-ec5b83 2025-08-14T21:21:50.4261315Z * [new tag] ciflow/rocm-mi300/151360 -> ciflow/rocm-mi300/151360 2025-08-14T21:21:50.4261477Z * [new tag] ciflow/rocm-mi300/159158 -> ciflow/rocm-mi300/159158 2025-08-14T21:21:50.4261625Z * [new tag] ciflow/rocm-mi300/160073 -> ciflow/rocm-mi300/160073 2025-08-14T21:21:50.4261945Z * [new tag] ciflow/rocm-mi300/160468 -> ciflow/rocm-mi300/160468 2025-08-14T21:21:50.4262138Z * [new tag] ciflow/rocm-mi300/160538 -> ciflow/rocm-mi300/160538 2025-08-14T21:21:50.4264152Z * [new tag] ciflow/rocm-mi355/160215 -> ciflow/rocm-mi355/160215 2025-08-14T21:21:50.4264507Z * [new tag] ciflow/rocm/148492 -> ciflow/rocm/148492 2025-08-14T21:21:50.4264633Z * [new tag] ciflow/rocm/151360 -> ciflow/rocm/151360 2025-08-14T21:21:50.4264749Z * [new tag] ciflow/rocm/151845 -> ciflow/rocm/151845 2025-08-14T21:21:50.4264886Z * [new tag] ciflow/rocm/154864 -> ciflow/rocm/154864 2025-08-14T21:21:50.4265417Z * [new tag] ciflow/rocm/156491 -> ciflow/rocm/156491 2025-08-14T21:21:50.4265833Z * [new tag] ciflow/rocm/158219 -> ciflow/rocm/158219 2025-08-14T21:21:50.4266212Z * [new tag] ciflow/rocm/158220 -> ciflow/rocm/158220 2025-08-14T21:21:50.4266637Z * [new tag] ciflow/rocm/158224 -> ciflow/rocm/158224 2025-08-14T21:21:50.4267119Z * [new tag] ciflow/rocm/159158 -> ciflow/rocm/159158 2025-08-14T21:21:50.4267516Z * [new tag] ciflow/rocm/160215 -> ciflow/rocm/160215 2025-08-14T21:21:50.4268385Z * 
[new tag] ciflow/rocm/160468 -> ciflow/rocm/160468 2025-08-14T21:21:50.4268520Z * [new tag] ciflow/rocm/160538 -> ciflow/rocm/160538 2025-08-14T21:21:50.4271114Z * [new tag] ciflow/s390/143959 -> ciflow/s390/143959 2025-08-14T21:21:50.4271265Z * [new tag] ciflow/slow/01c7106 -> ciflow/slow/01c7106 2025-08-14T21:21:50.4278210Z * [new tag] ciflow/slow/0577043 -> ciflow/slow/0577043 2025-08-14T21:21:50.4283384Z * [new tag] ciflow/slow/0d5b74da0cab798fbfdb9caa53fad816999c8386-sdym -> ciflow/slow/0d5b74da0cab798fbfdb9caa53fad816999c8386-sdym 2025-08-14T21:21:50.4283727Z * [new tag] ciflow/slow/0e81104 -> ciflow/slow/0e81104 2025-08-14T21:21:50.4283863Z * [new tag] ciflow/slow/154595 -> ciflow/slow/154595 2025-08-14T21:21:50.4283982Z * [new tag] ciflow/slow/1732077 -> ciflow/slow/1732077 2025-08-14T21:21:50.4284106Z * [new tag] ciflow/slow/187eb7c -> ciflow/slow/187eb7c 2025-08-14T21:21:50.4284227Z * [new tag] ciflow/slow/1faef89 -> ciflow/slow/1faef89 2025-08-14T21:21:50.4284345Z * [new tag] ciflow/slow/3920ec1 -> ciflow/slow/3920ec1 2025-08-14T21:21:50.4284471Z * [new tag] ciflow/slow/3b7c6b2 -> ciflow/slow/3b7c6b2 2025-08-14T21:21:50.4284601Z * [new tag] ciflow/slow/59a3759 -> ciflow/slow/59a3759 2025-08-14T21:21:50.4284726Z * [new tag] ciflow/slow/70ef0bb -> ciflow/slow/70ef0bb 2025-08-14T21:21:50.4284930Z * [new tag] ciflow/slow/788ff06 -> ciflow/slow/788ff06 2025-08-14T21:21:50.4285237Z * [new tag] ciflow/slow/8751002215790a3a88750faa8f4366933e296693-sdym -> ciflow/slow/8751002215790a3a88750faa8f4366933e296693-sdym 2025-08-14T21:21:50.4285364Z * [new tag] ciflow/slow/9d85864 -> ciflow/slow/9d85864 2025-08-14T21:21:50.4285482Z * [new tag] ciflow/slow/9ffad5b -> ciflow/slow/9ffad5b 2025-08-14T21:21:50.4285601Z * [new tag] ciflow/slow/a206e8b -> ciflow/slow/a206e8b 2025-08-14T21:21:50.4285725Z * [new tag] ciflow/slow/a837609 -> ciflow/slow/a837609 2025-08-14T21:21:50.4285844Z * [new tag] ciflow/slow/af841f3 -> ciflow/slow/af841f3 2025-08-14T21:21:50.4286165Z * [new tag] ciflow/slow/da3aba1e46157c4df504b067477cdf2b3c96b194-sdym -> ciflow/slow/da3aba1e46157c4df504b067477cdf2b3c96b194-sdym 2025-08-14T21:21:50.4286289Z * [new tag] ciflow/trunk/131296 -> ciflow/trunk/131296 2025-08-14T21:21:50.4286403Z * [new tag] ciflow/trunk/137400 -> ciflow/trunk/137400 2025-08-14T21:21:50.4286527Z * [new tag] ciflow/trunk/138996 -> ciflow/trunk/138996 2025-08-14T21:21:50.4286644Z * [new tag] ciflow/trunk/139971 -> ciflow/trunk/139971 2025-08-14T21:21:50.4286768Z * [new tag] ciflow/trunk/147360 -> ciflow/trunk/147360 2025-08-14T21:21:50.4286885Z * [new tag] ciflow/trunk/147855 -> ciflow/trunk/147855 2025-08-14T21:21:50.4286999Z * [new tag] ciflow/trunk/148180 -> ciflow/trunk/148180 2025-08-14T21:21:50.4287122Z * [new tag] ciflow/trunk/148328 -> ciflow/trunk/148328 2025-08-14T21:21:50.4287237Z * [new tag] ciflow/trunk/148492 -> ciflow/trunk/148492 2025-08-14T21:21:50.4287365Z * [new tag] ciflow/trunk/150282 -> ciflow/trunk/150282 2025-08-14T21:21:50.4287482Z * [new tag] ciflow/trunk/150302 -> ciflow/trunk/150302 2025-08-14T21:21:50.4287596Z * [new tag] ciflow/trunk/151845 -> ciflow/trunk/151845 2025-08-14T21:21:50.4287718Z * [new tag] ciflow/trunk/152624 -> ciflow/trunk/152624 2025-08-14T21:21:50.4288014Z * [new tag] ciflow/trunk/154193 -> ciflow/trunk/154193 2025-08-14T21:21:50.4288163Z * [new tag] ciflow/trunk/154595 -> ciflow/trunk/154595 2025-08-14T21:21:50.4288524Z * [new tag] ciflow/trunk/154650 -> ciflow/trunk/154650 2025-08-14T21:21:50.4289319Z * [new tag] ciflow/trunk/154694 -> ciflow/trunk/154694 
2025-08-14T21:21:50.4289455Z * [new tag] ciflow/trunk/155958 -> ciflow/trunk/155958 2025-08-14T21:21:50.4290965Z * [new tag] ciflow/trunk/156049 -> ciflow/trunk/156049 2025-08-14T21:21:50.4291298Z * [new tag] ciflow/trunk/156703 -> ciflow/trunk/156703 2025-08-14T21:21:50.4291431Z * [new tag] ciflow/trunk/156851 -> ciflow/trunk/156851 2025-08-14T21:21:50.4292447Z * [new tag] ciflow/trunk/157148 -> ciflow/trunk/157148 2025-08-14T21:21:50.4293003Z * [new tag] ciflow/trunk/157152 -> ciflow/trunk/157152 2025-08-14T21:21:50.4293148Z * [new tag] ciflow/trunk/157432 -> ciflow/trunk/157432 2025-08-14T21:21:50.4293275Z * [new tag] ciflow/trunk/157685 -> ciflow/trunk/157685 2025-08-14T21:21:50.4294319Z * [new tag] ciflow/trunk/157689 -> ciflow/trunk/157689 2025-08-14T21:21:50.4294454Z * [new tag] ciflow/trunk/157699 -> ciflow/trunk/157699 2025-08-14T21:21:50.4294663Z * [new tag] ciflow/trunk/157813 -> ciflow/trunk/157813 2025-08-14T21:21:50.4295092Z * [new tag] ciflow/trunk/157994 -> ciflow/trunk/157994 2025-08-14T21:21:50.4300122Z * [new tag] ciflow/trunk/158091 -> ciflow/trunk/158091 2025-08-14T21:21:50.4300283Z * [new tag] ciflow/trunk/158104 -> ciflow/trunk/158104 2025-08-14T21:21:50.4300403Z * [new tag] ciflow/trunk/158219 -> ciflow/trunk/158219 2025-08-14T21:21:50.4300518Z * [new tag] ciflow/trunk/158220 -> ciflow/trunk/158220 2025-08-14T21:21:50.4300642Z * [new tag] ciflow/trunk/158224 -> ciflow/trunk/158224 2025-08-14T21:21:50.4300761Z * [new tag] ciflow/trunk/158529 -> ciflow/trunk/158529 2025-08-14T21:21:50.4300875Z * [new tag] ciflow/trunk/158647 -> ciflow/trunk/158647 2025-08-14T21:21:50.4301000Z * [new tag] ciflow/trunk/158810 -> ciflow/trunk/158810 2025-08-14T21:21:50.4301147Z * [new tag] ciflow/trunk/158812 -> ciflow/trunk/158812 2025-08-14T21:21:50.4301291Z * [new tag] ciflow/trunk/158863 -> ciflow/trunk/158863 2025-08-14T21:21:50.4301401Z * [new tag] ciflow/trunk/158864 -> ciflow/trunk/158864 2025-08-14T21:21:50.4301513Z * [new tag] ciflow/trunk/158883 -> ciflow/trunk/158883 2025-08-14T21:21:50.4301808Z * [new tag] ciflow/trunk/158914 -> ciflow/trunk/158914 2025-08-14T21:21:50.4301942Z * [new tag] ciflow/trunk/158965 -> ciflow/trunk/158965 2025-08-14T21:21:50.4302829Z * [new tag] ciflow/trunk/158987 -> ciflow/trunk/158987 2025-08-14T21:21:50.4303017Z * [new tag] ciflow/trunk/159033 -> ciflow/trunk/159033 2025-08-14T21:21:50.4311493Z * [new tag] ciflow/trunk/159140 -> ciflow/trunk/159140 2025-08-14T21:21:50.4312162Z * [new tag] ciflow/trunk/159158 -> ciflow/trunk/159158 2025-08-14T21:21:50.4312350Z * [new tag] ciflow/trunk/159553 -> ciflow/trunk/159553 2025-08-14T21:21:50.4312484Z * [new tag] ciflow/trunk/159562 -> ciflow/trunk/159562 2025-08-14T21:21:50.4312602Z * [new tag] ciflow/trunk/159682 -> ciflow/trunk/159682 2025-08-14T21:21:50.4312714Z * [new tag] ciflow/trunk/159691 -> ciflow/trunk/159691 2025-08-14T21:21:50.4312835Z * [new tag] ciflow/trunk/159842 -> ciflow/trunk/159842 2025-08-14T21:21:50.4312947Z * [new tag] ciflow/trunk/159889 -> ciflow/trunk/159889 2025-08-14T21:21:50.4313059Z * [new tag] ciflow/trunk/159923 -> ciflow/trunk/159923 2025-08-14T21:21:50.4313179Z * [new tag] ciflow/trunk/160004 -> ciflow/trunk/160004 2025-08-14T21:21:50.4313291Z * [new tag] ciflow/trunk/160113 -> ciflow/trunk/160113 2025-08-14T21:21:50.4313413Z * [new tag] ciflow/trunk/160161 -> ciflow/trunk/160161 2025-08-14T21:21:50.4313728Z * [new tag] ciflow/trunk/160168 -> ciflow/trunk/160168 2025-08-14T21:21:50.4313853Z * [new tag] ciflow/trunk/160181 -> ciflow/trunk/160181 
2025-08-14T21:21:50.4318859Z * [new tag] ciflow/trunk/160183 -> ciflow/trunk/160183 2025-08-14T21:21:50.4320893Z * [new tag] ciflow/trunk/160190 -> ciflow/trunk/160190 2025-08-14T21:21:50.4321019Z * [new tag] ciflow/trunk/160198 -> ciflow/trunk/160198 2025-08-14T21:21:50.4321136Z * [new tag] ciflow/trunk/160205 -> ciflow/trunk/160205 2025-08-14T21:21:50.4321249Z * [new tag] ciflow/trunk/160219 -> ciflow/trunk/160219 2025-08-14T21:21:50.4321387Z * [new tag] ciflow/trunk/160224 -> ciflow/trunk/160224 2025-08-14T21:21:50.4321512Z * [new tag] ciflow/trunk/160250 -> ciflow/trunk/160250 2025-08-14T21:21:50.4321827Z * [new tag] ciflow/trunk/160253 -> ciflow/trunk/160253 2025-08-14T21:21:50.4321934Z * [new tag] ciflow/trunk/160335 -> ciflow/trunk/160335 2025-08-14T21:21:50.4322042Z * [new tag] ciflow/trunk/160338 -> ciflow/trunk/160338 2025-08-14T21:21:50.4322160Z * [new tag] ciflow/trunk/160383 -> ciflow/trunk/160383 2025-08-14T21:21:50.4322273Z * [new tag] ciflow/trunk/160401 -> ciflow/trunk/160401 2025-08-14T21:21:50.4322394Z * [new tag] ciflow/trunk/160403 -> ciflow/trunk/160403 2025-08-14T21:21:50.4322516Z * [new tag] ciflow/trunk/160430 -> ciflow/trunk/160430 2025-08-14T21:21:50.4322626Z * [new tag] ciflow/trunk/160431 -> ciflow/trunk/160431 2025-08-14T21:21:50.4322747Z * [new tag] ciflow/trunk/160439 -> ciflow/trunk/160439 2025-08-14T21:21:50.4322868Z * [new tag] ciflow/trunk/160449 -> ciflow/trunk/160449 2025-08-14T21:21:50.4322982Z * [new tag] ciflow/trunk/160454 -> ciflow/trunk/160454 2025-08-14T21:21:50.4323104Z * [new tag] ciflow/trunk/160468 -> ciflow/trunk/160468 2025-08-14T21:21:50.4323216Z * [new tag] ciflow/trunk/160481 -> ciflow/trunk/160481 2025-08-14T21:21:50.4323335Z * [new tag] ciflow/trunk/160485 -> ciflow/trunk/160485 2025-08-14T21:21:50.4323450Z * [new tag] ciflow/trunk/160519 -> ciflow/trunk/160519 2025-08-14T21:21:50.4323562Z * [new tag] ciflow/trunk/160527 -> ciflow/trunk/160527 2025-08-14T21:21:50.4323709Z * [new tag] ciflow/trunk/160560 -> ciflow/trunk/160560 2025-08-14T21:21:50.4324414Z * [new tag] ciflow/trunk/160578 -> ciflow/trunk/160578 2025-08-14T21:21:50.4324579Z * [new tag] ciflow/trunk/160589 -> ciflow/trunk/160589 2025-08-14T21:21:50.4324723Z * [new tag] ciflow/trunk/160592 -> ciflow/trunk/160592 2025-08-14T21:21:50.4324840Z * [new tag] ciflow/trunk/160649 -> ciflow/trunk/160649 2025-08-14T21:21:50.4324958Z * [new tag] ciflow/trunk/160656 -> ciflow/trunk/160656 2025-08-14T21:21:50.4325082Z * [new tag] ciflow/unstable/123 -> ciflow/unstable/123 2025-08-14T21:21:50.4325208Z * [new tag] ciflow/vllm/160116 -> ciflow/vllm/160116 2025-08-14T21:21:50.4325351Z * [new tag] ciflow/vllm/160583 -> ciflow/vllm/160583 2025-08-14T21:21:50.4325524Z * [new tag] ciflow/vllm/160619 -> ciflow/vllm/160619 2025-08-14T21:21:50.4325649Z * [new tag] ciflow/vllm/160625 -> ciflow/vllm/160625 2025-08-14T21:21:50.4325776Z * [new tag] ciflow/vllm/160627 -> ciflow/vllm/160627 2025-08-14T21:21:50.4326363Z * [new tag] ciflow/win-arm64/156049 -> ciflow/win-arm64/156049 2025-08-14T21:21:50.4326579Z * [new tag] ciflow/win-arm64/158104 -> ciflow/win-arm64/158104 2025-08-14T21:21:50.4327146Z * [new tag] ciflow/win-arm64/159553 -> ciflow/win-arm64/159553 2025-08-14T21:21:50.4327574Z * [new tag] ciflow/win-arm64/159562 -> ciflow/win-arm64/159562 2025-08-14T21:21:50.4329038Z * [new tag] ciflow/win-arm64/159777 -> ciflow/win-arm64/159777 2025-08-14T21:21:50.4329169Z * [new tag] ciflow/win-arm64/159780 -> ciflow/win-arm64/159780 2025-08-14T21:21:50.4329300Z * [new tag] ciflow/win-arm64/159842 -> 
ciflow/win-arm64/159842 2025-08-14T21:21:50.4329578Z * [new tag] ciflow/win-arm64/160250 -> ciflow/win-arm64/160250 2025-08-14T21:21:50.4336995Z * [new tag] ciflow/win-arm64/160253 -> ciflow/win-arm64/160253 2025-08-14T21:21:50.4339045Z * [new tag] ciflow/win-arm64/160454 -> ciflow/win-arm64/160454 2025-08-14T21:21:50.4339292Z * [new tag] ciflow/win-arm64/160560 -> ciflow/win-arm64/160560 2025-08-14T21:21:50.4339442Z * [new tag] ciflow/xpu/138996 -> ciflow/xpu/138996 2025-08-14T21:21:50.4339639Z * [new tag] ciflow/xpu/139971 -> ciflow/xpu/139971 2025-08-14T21:21:50.4339881Z * [new tag] ciflow/xpu/140972 -> ciflow/xpu/140972 2025-08-14T21:21:50.4340010Z * [new tag] ciflow/xpu/143553 -> ciflow/xpu/143553 2025-08-14T21:21:50.4340208Z * [new tag] ciflow/xpu/156272 -> ciflow/xpu/156272 2025-08-14T21:21:50.4340342Z * [new tag] ciflow/xpu/156812 -> ciflow/xpu/156812 2025-08-14T21:21:50.4340993Z * [new tag] ciflow/xpu/157699 -> ciflow/xpu/157699 2025-08-14T21:21:50.4346546Z * [new tag] ciflow/xpu/157994 -> ciflow/xpu/157994 2025-08-14T21:21:50.4351838Z * [new tag] ciflow/xpu/158336 -> ciflow/xpu/158336 2025-08-14T21:21:50.4356946Z * [new tag] ciflow/xpu/158733 -> ciflow/xpu/158733 2025-08-14T21:21:50.4357080Z * [new tag] ciflow/xpu/159033 -> ciflow/xpu/159033 2025-08-14T21:21:50.4357614Z * [new tag] ciflow/xpu/159118 -> ciflow/xpu/159118 2025-08-14T21:21:50.4357760Z * [new tag] ciflow/xpu/159140 -> ciflow/xpu/159140 2025-08-14T21:21:50.4357876Z * [new tag] ciflow/xpu/159241 -> ciflow/xpu/159241 2025-08-14T21:21:50.4358030Z * [new tag] ciflow/xpu/159473 -> ciflow/xpu/159473 2025-08-14T21:21:50.4358149Z * [new tag] ciflow/xpu/159474 -> ciflow/xpu/159474 2025-08-14T21:21:50.4358263Z * [new tag] ciflow/xpu/159553 -> ciflow/xpu/159553 2025-08-14T21:21:50.4358408Z * [new tag] ciflow/xpu/159944 -> ciflow/xpu/159944 2025-08-14T21:21:50.4358540Z * [new tag] ciflow/xpu/160062 -> ciflow/xpu/160062 2025-08-14T21:21:50.4358655Z * [new tag] ciflow/xpu/160067 -> ciflow/xpu/160067 2025-08-14T21:21:50.4358769Z * [new tag] ciflow/xpu/160158 -> ciflow/xpu/160158 2025-08-14T21:21:50.4358881Z * [new tag] ciflow/xpu/160173 -> ciflow/xpu/160173 2025-08-14T21:21:50.4358996Z * [new tag] ciflow/xpu/160183 -> ciflow/xpu/160183 2025-08-14T21:21:50.4359108Z * [new tag] ciflow/xpu/160301 -> ciflow/xpu/160301 2025-08-14T21:21:50.4359225Z * [new tag] ciflow/xpu/160403 -> ciflow/xpu/160403 2025-08-14T21:21:50.4359332Z * [new tag] ciflow/xpu/160606 -> ciflow/xpu/160606 2025-08-14T21:21:50.4359447Z * [new tag] cslpull75 -> cslpull75 2025-08-14T21:21:50.4359712Z * [new tag] cslpull76 -> cslpull76 2025-08-14T21:21:50.4359824Z * [new tag] cslpull77 -> cslpull77 2025-08-14T21:21:50.4359929Z * [new tag] cslpull78 -> cslpull78 2025-08-14T21:21:50.4360043Z * [new tag] cslpull79 -> cslpull79 2025-08-14T21:21:50.4360148Z * [new tag] cslpull80 -> cslpull80 2025-08-14T21:21:50.4360257Z * [new tag] cslpull81 -> cslpull81 2025-08-14T21:21:50.4360365Z * [new tag] cslpull82 -> cslpull82 2025-08-14T21:21:50.4360469Z * [new tag] cslpull83 -> cslpull83 2025-08-14T21:21:50.4360581Z * [new tag] cslpull84 -> cslpull84 2025-08-14T21:21:50.4360679Z * [new tag] cslpull85 -> cslpull85 2025-08-14T21:21:50.4360846Z * [new tag] cslpull86 -> cslpull86 2025-08-14T21:21:50.4360958Z * [new tag] cslpull87 -> cslpull87 2025-08-14T21:21:50.4361061Z * [new tag] cslpull88 -> cslpull88 2025-08-14T21:21:50.4361168Z * [new tag] cslpull89 -> cslpull89 2025-08-14T21:21:50.4361269Z * [new tag] cslpull90 -> cslpull90 2025-08-14T21:21:50.4361370Z * [new tag] cslpull91 -> 
cslpull91 2025-08-14T21:21:50.4361472Z * [new tag] cslpull92 -> cslpull92 2025-08-14T21:21:50.4361580Z * [new tag] flight_5 -> flight_5 2025-08-14T21:21:50.4361693Z * [new tag] flight_5.1 -> flight_5.1 2025-08-14T21:21:50.4361794Z * [new tag] flight_5.2 -> flight_5.2 2025-08-14T21:21:50.4361895Z * [new tag] flight_5.3 -> flight_5.3 2025-08-14T21:21:50.4362005Z * [new tag] forpull1 -> forpull1 2025-08-14T21:21:50.4362135Z * [new tag] malfet/tag-2ef5611 -> malfet/tag-2ef5611 2025-08-14T21:21:50.4362256Z * [new tag] malfet/tag-317b1a0 -> malfet/tag-317b1a0 2025-08-14T21:21:50.4362379Z * [new tag] malfet/tag-ec6f767 -> malfet/tag-ec6f767 2025-08-14T21:21:50.4362499Z * [new tag] nightly-binary -> nightly-binary 2025-08-14T21:21:50.4362632Z * [new tag] sqzhang_flight4_plus -> sqzhang_flight4_plus 2025-08-14T21:21:50.4362745Z * [new tag] sqzhang_flight_3 -> sqzhang_flight_3 2025-08-14T21:21:50.4362995Z * [new tag] trunk/01584d2a7d029c9749eb73678cf1dc313cc35df6 -> trunk/01584d2a7d029c9749eb73678cf1dc313cc35df6 2025-08-14T21:21:50.4363265Z * [new tag] trunk/017259f9c65b6fad55fb9597d7077e2543eaae46 -> trunk/017259f9c65b6fad55fb9597d7077e2543eaae46 2025-08-14T21:21:50.4363884Z * [new tag] trunk/01bcf9a40dea937637d2cdd530bed2652510943d -> trunk/01bcf9a40dea937637d2cdd530bed2652510943d 2025-08-14T21:21:50.4364457Z * [new tag] trunk/01f66d08d93365015f4af005a252f439c4d4013a -> trunk/01f66d08d93365015f4af005a252f439c4d4013a 2025-08-14T21:21:50.4365564Z * [new tag] trunk/03b254e49f2d4c092e6ca712e5702cf2895aa47e -> trunk/03b254e49f2d4c092e6ca712e5702cf2895aa47e 2025-08-14T21:21:50.4365822Z * [new tag] trunk/05029ad1c30865d3f7e7fd13384db9d826e563eb -> trunk/05029ad1c30865d3f7e7fd13384db9d826e563eb 2025-08-14T21:21:50.4366285Z * [new tag] trunk/05c19d1acecc01b0d2512364183058a6885b9869 -> trunk/05c19d1acecc01b0d2512364183058a6885b9869 2025-08-14T21:21:50.4366918Z * [new tag] trunk/05c417715f791875fbf28cfc3fc86142de1a3206 -> trunk/05c417715f791875fbf28cfc3fc86142de1a3206 2025-08-14T21:21:50.4368181Z * [new tag] trunk/06824f3c7268bb807a422b663047cd0900ddd126 -> trunk/06824f3c7268bb807a422b663047cd0900ddd126 2025-08-14T21:21:50.4368451Z * [new tag] trunk/077cb389746a7d61cfc018aad2ba29a8aa195610 -> trunk/077cb389746a7d61cfc018aad2ba29a8aa195610 2025-08-14T21:21:50.4368822Z * [new tag] trunk/089c4a1ba007ed4abb3e5e0eafd97b7584566057 -> trunk/089c4a1ba007ed4abb3e5e0eafd97b7584566057 2025-08-14T21:21:50.4369450Z * [new tag] trunk/09381f5dacda7bbbfa361f5df76bde5cd309adc1 -> trunk/09381f5dacda7bbbfa361f5df76bde5cd309adc1 2025-08-14T21:21:50.4369812Z * [new tag] trunk/0bd3af4fb87445f4de3a1f9b823e399c8b3cefde -> trunk/0bd3af4fb87445f4de3a1f9b823e399c8b3cefde 2025-08-14T21:21:50.4370584Z * [new tag] trunk/0d3461bac0fb5177e35152d980b301ea3a0aa2c4 -> trunk/0d3461bac0fb5177e35152d980b301ea3a0aa2c4 2025-08-14T21:21:50.4371027Z * [new tag] trunk/0d40ff3b496e68193bc16d5391fa2e3623709f81 -> trunk/0d40ff3b496e68193bc16d5391fa2e3623709f81 2025-08-14T21:21:50.4371757Z * [new tag] trunk/0d71ca2c46753bb268bfdcf815c14415c122a289 -> trunk/0d71ca2c46753bb268bfdcf815c14415c122a289 2025-08-14T21:21:50.4372257Z * [new tag] trunk/0d88593dd826544c9e7bd4aa615ef86847a78d2b -> trunk/0d88593dd826544c9e7bd4aa615ef86847a78d2b 2025-08-14T21:21:50.4372879Z * [new tag] trunk/0e3e377bd5126cfcc69d70c4d77b352d3404cc11 -> trunk/0e3e377bd5126cfcc69d70c4d77b352d3404cc11 2025-08-14T21:21:50.4374291Z * [new tag] trunk/0f3b10b8eebe68e3c75d473d499b87dfe14a2eca -> trunk/0f3b10b8eebe68e3c75d473d499b87dfe14a2eca 2025-08-14T21:21:50.4374533Z * [new tag] 
trunk/101276f81b4d2a8c31bfd6796b986d4c1bfdf483 -> trunk/101276f81b4d2a8c31bfd6796b986d4c1bfdf483 2025-08-14T21:21:50.4374775Z * [new tag] trunk/1028c5e2d50e121865bf98307e7c035f549a24b2 -> trunk/1028c5e2d50e121865bf98307e7c035f549a24b2 2025-08-14T21:21:50.4375866Z * [new tag] trunk/10bc36fe840cb3510fab84d2ea22663b76702f1e -> trunk/10bc36fe840cb3510fab84d2ea22663b76702f1e 2025-08-14T21:21:50.4376106Z * [new tag] trunk/10e3514c962b58cbbee994257872a626ff76d51b -> trunk/10e3514c962b58cbbee994257872a626ff76d51b 2025-08-14T21:21:50.4376826Z * [new tag] trunk/1128f4c2a822cbe34a9d966306af15097179ffe1 -> trunk/1128f4c2a822cbe34a9d966306af15097179ffe1 2025-08-14T21:21:50.4378196Z * [new tag] trunk/114a6c40434bfb9cfa5abc30e9e34d81300d743e -> trunk/114a6c40434bfb9cfa5abc30e9e34d81300d743e 2025-08-14T21:21:50.4378557Z * [new tag] trunk/118bc97b14c24ac88a4b0c0750a9e7bf93154c76 -> trunk/118bc97b14c24ac88a4b0c0750a9e7bf93154c76 2025-08-14T21:21:50.4378804Z * [new tag] trunk/1196bb1c2e4d5a7edc09f2260e3034132f0c6c91 -> trunk/1196bb1c2e4d5a7edc09f2260e3034132f0c6c91 2025-08-14T21:21:50.4379068Z * [new tag] trunk/11a3565f1872bbad9c253a127e8d4ce7a1b40ec8 -> trunk/11a3565f1872bbad9c253a127e8d4ce7a1b40ec8 2025-08-14T21:21:50.4379490Z * [new tag] trunk/15e49f61643e4c0eef420f0981609709ef55b848 -> trunk/15e49f61643e4c0eef420f0981609709ef55b848 2025-08-14T21:21:50.4380387Z * [new tag] trunk/16d15445f8bd8740095b23de4af89d757af793ca -> trunk/16d15445f8bd8740095b23de4af89d757af793ca 2025-08-14T21:21:50.4380758Z * [new tag] trunk/178515d0ff6833c8e9221482b2a650ab31e00019 -> trunk/178515d0ff6833c8e9221482b2a650ab31e00019 2025-08-14T21:21:50.4381365Z * [new tag] trunk/182efe31dbe43376e7eef7338356aaf94d5bcabe -> trunk/182efe31dbe43376e7eef7338356aaf94d5bcabe 2025-08-14T21:21:50.4382890Z * [new tag] trunk/194fcfcfbdad0add1a1b695321e31a576058f4cf -> trunk/194fcfcfbdad0add1a1b695321e31a576058f4cf 2025-08-14T21:21:50.4383151Z * [new tag] trunk/195b5c2e27eb8f21cbc8ad1e90f42db5a8cfccca -> trunk/195b5c2e27eb8f21cbc8ad1e90f42db5a8cfccca 2025-08-14T21:21:50.4383406Z * [new tag] trunk/198b5fd2d47fa3d5110ceba6827a3b18e0064014 -> trunk/198b5fd2d47fa3d5110ceba6827a3b18e0064014 2025-08-14T21:21:50.4383727Z * [new tag] trunk/199e9abb6a366bbd27c39d1da7c3123b4eea9b5a -> trunk/199e9abb6a366bbd27c39d1da7c3123b4eea9b5a 2025-08-14T21:21:50.4384416Z * [new tag] trunk/19b4283884b2d9b3a0eb364da10b1540d14ab7a7 -> trunk/19b4283884b2d9b3a0eb364da10b1540d14ab7a7 2025-08-14T21:21:50.4386127Z * [new tag] trunk/1c2587119152cec3905647a47c65d3d26619c5a8 -> trunk/1c2587119152cec3905647a47c65d3d26619c5a8 2025-08-14T21:21:50.4386417Z * [new tag] trunk/1c26c53851c212a7c90a325549a72f0571613a8c -> trunk/1c26c53851c212a7c90a325549a72f0571613a8c 2025-08-14T21:21:50.4386709Z * [new tag] trunk/1c2cba17eab2b09d87142883da2bdbdbcf018613 -> trunk/1c2cba17eab2b09d87142883da2bdbdbcf018613 2025-08-14T21:21:50.4387227Z * [new tag] trunk/1d80d697a269234b47ec7ede192faf3bb9b159e3 -> trunk/1d80d697a269234b47ec7ede192faf3bb9b159e3 2025-08-14T21:21:50.4387839Z * [new tag] trunk/1ea688f9a2602fbcde32c0302b822526ca4219dc -> trunk/1ea688f9a2602fbcde32c0302b822526ca4219dc 2025-08-14T21:21:50.4388545Z * [new tag] trunk/1f4057c11ac941fb324386ca594d0a6882185aad -> trunk/1f4057c11ac941fb324386ca594d0a6882185aad 2025-08-14T21:21:50.4388982Z * [new tag] trunk/1fc683cf17c8c673044538d10266c00f92987be2 -> trunk/1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:21:50.4389795Z * [new tag] trunk/1febab2a89302464f6c7d69cfbef7a24c421ea65 -> trunk/1febab2a89302464f6c7d69cfbef7a24c421ea65 
2025-08-14T21:21:50.4391128Z * [new tag] trunk/206c1eef6571f906c2792d899a09136b3fce9673 -> trunk/206c1eef6571f906c2792d899a09136b3fce9673 2025-08-14T21:21:50.4391393Z * [new tag] trunk/20bdabbb3c5d6b118a94b2e045c777662563d5bb -> trunk/20bdabbb3c5d6b118a94b2e045c777662563d5bb 2025-08-14T21:21:50.4391622Z * [new tag] trunk/21392c0e06ac2b2621950455975ca6332f0bf641 -> trunk/21392c0e06ac2b2621950455975ca6332f0bf641 2025-08-14T21:21:50.4391880Z * [new tag] trunk/2247aa6d1d43e256255f5c74a781c3190a4387b6 -> trunk/2247aa6d1d43e256255f5c74a781c3190a4387b6 2025-08-14T21:21:50.4392249Z * [new tag] trunk/2259dbed4e0d3f2a8174b5847fd0741aed42451d -> trunk/2259dbed4e0d3f2a8174b5847fd0741aed42451d 2025-08-14T21:21:50.4392635Z * [new tag] trunk/231c72240d80091f099c95e326d3600cba866eee -> trunk/231c72240d80091f099c95e326d3600cba866eee 2025-08-14T21:21:50.4392979Z * [new tag] trunk/24257f5bfaa37795f74d9f64c1b43584128d4b8c -> trunk/24257f5bfaa37795f74d9f64c1b43584128d4b8c 2025-08-14T21:21:50.4396543Z * [new tag] trunk/24f43d0da7ad9c6e95a09a2fee610387728cc1cd -> trunk/24f43d0da7ad9c6e95a09a2fee610387728cc1cd 2025-08-14T21:21:50.4396821Z * [new tag] trunk/2898d3f965e5cd9d02fc2ecdab7c580fd457fea9 -> trunk/2898d3f965e5cd9d02fc2ecdab7c580fd457fea9 2025-08-14T21:21:50.4397130Z * [new tag] trunk/28ccc9e7247798980fe00a11bcd64a8016b5f227 -> trunk/28ccc9e7247798980fe00a11bcd64a8016b5f227 2025-08-14T21:21:50.4397373Z * [new tag] trunk/29712314dd5cf500a8ea3d1c69483a3cb768ca72 -> trunk/29712314dd5cf500a8ea3d1c69483a3cb768ca72 2025-08-14T21:21:50.4397618Z * [new tag] trunk/29d20d49f0b7f4e362e1cefdcdc4b5659969312c -> trunk/29d20d49f0b7f4e362e1cefdcdc4b5659969312c 2025-08-14T21:21:50.4398075Z * [new tag] trunk/2c5e10a5fceb208b11c3d569ae02e348b5893b31 -> trunk/2c5e10a5fceb208b11c3d569ae02e348b5893b31 2025-08-14T21:21:50.4398347Z * [new tag] trunk/2d0cdee394bccadcd0abe19dd4623ed978a331ad -> trunk/2d0cdee394bccadcd0abe19dd4623ed978a331ad 2025-08-14T21:21:50.4398611Z * [new tag] trunk/2e4e5ab4be9e0aeffd9c49b5b2f9f820bd0895b1 -> trunk/2e4e5ab4be9e0aeffd9c49b5b2f9f820bd0895b1 2025-08-14T21:21:50.4398873Z * [new tag] trunk/2ea40fba841b3af8103f332ba62e54f350ba9a51 -> trunk/2ea40fba841b3af8103f332ba62e54f350ba9a51 2025-08-14T21:21:50.4399496Z * [new tag] trunk/2ee22e435131369a7e4f8cc4732579acc29a941b -> trunk/2ee22e435131369a7e4f8cc4732579acc29a941b 2025-08-14T21:21:50.4400007Z * [new tag] trunk/2f4c2226175512af787725c4d5ad7313c60d4db1 -> trunk/2f4c2226175512af787725c4d5ad7313c60d4db1 2025-08-14T21:21:50.4400268Z * [new tag] trunk/3008d985a8fc155eb89374afff50cb33a6bd10d5 -> trunk/3008d985a8fc155eb89374afff50cb33a6bd10d5 2025-08-14T21:21:50.4401693Z * [new tag] trunk/3028fa6ce9d9c96671722ab8213a1a30670d7cf2 -> trunk/3028fa6ce9d9c96671722ab8213a1a30670d7cf2 2025-08-14T21:21:50.4401950Z * [new tag] trunk/303c614f3df95ae2b659c5f6c1838b14e4776ce6 -> trunk/303c614f3df95ae2b659c5f6c1838b14e4776ce6 2025-08-14T21:21:50.4402186Z * [new tag] trunk/305fa2239365ad17ac9c534a68bba8a149c42d67 -> trunk/305fa2239365ad17ac9c534a68bba8a149c42d67 2025-08-14T21:21:50.4406120Z * [new tag] trunk/31c9ac4319c0cc2ed8c6be701c6ccf73f6cb4706 -> trunk/31c9ac4319c0cc2ed8c6be701c6ccf73f6cb4706 2025-08-14T21:21:50.4406430Z * [new tag] trunk/32099961d588fc19ead8afe805d6b5108de75669 -> trunk/32099961d588fc19ead8afe805d6b5108de75669 2025-08-14T21:21:50.4406682Z * [new tag] trunk/32e5e2f596d55bb9441d5d53f3c58bcb55828047 -> trunk/32e5e2f596d55bb9441d5d53f3c58bcb55828047 2025-08-14T21:21:50.4406936Z * [new tag] trunk/334b38ccc4427b1d14981c48a3a0b92180d58225 -> 
trunk/334b38ccc4427b1d14981c48a3a0b92180d58225 2025-08-14T21:21:50.4407191Z * [new tag] trunk/334ecbd4ffe11858cae7d23d1190ddb4777c2513 -> trunk/334ecbd4ffe11858cae7d23d1190ddb4777c2513 2025-08-14T21:21:50.4407467Z * [new tag] trunk/33d94018668951611b318b7515ae96f04e48eac0 -> trunk/33d94018668951611b318b7515ae96f04e48eac0 2025-08-14T21:21:50.4407716Z * [new tag] trunk/34358f335d95213d96b6cca6a83e7bf3af6a9fcb -> trunk/34358f335d95213d96b6cca6a83e7bf3af6a9fcb 2025-08-14T21:21:50.4408001Z * [new tag] trunk/34ec5ed275f8aa875c80daa97b3e82af0b06f673 -> trunk/34ec5ed275f8aa875c80daa97b3e82af0b06f673 2025-08-14T21:21:50.4408280Z * [new tag] trunk/355462e1278d818deb9ef4a184073d5b66074816 -> trunk/355462e1278d818deb9ef4a184073d5b66074816 2025-08-14T21:21:50.4418307Z * [new tag] trunk/3626ba711b34397d1fbf0a9b1979f85cbf68b919 -> trunk/3626ba711b34397d1fbf0a9b1979f85cbf68b919 2025-08-14T21:21:50.4418756Z * [new tag] trunk/36f46d082a4954921cb8493223f000f2aab79ed7 -> trunk/36f46d082a4954921cb8493223f000f2aab79ed7 2025-08-14T21:21:50.4419142Z * [new tag] trunk/39aa3d1471549b7829c207d634dfdc1d26e346a2 -> trunk/39aa3d1471549b7829c207d634dfdc1d26e346a2 2025-08-14T21:21:50.4419523Z * [new tag] trunk/3a562374401113187ce2566b87e3f1d87d7c53aa -> trunk/3a562374401113187ce2566b87e3f1d87d7c53aa 2025-08-14T21:21:50.4419901Z * [new tag] trunk/3ac86e728dfaa7383ff7f865e9e7d33486188dae -> trunk/3ac86e728dfaa7383ff7f865e9e7d33486188dae 2025-08-14T21:21:50.4420318Z * [new tag] trunk/3be70dc30e893b552fc0f23ca06cd8f7949b6d08 -> trunk/3be70dc30e893b552fc0f23ca06cd8f7949b6d08 2025-08-14T21:21:50.4420696Z * [new tag] trunk/3cec82a7e9aea040a34dd7a2587ae6d3bd65dba0 -> trunk/3cec82a7e9aea040a34dd7a2587ae6d3bd65dba0 2025-08-14T21:21:50.4421092Z * [new tag] trunk/3cf7b4024ef83e44e9ae223dbff7c7ab68240cb2 -> trunk/3cf7b4024ef83e44e9ae223dbff7c7ab68240cb2 2025-08-14T21:21:50.4421461Z * [new tag] trunk/3ef2e1ef769582a82c6ddf150e9d11bf4bf1c44f -> trunk/3ef2e1ef769582a82c6ddf150e9d11bf4bf1c44f 2025-08-14T21:21:50.4421904Z * [new tag] trunk/3f1636ebef9b45e8a3cb0eb20d327ee6acb74be0 -> trunk/3f1636ebef9b45e8a3cb0eb20d327ee6acb74be0 2025-08-14T21:21:50.4422287Z * [new tag] trunk/3faee0a6318afcbbbb48687009a459214910d820 -> trunk/3faee0a6318afcbbbb48687009a459214910d820 2025-08-14T21:21:50.4422864Z * [new tag] trunk/3fcd79e023da7156ac584992ebab29205d3b7881 -> trunk/3fcd79e023da7156ac584992ebab29205d3b7881 2025-08-14T21:21:50.4423136Z * [new tag] trunk/3fe19a7a0af3f4d692af30476c320be18c7e8ae6 -> trunk/3fe19a7a0af3f4d692af30476c320be18c7e8ae6 2025-08-14T21:21:50.4423411Z * [new tag] trunk/41673110cd7c5960824cc74a6fcaeda1a8bc7a23 -> trunk/41673110cd7c5960824cc74a6fcaeda1a8bc7a23 2025-08-14T21:21:50.4423664Z * [new tag] trunk/4183d4ff3dcc1d87400326a9a7998c3f9e966f60 -> trunk/4183d4ff3dcc1d87400326a9a7998c3f9e966f60 2025-08-14T21:21:50.4423907Z * [new tag] trunk/422bd6808bb98cbbac31d157d9c82ad11ba9732d -> trunk/422bd6808bb98cbbac31d157d9c82ad11ba9732d 2025-08-14T21:21:50.4424152Z * [new tag] trunk/42e51cd4b3973a053fcfa80878a3f346fd158e9f -> trunk/42e51cd4b3973a053fcfa80878a3f346fd158e9f 2025-08-14T21:21:50.4424394Z * [new tag] trunk/4416433c7c625127b7f975c92f8ec98ea4c67fd3 -> trunk/4416433c7c625127b7f975c92f8ec98ea4c67fd3 2025-08-14T21:21:50.4424703Z * [new tag] trunk/45ba7ecda876685b083cbbe932450560c566826b -> trunk/45ba7ecda876685b083cbbe932450560c566826b 2025-08-14T21:21:50.4424955Z * [new tag] trunk/47a1db823dfcdacdb99f317428fc3791a18c5812 -> trunk/47a1db823dfcdacdb99f317428fc3791a18c5812 2025-08-14T21:21:50.4425211Z * [new tag] 
trunk/4a773e1e867f28a8ff0b15203e5cd9548f74fcee -> trunk/4a773e1e867f28a8ff0b15203e5cd9548f74fcee 2025-08-14T21:21:50.4425459Z * [new tag] trunk/4a90dc0c1f68d1f98832b169f792ed1bb195a0f3 -> trunk/4a90dc0c1f68d1f98832b169f792ed1bb195a0f3 2025-08-14T21:21:50.4425714Z * [new tag] trunk/4cde0acc0e4e795e1a12cbdd9b93c8c04c1fa05d -> trunk/4cde0acc0e4e795e1a12cbdd9b93c8c04c1fa05d 2025-08-14T21:21:50.4425951Z * [new tag] trunk/4d419a74610c32b1372f8802dcc61893740a23cf -> trunk/4d419a74610c32b1372f8802dcc61893740a23cf 2025-08-14T21:21:50.4431731Z * [new tag] trunk/4d5b3f2d5af7c8e4f41da4ffca53fafe8bb86235 -> trunk/4d5b3f2d5af7c8e4f41da4ffca53fafe8bb86235 2025-08-14T21:21:50.4432041Z * [new tag] trunk/4e2ddb5db67617f9f5309c8bba0c17adc84cadbc -> trunk/4e2ddb5db67617f9f5309c8bba0c17adc84cadbc 2025-08-14T21:21:50.4432286Z * [new tag] trunk/50a8c118754a6c5a46968f5c8e215ccba6831d42 -> trunk/50a8c118754a6c5a46968f5c8e215ccba6831d42 2025-08-14T21:21:50.4432525Z * [new tag] trunk/50f23ff6f883db5021dd6bab4c146434f98dd15d -> trunk/50f23ff6f883db5021dd6bab4c146434f98dd15d 2025-08-14T21:21:50.4432768Z * [new tag] trunk/515cb70367e84fcbad23fcc5b39eb1d7706df2aa -> trunk/515cb70367e84fcbad23fcc5b39eb1d7706df2aa 2025-08-14T21:21:50.4432991Z * [new tag] trunk/53e39494958b7e2278cc8176f63636e812e8945f -> trunk/53e39494958b7e2278cc8176f63636e812e8945f 2025-08-14T21:21:50.4433236Z * [new tag] trunk/556e2a73f4f0643f7c2aeb5c7dddda43388a40ce -> trunk/556e2a73f4f0643f7c2aeb5c7dddda43388a40ce 2025-08-14T21:21:50.4433482Z * [new tag] trunk/5665dc9ab76b84d7c90d845ffb0f6349b3621919 -> trunk/5665dc9ab76b84d7c90d845ffb0f6349b3621919 2025-08-14T21:21:50.4433715Z * [new tag] trunk/566c6d52ef1411c8262d7b9cf85e2044fdfbe1a3 -> trunk/566c6d52ef1411c8262d7b9cf85e2044fdfbe1a3 2025-08-14T21:21:50.4433961Z * [new tag] trunk/56c828bef93eada0e18d2cc013207831ca80cc99 -> trunk/56c828bef93eada0e18d2cc013207831ca80cc99 2025-08-14T21:21:50.4434192Z * [new tag] trunk/5737372862253a0ac0292407a5844796f02380ad -> trunk/5737372862253a0ac0292407a5844796f02380ad 2025-08-14T21:21:50.4434435Z * [new tag] trunk/57f738b6357cc8fcdde479a0948e723809a1a44d -> trunk/57f738b6357cc8fcdde479a0948e723809a1a44d 2025-08-14T21:21:50.4434666Z * [new tag] trunk/5a40c5784482255b9baf14086cc4b9349fc6d512 -> trunk/5a40c5784482255b9baf14086cc4b9349fc6d512 2025-08-14T21:21:50.4434915Z * [new tag] trunk/5a9c4cfce42b9eb87da0de40c5633f083115c307 -> trunk/5a9c4cfce42b9eb87da0de40c5633f083115c307 2025-08-14T21:21:50.4435317Z * [new tag] trunk/5ace061254af71aa83d1baae81aa1864c9746add -> trunk/5ace061254af71aa83d1baae81aa1864c9746add 2025-08-14T21:21:50.4440018Z * [new tag] trunk/5dddcd5b07c6644efca8d613f4eca1dc95daa87f -> trunk/5dddcd5b07c6644efca8d613f4eca1dc95daa87f 2025-08-14T21:21:50.4440301Z * [new tag] trunk/5ed4f9177907fe403ec4c4499d0d0e9be6b68fcf -> trunk/5ed4f9177907fe403ec4c4499d0d0e9be6b68fcf 2025-08-14T21:21:50.4440546Z * [new tag] trunk/5f1010fbb3850d99c8fdf9a9de2f79260cdc586a -> trunk/5f1010fbb3850d99c8fdf9a9de2f79260cdc586a 2025-08-14T21:21:50.4440790Z * [new tag] trunk/5f5f508aa836a46dfe88857fb223049616b94e93 -> trunk/5f5f508aa836a46dfe88857fb223049616b94e93 2025-08-14T21:21:50.4441031Z * [new tag] trunk/62bac0798100e0e06a86b7a4cee1788413e3d0ca -> trunk/62bac0798100e0e06a86b7a4cee1788413e3d0ca 2025-08-14T21:21:50.4441418Z * [new tag] trunk/63654ba4c5178fd12220cfc9d1c878af2fdd07cc -> trunk/63654ba4c5178fd12220cfc9d1c878af2fdd07cc 2025-08-14T21:21:50.4441660Z * [new tag] trunk/639778b3ee3b80e0894367fdc4442b58ae1b3a62 -> trunk/639778b3ee3b80e0894367fdc4442b58ae1b3a62 
2025-08-14T21:21:50.4441901Z * [new tag] trunk/641ee7478150f26969968f49d8b358e199679a8a -> trunk/641ee7478150f26969968f49d8b358e199679a8a 2025-08-14T21:21:50.4442165Z * [new tag] trunk/65053c03a3d209060cb239d20a229dac37cf9dd1 -> trunk/65053c03a3d209060cb239d20a229dac37cf9dd1 2025-08-14T21:21:50.4442404Z * [new tag] trunk/652a6f5954d039d61dc6e6575ccf89d385d74537 -> trunk/652a6f5954d039d61dc6e6575ccf89d385d74537 2025-08-14T21:21:50.4442658Z * [new tag] trunk/685f15dbea66e8ffa8564752f81ad2f6cb447a14 -> trunk/685f15dbea66e8ffa8564752f81ad2f6cb447a14 2025-08-14T21:21:50.4442904Z * [new tag] trunk/68a4b4b2e336cfd4451ce6546d900568e5ddf96c -> trunk/68a4b4b2e336cfd4451ce6546d900568e5ddf96c 2025-08-14T21:21:50.4443159Z * [new tag] trunk/69a0a9aa7f5e320a02e97fa789d2f72baff1554f -> trunk/69a0a9aa7f5e320a02e97fa789d2f72baff1554f 2025-08-14T21:21:50.4443402Z * [new tag] trunk/6be6d06295c870c77a6eb69f96b3170d983520d5 -> trunk/6be6d06295c870c77a6eb69f96b3170d983520d5 2025-08-14T21:21:50.4443650Z * [new tag] trunk/6c05ea6475beaf3acc05e1bda0f3f8fe3bdc1d49 -> trunk/6c05ea6475beaf3acc05e1bda0f3f8fe3bdc1d49 2025-08-14T21:21:50.4443890Z * [new tag] trunk/6da11d9aafc0d84dc7f66030c181608ff2614f66 -> trunk/6da11d9aafc0d84dc7f66030c181608ff2614f66 2025-08-14T21:21:50.4444135Z * [new tag] trunk/6e8865fbc161270e2ffc52817e6c667df417a3f7 -> trunk/6e8865fbc161270e2ffc52817e6c667df417a3f7 2025-08-14T21:21:50.4444375Z * [new tag] trunk/6ea8376f84232048d6be0f7b2edf82aec1b61d58 -> trunk/6ea8376f84232048d6be0f7b2edf82aec1b61d58 2025-08-14T21:21:50.4444609Z * [new tag] trunk/6ee175195ac7853734d64704171993cc6265eb38 -> trunk/6ee175195ac7853734d64704171993cc6265eb38 2025-08-14T21:21:50.4444861Z * [new tag] trunk/6f0f4e0c3eacd479864319127915f869f64e1935 -> trunk/6f0f4e0c3eacd479864319127915f869f64e1935 2025-08-14T21:21:50.4445109Z * [new tag] trunk/70ccdec44b89e355a2cb03ba14a634284f7750f8 -> trunk/70ccdec44b89e355a2cb03ba14a634284f7750f8 2025-08-14T21:21:50.4445849Z * [new tag] trunk/72009ec6bebca7714f99c18449183787f202af4d -> trunk/72009ec6bebca7714f99c18449183787f202af4d 2025-08-14T21:21:50.4446130Z * [new tag] trunk/731ee31f7b6ba19307daab323f6196172b71aaf8 -> trunk/731ee31f7b6ba19307daab323f6196172b71aaf8 2025-08-14T21:21:50.4446392Z * [new tag] trunk/76a0609b6bddb2bc40f1eb4ade12885023653d59 -> trunk/76a0609b6bddb2bc40f1eb4ade12885023653d59 2025-08-14T21:21:50.4446636Z * [new tag] trunk/781e9a7724c47496e3d38a81e6dd6194cf098c41 -> trunk/781e9a7724c47496e3d38a81e6dd6194cf098c41 2025-08-14T21:21:50.4447079Z * [new tag] trunk/78a2fe1d42edeaa2ef7020b0fa0ac82ee4a640e4 -> trunk/78a2fe1d42edeaa2ef7020b0fa0ac82ee4a640e4 2025-08-14T21:21:50.4447361Z * [new tag] trunk/7a974a88f2c529a614baeabe4debd00fc8a3b299 -> trunk/7a974a88f2c529a614baeabe4debd00fc8a3b299 2025-08-14T21:21:50.4447702Z * [new tag] trunk/7ae0629d64b404e0ef5d9c931433ad25e65d6114 -> trunk/7ae0629d64b404e0ef5d9c931433ad25e65d6114 2025-08-14T21:21:50.4450457Z * [new tag] trunk/7d2ec704e47f4b740cdecda5534b305e8e1875ef -> trunk/7d2ec704e47f4b740cdecda5534b305e8e1875ef 2025-08-14T21:21:50.4450758Z * [new tag] trunk/7d87e358ac8440f666fabbfd99058bb5342be6ac -> trunk/7d87e358ac8440f666fabbfd99058bb5342be6ac 2025-08-14T21:21:50.4451139Z * [new tag] trunk/7e27347fd353928c99620495c8c531a5eba7d56b -> trunk/7e27347fd353928c99620495c8c531a5eba7d56b 2025-08-14T21:21:50.4458227Z * [new tag] trunk/7e91394955721c77645fcdb75a5d47a255d65020 -> trunk/7e91394955721c77645fcdb75a5d47a255d65020 2025-08-14T21:21:50.4458829Z * [new tag] trunk/7f4cb4a3e018a621add2a37a3a2f67b982d51001 -> 
trunk/7f4cb4a3e018a621add2a37a3a2f67b982d51001 2025-08-14T21:21:50.4459209Z * [new tag] trunk/7fbc22855c17741ae016992803b2e147a13aa22d -> trunk/7fbc22855c17741ae016992803b2e147a13aa22d 2025-08-14T21:21:50.4459601Z * [new tag] trunk/8047421fbb607d70ede13b9cd5a60b7b8bdfe348 -> trunk/8047421fbb607d70ede13b9cd5a60b7b8bdfe348 2025-08-14T21:21:50.4460016Z * [new tag] trunk/8088cfa592504a2897b4c78f8a46fe658ab5c2c2 -> trunk/8088cfa592504a2897b4c78f8a46fe658ab5c2c2 2025-08-14T21:21:50.4460377Z * [new tag] trunk/80cca8307943ba64168208b54028f55b2c71daff -> trunk/80cca8307943ba64168208b54028f55b2c71daff 2025-08-14T21:21:50.4461061Z * [new tag] trunk/8147370733bbdcd034cad54e9212e51885a11892 -> trunk/8147370733bbdcd034cad54e9212e51885a11892 2025-08-14T21:21:50.4461421Z * [new tag] trunk/83875cdb5594ccb3c9206b8eb5745fe1d011cf26 -> trunk/83875cdb5594ccb3c9206b8eb5745fe1d011cf26 2025-08-14T21:21:50.4461675Z * [new tag] trunk/8399cf88ce8399d2be93355f29d4cb69f51c0654 -> trunk/8399cf88ce8399d2be93355f29d4cb69f51c0654 2025-08-14T21:21:50.4461932Z * [new tag] trunk/842cc77ab9aafd518593c2fce077d6abb42a5b7f -> trunk/842cc77ab9aafd518593c2fce077d6abb42a5b7f 2025-08-14T21:21:50.4462169Z * [new tag] trunk/85db508af533649d0b3447ff3f0d5fe083150c84 -> trunk/85db508af533649d0b3447ff3f0d5fe083150c84 2025-08-14T21:21:50.4462410Z * [new tag] trunk/86eb65f7f06016bcd5d7951dc9d74bc3993a827a -> trunk/86eb65f7f06016bcd5d7951dc9d74bc3993a827a 2025-08-14T21:21:50.4462658Z * [new tag] trunk/87e6c4079d8ec7d04aff00ed82096b39836a8367 -> trunk/87e6c4079d8ec7d04aff00ed82096b39836a8367 2025-08-14T21:21:50.4462902Z * [new tag] trunk/89654db1abccf7e5f261989a150db4d1619ea2aa -> trunk/89654db1abccf7e5f261989a150db4d1619ea2aa 2025-08-14T21:21:50.4463165Z * [new tag] trunk/8a37f0c90392a2c38b7c5955471fa49edcaf5cb1 -> trunk/8a37f0c90392a2c38b7c5955471fa49edcaf5cb1 2025-08-14T21:21:50.4463402Z * [new tag] trunk/8ab5868a2199fe485c2d66533b9244ccb97e487d -> trunk/8ab5868a2199fe485c2d66533b9244ccb97e487d 2025-08-14T21:21:50.4463635Z * [new tag] trunk/8ae4d2652f64b8444b3d5314b9232bd2119bcde6 -> trunk/8ae4d2652f64b8444b3d5314b9232bd2119bcde6 2025-08-14T21:21:50.4463888Z * [new tag] trunk/8c41cb800ae0411f02ea5da34bd5ccc3790633b0 -> trunk/8c41cb800ae0411f02ea5da34bd5ccc3790633b0 2025-08-14T21:21:50.4464345Z * [new tag] trunk/8cb91e20bc205b1416648d0ffd98d1ba1f3a6fc4 -> trunk/8cb91e20bc205b1416648d0ffd98d1ba1f3a6fc4 2025-08-14T21:21:50.4464668Z * [new tag] trunk/8cfaf51d4e29c9bd9f49ecc98d955ed53df1a13d -> trunk/8cfaf51d4e29c9bd9f49ecc98d955ed53df1a13d 2025-08-14T21:21:50.4464924Z * [new tag] trunk/8d1cf529229dce7cd5ea04abb0faac83b87ca6d1 -> trunk/8d1cf529229dce7cd5ea04abb0faac83b87ca6d1 2025-08-14T21:21:50.4465230Z * [new tag] trunk/8d3d1c844303cb1d46123a1caa76d4cf83973347 -> trunk/8d3d1c844303cb1d46123a1caa76d4cf83973347 2025-08-14T21:21:50.4465471Z * [new tag] trunk/8d6d3246316e1767a57d5e855acd6208da753b75 -> trunk/8d6d3246316e1767a57d5e855acd6208da753b75 2025-08-14T21:21:50.4465691Z * [new tag] trunk/8e6a3138581152ab827a0997f34c470271399f5e -> trunk/8e6a3138581152ab827a0997f34c470271399f5e 2025-08-14T21:21:50.4465942Z * [new tag] trunk/8eee08d2279b98af2522debb6512d37e837e89e3 -> trunk/8eee08d2279b98af2522debb6512d37e837e89e3 2025-08-14T21:21:50.4466181Z * [new tag] trunk/90b78ee50f73b5c963996076a3d54b74b1b965be -> trunk/90b78ee50f73b5c963996076a3d54b74b1b965be 2025-08-14T21:21:50.4469369Z * [new tag] trunk/94b91a876327820a4bb6f5d39d156f13f2553ab6 -> trunk/94b91a876327820a4bb6f5d39d156f13f2553ab6 2025-08-14T21:21:50.4469846Z * [new tag] 
trunk/95210cc409dd578988c7116b47725c304dea54c7 -> trunk/95210cc409dd578988c7116b47725c304dea54c7 2025-08-14T21:21:50.4470107Z * [new tag] trunk/96bd33b2de79598566df395f32e27c4d33673f05 -> trunk/96bd33b2de79598566df395f32e27c4d33673f05 2025-08-14T21:21:50.4470354Z * [new tag] trunk/9708fcf92db88b80b9010c68662d634434da3106 -> trunk/9708fcf92db88b80b9010c68662d634434da3106 2025-08-14T21:21:50.4470615Z * [new tag] trunk/97c8c98f8dcb9c5c188b691d156e0043dba6c7f8 -> trunk/97c8c98f8dcb9c5c188b691d156e0043dba6c7f8 2025-08-14T21:21:50.4470879Z * [new tag] trunk/9903ca4f70bdc1653016256f5b4fd74fdfc609f8 -> trunk/9903ca4f70bdc1653016256f5b4fd74fdfc609f8 2025-08-14T21:21:50.4471120Z * [new tag] trunk/99bc2f94c1955657e950ebdad5f77e518785ccbd -> trunk/99bc2f94c1955657e950ebdad5f77e518785ccbd 2025-08-14T21:21:50.4471370Z * [new tag] trunk/9a06e6d0310da9d8a59ae05e8ec9c0201b55cacd -> trunk/9a06e6d0310da9d8a59ae05e8ec9c0201b55cacd 2025-08-14T21:21:50.4471624Z * [new tag] trunk/9a0f7a3bb01b235ea04581ee540970a098071b72 -> trunk/9a0f7a3bb01b235ea04581ee540970a098071b72 2025-08-14T21:21:50.4471877Z * [new tag] trunk/9b803cdbe298009f08340c1aaccb25aafbca95d8 -> trunk/9b803cdbe298009f08340c1aaccb25aafbca95d8 2025-08-14T21:21:50.4474325Z * [new tag] trunk/9ccd0f5e31ea54fcf42101dfbaacc103494e34df -> trunk/9ccd0f5e31ea54fcf42101dfbaacc103494e34df 2025-08-14T21:21:50.4474701Z * [new tag] trunk/9d37c960a4fc44d5ac334ca8bf775f85b95d76fc -> trunk/9d37c960a4fc44d5ac334ca8bf775f85b95d76fc 2025-08-14T21:21:50.4474993Z * [new tag] trunk/9e07673deb212c87b1c6fea23799a97474c476ed -> trunk/9e07673deb212c87b1c6fea23799a97474c476ed 2025-08-14T21:21:50.4475316Z * [new tag] trunk/9eedd2a20b64302d0d116ea2802b50948d2ebb09 -> trunk/9eedd2a20b64302d0d116ea2802b50948d2ebb09 2025-08-14T21:21:50.4475573Z * [new tag] trunk/9fa8ce26cf638504469852cbc3e7d04579fc8674 -> trunk/9fa8ce26cf638504469852cbc3e7d04579fc8674 2025-08-14T21:21:50.4475813Z * [new tag] trunk/a06ec54d40013c97fbffc174ea8f524ea5a95715 -> trunk/a06ec54d40013c97fbffc174ea8f524ea5a95715 2025-08-14T21:21:50.4476068Z * [new tag] trunk/a288b15ea9f87ddd665f249d492e0fb0861f5a69 -> trunk/a288b15ea9f87ddd665f249d492e0fb0861f5a69 2025-08-14T21:21:50.4476324Z * [new tag] trunk/a2fd106d670bb4990cebfd00f25ecbae4145e76c -> trunk/a2fd106d670bb4990cebfd00f25ecbae4145e76c 2025-08-14T21:21:50.4479101Z * [new tag] trunk/a354fa91e26b376d96385a2206c5ff5b42aa4600 -> trunk/a354fa91e26b376d96385a2206c5ff5b42aa4600 2025-08-14T21:21:50.4479473Z * [new tag] trunk/a4f69a5da08eace1c1e6469dec6a18aa842da73b -> trunk/a4f69a5da08eace1c1e6469dec6a18aa842da73b 2025-08-14T21:21:50.4479831Z * [new tag] trunk/a53d14d5f846ac44f6c205abb1c5bc4d2f3126ae -> trunk/a53d14d5f846ac44f6c205abb1c5bc4d2f3126ae 2025-08-14T21:21:50.4480259Z * [new tag] trunk/a5652407e4f3d772fc44486ac2abf756decf0861 -> trunk/a5652407e4f3d772fc44486ac2abf756decf0861 2025-08-14T21:21:50.4480686Z * [new tag] trunk/a7abf57aabec0ce686092e2d66e53ba185dbc56b -> trunk/a7abf57aabec0ce686092e2d66e53ba185dbc56b 2025-08-14T21:21:50.4481552Z * [new tag] trunk/a84b60c0c4016785fd93b7b8a0c04f2d0770d332 -> trunk/a84b60c0c4016785fd93b7b8a0c04f2d0770d332 2025-08-14T21:21:50.4481844Z * [new tag] trunk/aa75e917bdb0f95bb6dee81853c2d3c4ab3e1883 -> trunk/aa75e917bdb0f95bb6dee81853c2d3c4ab3e1883 2025-08-14T21:21:50.4482096Z * [new tag] trunk/adcca7d9a1c053495e99012de801b2ea237faad0 -> trunk/adcca7d9a1c053495e99012de801b2ea237faad0 2025-08-14T21:21:50.4482348Z * [new tag] trunk/af10f1f86cc4effc93142a447693d8be55966615 -> trunk/af10f1f86cc4effc93142a447693d8be55966615 
2025-08-14T21:21:50.4482629Z * [new tag] trunk/af3cabc55d5699f4da528e1ca39d83338f84ae8c -> trunk/af3cabc55d5699f4da528e1ca39d83338f84ae8c 2025-08-14T21:21:50.4483031Z * [new tag] trunk/b0df7715e8c590c0001d1f9cdb97057be80c9107 -> trunk/b0df7715e8c590c0001d1f9cdb97057be80c9107 2025-08-14T21:21:50.4483282Z * [new tag] trunk/b149c7204c218e7c4d6594a89dd74f72bd480ec5 -> trunk/b149c7204c218e7c4d6594a89dd74f72bd480ec5 2025-08-14T21:21:50.4483506Z * [new tag] trunk/b1a602762e6a6674b406a3137e7e7a678885a97b -> trunk/b1a602762e6a6674b406a3137e7e7a678885a97b 2025-08-14T21:21:50.4483756Z * [new tag] trunk/b1f43548cad8fc0e30bda250f6e196310fa7a4bc -> trunk/b1f43548cad8fc0e30bda250f6e196310fa7a4bc 2025-08-14T21:21:50.4483994Z * [new tag] trunk/b219ca2a00a305753c4f1ea4c9c5d23243d54753 -> trunk/b219ca2a00a305753c4f1ea4c9c5d23243d54753 2025-08-14T21:21:50.4484233Z * [new tag] trunk/b4596895b9d85a686c2cb978938b0a7797b3690a -> trunk/b4596895b9d85a686c2cb978938b0a7797b3690a 2025-08-14T21:21:50.4484495Z * [new tag] trunk/b5fd7223b1bf44720dc9183bda7dfcf7aeccff02 -> trunk/b5fd7223b1bf44720dc9183bda7dfcf7aeccff02 2025-08-14T21:21:50.4484742Z * [new tag] trunk/b602ea9cab7d43a7ee7b4051227090f23fbd3dbf -> trunk/b602ea9cab7d43a7ee7b4051227090f23fbd3dbf 2025-08-14T21:21:50.4485021Z * [new tag] trunk/b6b74aed604bd2e96389ff99aaaf39abc64fdc64 -> trunk/b6b74aed604bd2e96389ff99aaaf39abc64fdc64 2025-08-14T21:21:50.4485857Z * [new tag] trunk/b7db86600a2614adc71c92ca42d359a7ac534d78 -> trunk/b7db86600a2614adc71c92ca42d359a7ac534d78 2025-08-14T21:21:50.4486606Z * [new tag] trunk/b9003ed3d87699e81e436719625a21996a6654e5 -> trunk/b9003ed3d87699e81e436719625a21996a6654e5 2025-08-14T21:21:50.4486875Z * [new tag] trunk/b90feeac86bda00afc2789321bcd706015ff44e3 -> trunk/b90feeac86bda00afc2789321bcd706015ff44e3 2025-08-14T21:21:50.4488401Z * [new tag] trunk/b9d7de3a094598c3dc0dd52e57bce30eb684c9d8 -> trunk/b9d7de3a094598c3dc0dd52e57bce30eb684c9d8 2025-08-14T21:21:50.4488741Z * [new tag] trunk/ba47821f524eee50a214ed39fa2e7765d54aabf4 -> trunk/ba47821f524eee50a214ed39fa2e7765d54aabf4 2025-08-14T21:21:50.4489641Z * [new tag] trunk/ba4ccf5d67e3d237f435eacc2bce3c6025f08491 -> trunk/ba4ccf5d67e3d237f435eacc2bce3c6025f08491 2025-08-14T21:21:50.4490321Z * [new tag] trunk/bcf23ecc476df2bd7479f142567213e2623308ee -> trunk/bcf23ecc476df2bd7479f142567213e2623308ee 2025-08-14T21:21:50.4490906Z * [new tag] trunk/be53f609aaf6f01e2863f490975ea9eaac3ee9ff -> trunk/be53f609aaf6f01e2863f490975ea9eaac3ee9ff 2025-08-14T21:21:50.4491515Z * [new tag] trunk/beb4d7816dedc67a5de1f82e5a45b5910f407941 -> trunk/beb4d7816dedc67a5de1f82e5a45b5910f407941 2025-08-14T21:21:50.4492564Z * [new tag] trunk/bfc873d02ec413344717493e4175a902921359fd -> trunk/bfc873d02ec413344717493e4175a902921359fd 2025-08-14T21:21:50.4492953Z * [new tag] trunk/c184cb3852f0ff2d16a489d61abc3739c309e6ca -> trunk/c184cb3852f0ff2d16a489d61abc3739c309e6ca 2025-08-14T21:21:50.4493841Z * [new tag] trunk/c24ca7f4bf79f62fd623d76346ca27e53f731431 -> trunk/c24ca7f4bf79f62fd623d76346ca27e53f731431 2025-08-14T21:21:50.4494095Z * [new tag] trunk/c3dc8dc4122977893004c49d10e4676cd0a97da4 -> trunk/c3dc8dc4122977893004c49d10e4676cd0a97da4 2025-08-14T21:21:50.4494814Z * [new tag] trunk/c5ec5458a547f7a774468ea0eb2258d3de596492 -> trunk/c5ec5458a547f7a774468ea0eb2258d3de596492 2025-08-14T21:21:50.4495384Z * [new tag] trunk/c5efc5c8a66eca84865015058b3221013ebfe685 -> trunk/c5efc5c8a66eca84865015058b3221013ebfe685 2025-08-14T21:21:50.4496045Z * [new tag] trunk/c6563341208003f64c131854a9cf029555f786d2 -> 
trunk/c6563341208003f64c131854a9cf029555f786d2 2025-08-14T21:21:50.4496622Z * [new tag] trunk/c6d78d4dbda53837d298d23a5fbc09af90a42d9e -> trunk/c6d78d4dbda53837d298d23a5fbc09af90a42d9e 2025-08-14T21:21:50.4497166Z * [new tag] trunk/c8205cb35435f39d2c26f6c94b45e4adeb6dcb23 -> trunk/c8205cb35435f39d2c26f6c94b45e4adeb6dcb23 2025-08-14T21:21:50.4498435Z * [new tag] trunk/c859ba7114b1fcb49527e090745fa17091d1f8d5 -> trunk/c859ba7114b1fcb49527e090745fa17091d1f8d5 2025-08-14T21:21:50.4498809Z * [new tag] trunk/c86040a8e68f754b90a84099187d3624954c7f36 -> trunk/c86040a8e68f754b90a84099187d3624954c7f36 2025-08-14T21:21:50.4500064Z * [new tag] trunk/c9671dc865aa0fc1cb86df754e355b44d8e02bb4 -> trunk/c9671dc865aa0fc1cb86df754e355b44d8e02bb4 2025-08-14T21:21:50.4500781Z * [new tag] trunk/ca7315c17162ea21b1ca5ba23f4bf6168766c7b9 -> trunk/ca7315c17162ea21b1ca5ba23f4bf6168766c7b9 2025-08-14T21:21:50.4501212Z * [new tag] trunk/cae2b5e3d223829bdc553fc8601df4b1c1554cff -> trunk/cae2b5e3d223829bdc553fc8601df4b1c1554cff 2025-08-14T21:21:50.4501633Z * [new tag] trunk/cbffde774557752cf20447d42d99ec6102673c31 -> trunk/cbffde774557752cf20447d42d99ec6102673c31 2025-08-14T21:21:50.4502325Z * [new tag] trunk/cd8d8c18f5bafdc1c73d5ac0129e7b4d76ab45bc -> trunk/cd8d8c18f5bafdc1c73d5ac0129e7b4d76ab45bc 2025-08-14T21:21:50.4506098Z * [new tag] trunk/cf0a0dcb0afa5e84b95461cc542f862b51ca96bf -> trunk/cf0a0dcb0afa5e84b95461cc542f862b51ca96bf 2025-08-14T21:21:50.4509986Z * [new tag] trunk/cf4964be68fa9f4ffc334f01cce42d7424b1cc81 -> trunk/cf4964be68fa9f4ffc334f01cce42d7424b1cc81 2025-08-14T21:21:50.4510261Z * [new tag] trunk/d0e2240f680ea2a553f7ee8188f52482e130bfd0 -> trunk/d0e2240f680ea2a553f7ee8188f52482e130bfd0 2025-08-14T21:21:50.4510521Z * [new tag] trunk/d1950d4bb5cba8fb6b23e4d283eea5b9801737e2 -> trunk/d1950d4bb5cba8fb6b23e4d283eea5b9801737e2 2025-08-14T21:21:50.4510778Z * [new tag] trunk/d20c4c20e61adecf00335c4d8c22eb1ace472cd3 -> trunk/d20c4c20e61adecf00335c4d8c22eb1ace472cd3 2025-08-14T21:21:50.4514147Z * [new tag] trunk/d25c4f954d599ea512e2f70cd6df101c21479d4c -> trunk/d25c4f954d599ea512e2f70cd6df101c21479d4c 2025-08-14T21:21:50.4514513Z * [new tag] trunk/d3d359dbafa89173a371e2637f22b47398e94a24 -> trunk/d3d359dbafa89173a371e2637f22b47398e94a24 2025-08-14T21:21:50.4514842Z * [new tag] trunk/d46768db04499d07a5b0db984112a6d1b7d3b0c1 -> trunk/d46768db04499d07a5b0db984112a6d1b7d3b0c1 2025-08-14T21:21:50.4515173Z * [new tag] trunk/d4c1a08c89f37d249a0146ff511c82ecc5c53b8f -> trunk/d4c1a08c89f37d249a0146ff511c82ecc5c53b8f 2025-08-14T21:21:50.4515532Z * [new tag] trunk/d556586448f3caab85673c7da0978fe31c7748f7 -> trunk/d556586448f3caab85673c7da0978fe31c7748f7 2025-08-14T21:21:50.4515916Z * [new tag] trunk/d670304001429a1a833255a918ed788d7ec4989a -> trunk/d670304001429a1a833255a918ed788d7ec4989a 2025-08-14T21:21:50.4516148Z * [new tag] trunk/d6786741a77aba200c78002646cc069b7a1799b0 -> trunk/d6786741a77aba200c78002646cc069b7a1799b0 2025-08-14T21:21:50.4519284Z * [new tag] trunk/d68c323692dedcbb74e670801e3502944fd790ff -> trunk/d68c323692dedcbb74e670801e3502944fd790ff 2025-08-14T21:21:50.4519605Z * [new tag] trunk/d8cb3db5339b45e4b745b2b883ef3ecde9843e2c -> trunk/d8cb3db5339b45e4b745b2b883ef3ecde9843e2c 2025-08-14T21:21:50.4519864Z * [new tag] trunk/da1f608ca33f3062535d0a4866d95db19e72fcbd -> trunk/da1f608ca33f3062535d0a4866d95db19e72fcbd 2025-08-14T21:21:50.4520124Z * [new tag] trunk/db0b7f1cc9bb3fe71aaf8b964a644147ae8e1c35 -> trunk/db0b7f1cc9bb3fe71aaf8b964a644147ae8e1c35 2025-08-14T21:21:50.4520380Z * [new tag] 
trunk/db32b60662b2f2bdcad980127d5dc4b66b02a7e4 -> trunk/db32b60662b2f2bdcad980127d5dc4b66b02a7e4 2025-08-14T21:21:50.4520624Z * [new tag] trunk/db763b17175553ba09637362eb9773a91997a7ad -> trunk/db763b17175553ba09637362eb9773a91997a7ad 2025-08-14T21:21:50.4520971Z * [new tag] trunk/db78943a1ca13a32a3d6045eb15e2b719ee13a2f -> trunk/db78943a1ca13a32a3d6045eb15e2b719ee13a2f 2025-08-14T21:21:50.4521220Z * [new tag] trunk/dc0d18e023d9b7e314ebba0f234b6cb1579dbcfd -> trunk/dc0d18e023d9b7e314ebba0f234b6cb1579dbcfd 2025-08-14T21:21:50.4521468Z * [new tag] trunk/dd21c8a578038ab2841a7ba809a06921093ac9d8 -> trunk/dd21c8a578038ab2841a7ba809a06921093ac9d8 2025-08-14T21:21:50.4521738Z * [new tag] trunk/deea71a90e05eb320c04bebfead5317746637f0d -> trunk/deea71a90e05eb320c04bebfead5317746637f0d 2025-08-14T21:21:50.4522004Z * [new tag] trunk/df55ec7d4b35f6d21691e9dd41c82f27de762948 -> trunk/df55ec7d4b35f6d21691e9dd41c82f27de762948 2025-08-14T21:21:50.4522262Z * [new tag] trunk/e1cf0d496ea85d1807c8c740f296e77bf7bdc1df -> trunk/e1cf0d496ea85d1807c8c740f296e77bf7bdc1df 2025-08-14T21:21:50.4522509Z * [new tag] trunk/e248719ac03c103767ab72034f6b9fd56855bf98 -> trunk/e248719ac03c103767ab72034f6b9fd56855bf98 2025-08-14T21:21:50.4522754Z * [new tag] trunk/e49762026070f66be41bfa6537fbcf9bfc24e558 -> trunk/e49762026070f66be41bfa6537fbcf9bfc24e558 2025-08-14T21:21:50.4523014Z * [new tag] trunk/e4de93f6a3e342bab34d3757cf90ec0ccc87e168 -> trunk/e4de93f6a3e342bab34d3757cf90ec0ccc87e168 2025-08-14T21:21:50.4523296Z * [new tag] trunk/e619c6bb90b9dedaccd3cbeed86a288993a4e33f -> trunk/e619c6bb90b9dedaccd3cbeed86a288993a4e33f 2025-08-14T21:21:50.4526336Z * [new tag] trunk/e63c2b21c186a7d2ab8a8953b8aa1535f2e96e58 -> trunk/e63c2b21c186a7d2ab8a8953b8aa1535f2e96e58 2025-08-14T21:21:50.4526637Z * [new tag] trunk/e7152ff8a6a929a0db7f3f4a72a5b6d471769cd3 -> trunk/e7152ff8a6a929a0db7f3f4a72a5b6d471769cd3 2025-08-14T21:21:50.4526905Z * [new tag] trunk/e96c7c4bb0f6aeae2ab3b6f040f7d67edbec199a -> trunk/e96c7c4bb0f6aeae2ab3b6f040f7d67edbec199a 2025-08-14T21:21:50.4527153Z * [new tag] trunk/e9eb2096a59a79e7a94c3e28a0715e040369f34c -> trunk/e9eb2096a59a79e7a94c3e28a0715e040369f34c 2025-08-14T21:21:50.4527434Z * [new tag] trunk/eac2d9d695a32dd456050f45cac35134ec3809f4 -> trunk/eac2d9d695a32dd456050f45cac35134ec3809f4 2025-08-14T21:21:50.4527694Z * [new tag] trunk/ecde76c764752540edf9ef62a97936c86d984b17 -> trunk/ecde76c764752540edf9ef62a97936c86d984b17 2025-08-14T21:21:50.4527955Z * [new tag] trunk/ecea81117b2fdc52907c97b3c32d779e07b5d55b -> trunk/ecea81117b2fdc52907c97b3c32d779e07b5d55b 2025-08-14T21:21:50.4528202Z * [new tag] trunk/edaa151d0d5a4e75fbec9843f49cc78770eb61fb -> trunk/edaa151d0d5a4e75fbec9843f49cc78770eb61fb 2025-08-14T21:21:50.4529104Z * [new tag] trunk/ee1b0412b919dfb358d5a697b3be49621497fbc2 -> trunk/ee1b0412b919dfb358d5a697b3be49621497fbc2 2025-08-14T21:21:50.4529662Z * [new tag] trunk/ee1fb43450c2e985657f95a91b68328d6f20f24e -> trunk/ee1fb43450c2e985657f95a91b68328d6f20f24e 2025-08-14T21:21:50.4531005Z * [new tag] trunk/ee89cc7a0acd69de25f98fe4ef828546db7b444c -> trunk/ee89cc7a0acd69de25f98fe4ef828546db7b444c 2025-08-14T21:21:50.4531271Z * [new tag] trunk/ee9f8ba11d664b871a9e0c7933fdc8571635b78c -> trunk/ee9f8ba11d664b871a9e0c7933fdc8571635b78c 2025-08-14T21:21:50.4531828Z * [new tag] trunk/eed9dbf70f43ee529fec78ac00ed9a4fd74c6e76 -> trunk/eed9dbf70f43ee529fec78ac00ed9a4fd74c6e76 2025-08-14T21:21:50.4532400Z * [new tag] trunk/f077c2402e4eb5b0ed562b4ee5b7a0503f26ef94 -> trunk/f077c2402e4eb5b0ed562b4ee5b7a0503f26ef94 
2025-08-14T21:21:50.4534364Z * [new tag] trunk/f0980fc0bbd656d6c02d23ad97e945353b314f35 -> trunk/f0980fc0bbd656d6c02d23ad97e945353b314f35 2025-08-14T21:21:50.4534810Z * [new tag] trunk/f15ada5c6fad97a7dcbfa4673f067b6942dda640 -> trunk/f15ada5c6fad97a7dcbfa4673f067b6942dda640 2025-08-14T21:21:50.4535198Z * [new tag] trunk/f27232a2134150cb5e55d26a74d8c36c6a961ca5 -> trunk/f27232a2134150cb5e55d26a74d8c36c6a961ca5 2025-08-14T21:21:50.4535755Z * [new tag] trunk/f33ce40bc062a281e1a1f57e8c1926d0a7d155cc -> trunk/f33ce40bc062a281e1a1f57e8c1926d0a7d155cc 2025-08-14T21:21:50.4536126Z * [new tag] trunk/f341077ce4710172da20cfad916ee37159bfe9fe -> trunk/f341077ce4710172da20cfad916ee37159bfe9fe 2025-08-14T21:21:50.4536500Z * [new tag] trunk/f3a4d742ece08de4cb0e59dcc62e0093a7d0b0c7 -> trunk/f3a4d742ece08de4cb0e59dcc62e0093a7d0b0c7 2025-08-14T21:21:50.4536963Z * [new tag] trunk/f3f159ff8c4bad2edec99c68a941c628e983d04c -> trunk/f3f159ff8c4bad2edec99c68a941c628e983d04c 2025-08-14T21:21:50.4537597Z * [new tag] trunk/f60454cce8b93e5bbf67f2f3c88c8ac01ed65457 -> trunk/f60454cce8b93e5bbf67f2f3c88c8ac01ed65457 2025-08-14T21:21:50.4538336Z * [new tag] trunk/f7b2f3314cf7aede67d5fa5c75e4243208484344 -> trunk/f7b2f3314cf7aede67d5fa5c75e4243208484344 2025-08-14T21:21:50.4538922Z * [new tag] trunk/f8f0414a5983ff481a2188e0c18594150430c8c5 -> trunk/f8f0414a5983ff481a2188e0c18594150430c8c5 2025-08-14T21:21:50.4539416Z * [new tag] trunk/f95b58c2844b3444cd8446fed8570729dc4216eb -> trunk/f95b58c2844b3444cd8446fed8570729dc4216eb 2025-08-14T21:21:50.4542017Z * [new tag] trunk/f990490a23815ea6ee27e487c70ba2cf513ba43d -> trunk/f990490a23815ea6ee27e487c70ba2cf513ba43d 2025-08-14T21:21:50.4542313Z * [new tag] trunk/fb887c3bb588cfe782615e67f6c26db636b8539b -> trunk/fb887c3bb588cfe782615e67f6c26db636b8539b 2025-08-14T21:21:50.4542568Z * [new tag] trunk/fc25c68f20f772290927a7031b998b92615259cf -> trunk/fc25c68f20f772290927a7031b998b92615259cf 2025-08-14T21:21:50.4542828Z * [new tag] trunk/fc80f6859e0ccf66513a40f04b9e735e759d4ddb -> trunk/fc80f6859e0ccf66513a40f04b9e735e759d4ddb 2025-08-14T21:21:50.4543445Z * [new tag] trunk/fdfd69bb05488d76123db9cc1cdd90ac4137bbfb -> trunk/fdfd69bb05488d76123db9cc1cdd90ac4137bbfb 2025-08-14T21:21:50.4544686Z * [new tag] trunk/fe3f5fe4ea2ff6f56406dc5d954636ebb08d0a08 -> trunk/fe3f5fe4ea2ff6f56406dc5d954636ebb08d0a08 2025-08-14T21:21:50.4545336Z * [new tag] trunk/fea7e9dd37c02c334b130f6624af6163fde6b2ab -> trunk/fea7e9dd37c02c334b130f6624af6163fde6b2ab 2025-08-14T21:21:50.4545618Z * [new tag] trunk/ff0d56d03592aa03f3ced8359241d21df1783393 -> trunk/ff0d56d03592aa03f3ced8359241d21df1783393 2025-08-14T21:21:50.4546007Z * [new tag] v0.1.1 -> v0.1.1 2025-08-14T21:21:50.4546510Z * [new tag] v0.1.10 -> v0.1.10 2025-08-14T21:21:50.4549995Z * [new tag] v0.1.11 -> v0.1.11 2025-08-14T21:21:50.4550376Z * [new tag] v0.1.12 -> v0.1.12 2025-08-14T21:21:50.4550490Z * [new tag] v0.1.2 -> v0.1.2 2025-08-14T21:21:50.4550592Z * [new tag] v0.1.3 -> v0.1.3 2025-08-14T21:21:50.4555340Z * [new tag] v0.1.4 -> v0.1.4 2025-08-14T21:21:50.4555489Z * [new tag] v0.1.5 -> v0.1.5 2025-08-14T21:21:50.4555611Z * [new tag] v0.1.6 -> v0.1.6 2025-08-14T21:21:50.4555706Z * [new tag] v0.1.7 -> v0.1.7 2025-08-14T21:21:50.4555798Z * [new tag] v0.1.8 -> v0.1.8 2025-08-14T21:21:50.4555899Z * [new tag] v0.1.9 -> v0.1.9 2025-08-14T21:21:50.4555991Z * [new tag] v0.2.0 -> v0.2.0 2025-08-14T21:21:50.4556092Z * [new tag] v0.3.0 -> v0.3.0 2025-08-14T21:21:50.4556186Z * [new tag] v0.3.1 -> v0.3.1 2025-08-14T21:21:50.4556278Z * [new tag] v0.4.0 -> 
v0.4.0 2025-08-14T21:21:50.4556450Z * [new tag] v0.4.1 -> v0.4.1 2025-08-14T21:21:50.4556550Z * [new tag] v1.0.0 -> v1.0.0 2025-08-14T21:21:50.4556665Z * [new tag] v1.0.0a0 -> v1.0.0a0 2025-08-14T21:21:50.4556769Z * [new tag] v1.0.1 -> v1.0.1 2025-08-14T21:21:50.4556913Z * [new tag] v1.0rc0 -> v1.0rc0 2025-08-14T21:21:50.4557542Z * [new tag] v1.0rc1 -> v1.0rc1 2025-08-14T21:21:50.4558015Z * [new tag] v1.1.0 -> v1.1.0 2025-08-14T21:21:50.4558792Z * [new tag] v1.1.0a0 -> v1.1.0a0 2025-08-14T21:21:50.4561867Z * [new tag] v1.10.0 -> v1.10.0 2025-08-14T21:21:50.4562026Z * [new tag] v1.10.0-rc1 -> v1.10.0-rc1 2025-08-14T21:21:50.4562141Z * [new tag] v1.10.0-rc2 -> v1.10.0-rc2 2025-08-14T21:21:50.4562271Z * [new tag] v1.10.0-rc3 -> v1.10.0-rc3 2025-08-14T21:21:50.4562383Z * [new tag] v1.10.1 -> v1.10.1 2025-08-14T21:21:50.4562486Z * [new tag] v1.10.1-rc1 -> v1.10.1-rc1 2025-08-14T21:21:50.4562621Z * [new tag] v1.10.2 -> v1.10.2 2025-08-14T21:21:50.4563111Z * [new tag] v1.10.2-rc1 -> v1.10.2-rc1 2025-08-14T21:21:50.4563826Z * [new tag] v1.11.0 -> v1.11.0 2025-08-14T21:21:50.4564332Z * [new tag] v1.11.0-rc1 -> v1.11.0-rc1 2025-08-14T21:21:50.4565735Z * [new tag] v1.11.0-rc2 -> v1.11.0-rc2 2025-08-14T21:21:50.4566002Z * [new tag] v1.11.0-rc3 -> v1.11.0-rc3 2025-08-14T21:21:50.4566528Z * [new tag] v1.11.0-rc4 -> v1.11.0-rc4 2025-08-14T21:21:50.4567106Z * [new tag] v1.11.0-rc5 -> v1.11.0-rc5 2025-08-14T21:21:50.4567560Z * [new tag] v1.11.0-rc6 -> v1.11.0-rc6 2025-08-14T21:21:50.4568016Z * [new tag] v1.11.0-rc7 -> v1.11.0-rc7 2025-08-14T21:21:50.4569049Z * [new tag] v1.12.0 -> v1.12.0 2025-08-14T21:21:50.4570123Z * [new tag] v1.12.0-rc1 -> v1.12.0-rc1 2025-08-14T21:21:50.4570392Z * [new tag] v1.12.0-rc2 -> v1.12.0-rc2 2025-08-14T21:21:50.4574368Z * [new tag] v1.12.0-rc3 -> v1.12.0-rc3 2025-08-14T21:21:50.4574522Z * [new tag] v1.12.0-rc4 -> v1.12.0-rc4 2025-08-14T21:21:50.4574631Z * [new tag] v1.12.0-rc5 -> v1.12.0-rc5 2025-08-14T21:21:50.4574734Z * [new tag] v1.12.0-rc6 -> v1.12.0-rc6 2025-08-14T21:21:50.4574862Z * [new tag] v1.12.0-rc7 -> v1.12.0-rc7 2025-08-14T21:21:50.4575113Z * [new tag] v1.12.0-rc8 -> v1.12.0-rc8 2025-08-14T21:21:50.4575225Z * [new tag] v1.12.1 -> v1.12.1 2025-08-14T21:21:50.4575352Z * [new tag] v1.12.1-rc1 -> v1.12.1-rc1 2025-08-14T21:21:50.4575626Z * [new tag] v1.12.1-rc2 -> v1.12.1-rc2 2025-08-14T21:21:50.4576569Z * [new tag] v1.12.1-rc3 -> v1.12.1-rc3 2025-08-14T21:21:50.4576852Z * [new tag] v1.12.1-rc4 -> v1.12.1-rc4 2025-08-14T21:21:50.4577291Z * [new tag] v1.12.1-rc5 -> v1.12.1-rc5 2025-08-14T21:21:50.4579608Z * [new tag] v1.13.0 -> v1.13.0 2025-08-14T21:21:50.4579760Z * [new tag] v1.13.0-rc1 -> v1.13.0-rc1 2025-08-14T21:21:50.4579868Z * [new tag] v1.13.0-rc2 -> v1.13.0-rc2 2025-08-14T21:21:50.4580146Z * [new tag] v1.13.0-rc3 -> v1.13.0-rc3 2025-08-14T21:21:50.4580590Z * [new tag] v1.13.0-rc4 -> v1.13.0-rc4 2025-08-14T21:21:50.4580814Z * [new tag] v1.13.0-rc5 -> v1.13.0-rc5 2025-08-14T21:21:50.4581498Z * [new tag] v1.13.0-rc6 -> v1.13.0-rc6 2025-08-14T21:21:50.4581869Z * [new tag] v1.13.1 -> v1.13.1 2025-08-14T21:21:50.4586652Z * [new tag] v1.13.1-rc1 -> v1.13.1-rc1 2025-08-14T21:21:50.4586796Z * [new tag] v1.2.0 -> v1.2.0 2025-08-14T21:21:50.4586913Z * [new tag] v1.2.0a0 -> v1.2.0a0 2025-08-14T21:21:50.4587029Z * [new tag] v1.3.0 -> v1.3.0 2025-08-14T21:21:50.4587134Z * [new tag] v1.3.0a0 -> v1.3.0a0 2025-08-14T21:21:50.4587261Z * [new tag] v1.3.1 -> v1.3.1 2025-08-14T21:21:50.4587364Z * [new tag] v1.4.0 -> v1.4.0 2025-08-14T21:21:50.4587468Z * [new tag] v1.4.0a0 -> 
v1.4.0a0 2025-08-14T21:21:50.4587575Z * [new tag] v1.4.1 -> v1.4.1 2025-08-14T21:21:50.4587673Z * [new tag] v1.5.0 -> v1.5.0 2025-08-14T21:21:50.4587998Z * [new tag] v1.5.0-rc1 -> v1.5.0-rc1 2025-08-14T21:21:50.4588484Z * [new tag] v1.5.0-rc2 -> v1.5.0-rc2 2025-08-14T21:21:50.4592253Z * [new tag] v1.5.0-rc3 -> v1.5.0-rc3 2025-08-14T21:21:50.4592541Z * [new tag] v1.5.0-rc4 -> v1.5.0-rc4 2025-08-14T21:21:50.4592683Z * [new tag] v1.5.0-rc5 -> v1.5.0-rc5 2025-08-14T21:21:50.4592813Z * [new tag] v1.5.1 -> v1.5.1 2025-08-14T21:21:50.4592933Z * [new tag] v1.5.1-rc1 -> v1.5.1-rc1 2025-08-14T21:21:50.4593037Z * [new tag] v1.6.0 -> v1.6.0 2025-08-14T21:21:50.4593140Z * [new tag] v1.6.0-rc1 -> v1.6.0-rc1 2025-08-14T21:21:50.4593387Z * [new tag] v1.6.0-rc2 -> v1.6.0-rc2 2025-08-14T21:21:50.4593837Z * [new tag] v1.6.0-rc3 -> v1.6.0-rc3 2025-08-14T21:21:50.4594126Z * [new tag] v1.6.0-rc4 -> v1.6.0-rc4 2025-08-14T21:21:50.4595091Z * [new tag] v1.6.0-rc5 -> v1.6.0-rc5 2025-08-14T21:21:50.4595208Z * [new tag] v1.6.0-rc6 -> v1.6.0-rc6 2025-08-14T21:21:50.4598857Z * [new tag] v1.6.0-rc7 -> v1.6.0-rc7 2025-08-14T21:21:50.4599008Z * [new tag] v1.7.0 -> v1.7.0 2025-08-14T21:21:50.4599304Z * [new tag] v1.7.0-rc1 -> v1.7.0-rc1 2025-08-14T21:21:50.4599420Z * [new tag] v1.7.0-rc2 -> v1.7.0-rc2 2025-08-14T21:21:50.4599523Z * [new tag] v1.7.0-rc3 -> v1.7.0-rc3 2025-08-14T21:21:50.4599659Z * [new tag] v1.7.0-rc4 -> v1.7.0-rc4 2025-08-14T21:21:50.4599937Z * [new tag] v1.7.1 -> v1.7.1 2025-08-14T21:21:50.4600144Z * [new tag] v1.7.1-rc1 -> v1.7.1-rc1 2025-08-14T21:21:50.4602026Z * [new tag] v1.7.1-rc2 -> v1.7.1-rc2 2025-08-14T21:21:50.4602160Z * [new tag] v1.7.1-rc3 -> v1.7.1-rc3 2025-08-14T21:21:50.4602281Z * [new tag] v1.8.0 -> v1.8.0 2025-08-14T21:21:50.4602546Z * [new tag] v1.8.0-rc1 -> v1.8.0-rc1 2025-08-14T21:21:50.4603457Z * [new tag] v1.8.0-rc2 -> v1.8.0-rc2 2025-08-14T21:21:50.4604320Z * [new tag] v1.8.0-rc3 -> v1.8.0-rc3 2025-08-14T21:21:50.4605678Z * [new tag] v1.8.0-rc4 -> v1.8.0-rc4 2025-08-14T21:21:50.4605915Z * [new tag] v1.8.0-rc5 -> v1.8.0-rc5 2025-08-14T21:21:50.4606028Z * [new tag] v1.8.1 -> v1.8.1 2025-08-14T21:21:50.4606134Z * [new tag] v1.8.1-rc1 -> v1.8.1-rc1 2025-08-14T21:21:50.4607265Z * [new tag] v1.8.1-rc2 -> v1.8.1-rc2 2025-08-14T21:21:50.4607509Z * [new tag] v1.8.1-rc3 -> v1.8.1-rc3 2025-08-14T21:21:50.4608927Z * [new tag] v1.8.2 -> v1.8.2 2025-08-14T21:21:50.4609504Z * [new tag] v1.8.2-rc1 -> v1.8.2-rc1 2025-08-14T21:21:50.4612798Z * [new tag] v1.9.0 -> v1.9.0 2025-08-14T21:21:50.4613036Z * [new tag] v1.9.0-rc1 -> v1.9.0-rc1 2025-08-14T21:21:50.4613175Z * [new tag] v1.9.0-rc2 -> v1.9.0-rc2 2025-08-14T21:21:50.4613287Z * [new tag] v1.9.0-rc3 -> v1.9.0-rc3 2025-08-14T21:21:50.4613553Z * [new tag] v1.9.0-rc4 -> v1.9.0-rc4 2025-08-14T21:21:50.4614172Z * [new tag] v1.9.1 -> v1.9.1 2025-08-14T21:21:50.4614313Z * [new tag] v1.9.1-rc1 -> v1.9.1-rc1 2025-08-14T21:21:50.4616581Z * [new tag] v1.9.1-rc2 -> v1.9.1-rc2 2025-08-14T21:21:50.4616799Z * [new tag] v2.0.0 -> v2.0.0 2025-08-14T21:21:50.4616916Z * [new tag] v2.0.0-rc1 -> v2.0.0-rc1 2025-08-14T21:21:50.4617019Z * [new tag] v2.0.0-rc2 -> v2.0.0-rc2 2025-08-14T21:21:50.4617137Z * [new tag] v2.0.0-rc3 -> v2.0.0-rc3 2025-08-14T21:21:50.4617241Z * [new tag] v2.0.0-rc4 -> v2.0.0-rc4 2025-08-14T21:21:50.4617354Z * [new tag] v2.0.0-rc5 -> v2.0.0-rc5 2025-08-14T21:21:50.4617460Z * [new tag] v2.0.0-rc6 -> v2.0.0-rc6 2025-08-14T21:21:50.4624373Z * [new tag] v2.0.1 -> v2.0.1 2025-08-14T21:21:50.4624522Z * [new tag] v2.0.1-rc1 -> v2.0.1-rc1 
2025-08-14T21:21:50.4624642Z * [new tag] v2.0.1-rc2 -> v2.0.1-rc2 2025-08-14T21:21:50.4624743Z * [new tag] v2.0.1-rc3 -> v2.0.1-rc3 2025-08-14T21:21:50.4624853Z * [new tag] v2.0.1-rc4 -> v2.0.1-rc4 2025-08-14T21:21:50.4624955Z * [new tag] v2.1.0 -> v2.1.0 2025-08-14T21:21:50.4625083Z * [new tag] v2.1.0-rc1 -> v2.1.0-rc1 2025-08-14T21:21:50.4625413Z * [new tag] v2.1.0-rc2 -> v2.1.0-rc2 2025-08-14T21:21:50.4625519Z * [new tag] v2.1.0-rc3 -> v2.1.0-rc3 2025-08-14T21:21:50.4625636Z * [new tag] v2.1.0-rc4 -> v2.1.0-rc4 2025-08-14T21:21:50.4625740Z * [new tag] v2.1.0-rc5 -> v2.1.0-rc5 2025-08-14T21:21:50.4625851Z * [new tag] v2.1.0-rc6 -> v2.1.0-rc6 2025-08-14T21:21:50.4625953Z * [new tag] v2.1.1 -> v2.1.1 2025-08-14T21:21:50.4626054Z * [new tag] v2.1.1-rc1 -> v2.1.1-rc1 2025-08-14T21:21:50.4626158Z * [new tag] v2.1.1-rc2 -> v2.1.1-rc2 2025-08-14T21:21:50.4630748Z * [new tag] v2.1.1-rc3 -> v2.1.1-rc3 2025-08-14T21:21:50.4631107Z * [new tag] v2.1.1-rc4 -> v2.1.1-rc4 2025-08-14T21:21:50.4631218Z * [new tag] v2.1.1-rc5 -> v2.1.1-rc5 2025-08-14T21:21:50.4631319Z * [new tag] v2.1.1-rc6 -> v2.1.1-rc6 2025-08-14T21:21:50.4631435Z * [new tag] v2.1.2 -> v2.1.2 2025-08-14T21:21:50.4631537Z * [new tag] v2.1.2-rc1 -> v2.1.2-rc1 2025-08-14T21:21:50.4631634Z * [new tag] v2.1.2-rc2 -> v2.1.2-rc2 2025-08-14T21:21:50.4631740Z * [new tag] v2.1.2-rc3 -> v2.1.2-rc3 2025-08-14T21:21:50.4631876Z * [new tag] v2.2.0 -> v2.2.0 2025-08-14T21:21:50.4632096Z * [new tag] v2.2.0-rc1 -> v2.2.0-rc1 2025-08-14T21:21:50.4632218Z * [new tag] v2.2.0-rc2 -> v2.2.0-rc2 2025-08-14T21:21:50.4632329Z * [new tag] v2.2.0-rc3 -> v2.2.0-rc3 2025-08-14T21:21:50.4632442Z * [new tag] v2.2.0-rc4 -> v2.2.0-rc4 2025-08-14T21:21:50.4632657Z * [new tag] v2.2.0-rc5 -> v2.2.0-rc5 2025-08-14T21:21:50.4632998Z * [new tag] v2.2.0-rc6 -> v2.2.0-rc6 2025-08-14T21:21:50.4633105Z * [new tag] v2.2.0-rc7 -> v2.2.0-rc7 2025-08-14T21:21:50.4638805Z * [new tag] v2.2.0-rc8 -> v2.2.0-rc8 2025-08-14T21:21:50.4639350Z * [new tag] v2.2.1 -> v2.2.1 2025-08-14T21:21:50.4639512Z * [new tag] v2.2.1-rc1 -> v2.2.1-rc1 2025-08-14T21:21:50.4639624Z * [new tag] v2.2.1-rc2 -> v2.2.1-rc2 2025-08-14T21:21:50.4639739Z * [new tag] v2.2.1-rc3 -> v2.2.1-rc3 2025-08-14T21:21:50.4639845Z * [new tag] v2.2.2 -> v2.2.2 2025-08-14T21:21:50.4639970Z * [new tag] v2.2.2-rc1 -> v2.2.2-rc1 2025-08-14T21:21:50.4640094Z * [new tag] v2.2.2-rc2 -> v2.2.2-rc2 2025-08-14T21:21:50.4640193Z * [new tag] v2.2.2-rc3 -> v2.2.2-rc3 2025-08-14T21:21:50.4640295Z * [new tag] v2.3.0 -> v2.3.0 2025-08-14T21:21:50.4641160Z * [new tag] v2.3.0-rc1 -> v2.3.0-rc1 2025-08-14T21:21:50.4641832Z * [new tag] v2.3.0-rc10 -> v2.3.0-rc10 2025-08-14T21:21:50.4642005Z * [new tag] v2.3.0-rc11 -> v2.3.0-rc11 2025-08-14T21:21:50.4642112Z * [new tag] v2.3.0-rc12 -> v2.3.0-rc12 2025-08-14T21:21:50.4642225Z * [new tag] v2.3.0-rc2 -> v2.3.0-rc2 2025-08-14T21:21:50.4642340Z * [new tag] v2.3.0-rc3 -> v2.3.0-rc3 2025-08-14T21:21:50.4642466Z * [new tag] v2.3.0-rc4 -> v2.3.0-rc4 2025-08-14T21:21:50.4642723Z * [new tag] v2.3.0-rc5 -> v2.3.0-rc5 2025-08-14T21:21:50.4642871Z * [new tag] v2.3.0-rc6 -> v2.3.0-rc6 2025-08-14T21:21:50.4642973Z * [new tag] v2.3.0-rc7 -> v2.3.0-rc7 2025-08-14T21:21:50.4645979Z * [new tag] v2.3.0-rc8 -> v2.3.0-rc8 2025-08-14T21:21:50.4646134Z * [new tag] v2.3.0-rc9 -> v2.3.0-rc9 2025-08-14T21:21:50.4646253Z * [new tag] v2.3.1 -> v2.3.1 2025-08-14T21:21:50.4646362Z * [new tag] v2.3.1-rc1 -> v2.3.1-rc1 2025-08-14T21:21:50.4646475Z * [new tag] v2.3.1-rc2 -> v2.3.1-rc2 2025-08-14T21:21:50.4646581Z * [new tag] 
v2.3.1-rc3 -> v2.3.1-rc3 2025-08-14T21:21:50.4646872Z * [new tag] v2.4.0 -> v2.4.0 2025-08-14T21:21:50.4647032Z * [new tag] v2.4.0-rc1 -> v2.4.0-rc1 2025-08-14T21:21:50.4647709Z * [new tag] v2.4.0-rc2 -> v2.4.0-rc2 2025-08-14T21:21:50.4648147Z * [new tag] v2.4.0-rc3 -> v2.4.0-rc3 2025-08-14T21:21:50.4649216Z * [new tag] v2.4.0-rc4 -> v2.4.0-rc4 2025-08-14T21:21:50.4649612Z * [new tag] v2.4.0-rc5 -> v2.4.0-rc5 2025-08-14T21:21:50.4650590Z * [new tag] v2.4.0-rc6 -> v2.4.0-rc6 2025-08-14T21:21:50.4650836Z * [new tag] v2.4.0-rc7 -> v2.4.0-rc7 2025-08-14T21:21:50.4651811Z * [new tag] v2.4.0-rc8 -> v2.4.0-rc8 2025-08-14T21:21:50.4652272Z * [new tag] v2.4.0-rc9 -> v2.4.0-rc9 2025-08-14T21:21:50.4652682Z * [new tag] v2.4.1 -> v2.4.1 2025-08-14T21:21:50.4654296Z * [new tag] v2.4.1-rc1 -> v2.4.1-rc1 2025-08-14T21:21:50.4654505Z * [new tag] v2.4.1-rc2 -> v2.4.1-rc2 2025-08-14T21:21:50.4657093Z * [new tag] v2.4.1-rc3 -> v2.4.1-rc3 2025-08-14T21:21:50.4657234Z * [new tag] v2.5.0 -> v2.5.0 2025-08-14T21:21:50.4657346Z * [new tag] v2.5.0-rc1 -> v2.5.0-rc1 2025-08-14T21:21:50.4657465Z * [new tag] v2.5.0-rc10 -> v2.5.0-rc10 2025-08-14T21:21:50.4657584Z * [new tag] v2.5.0-rc2 -> v2.5.0-rc2 2025-08-14T21:21:50.4657838Z * [new tag] v2.5.0-rc3 -> v2.5.0-rc3 2025-08-14T21:21:50.4658816Z * [new tag] v2.5.0-rc4 -> v2.5.0-rc4 2025-08-14T21:21:50.4659116Z * [new tag] v2.5.0-rc5 -> v2.5.0-rc5 2025-08-14T21:21:50.4660195Z * [new tag] v2.5.0-rc6 -> v2.5.0-rc6 2025-08-14T21:21:50.4660447Z * [new tag] v2.5.0-rc7 -> v2.5.0-rc7 2025-08-14T21:21:50.4661416Z * [new tag] v2.5.0-rc8 -> v2.5.0-rc8 2025-08-14T21:21:50.4661658Z * [new tag] v2.5.0-rc9 -> v2.5.0-rc9 2025-08-14T21:21:50.4662405Z * [new tag] v2.5.1 -> v2.5.1 2025-08-14T21:21:50.4663189Z * [new tag] v2.5.1-rc1 -> v2.5.1-rc1 2025-08-14T21:21:50.4663396Z * [new tag] v2.6.0 -> v2.6.0 2025-08-14T21:21:50.4663639Z * [new tag] v2.6.0-rc1 -> v2.6.0-rc1 2025-08-14T21:21:50.4666851Z * [new tag] v2.6.0-rc2 -> v2.6.0-rc2 2025-08-14T21:21:50.4666995Z * [new tag] v2.6.0-rc3 -> v2.6.0-rc3 2025-08-14T21:21:50.4667122Z * [new tag] v2.6.0-rc4 -> v2.6.0-rc4 2025-08-14T21:21:50.4667399Z * [new tag] v2.6.0-rc5 -> v2.6.0-rc5 2025-08-14T21:21:50.4667887Z * [new tag] v2.6.0-rc6 -> v2.6.0-rc6 2025-08-14T21:21:50.4667990Z * [new tag] v2.6.0-rc7 -> v2.6.0-rc7 2025-08-14T21:21:50.4671681Z * [new tag] v2.6.0-rc8 -> v2.6.0-rc8 2025-08-14T21:21:50.4671821Z * [new tag] v2.6.0-rc9 -> v2.6.0-rc9 2025-08-14T21:21:50.4671942Z * [new tag] v2.7.0 -> v2.7.0 2025-08-14T21:21:50.4672052Z * [new tag] v2.7.0-rc1 -> v2.7.0-rc1 2025-08-14T21:21:50.4672166Z * [new tag] v2.7.0-rc10 -> v2.7.0-rc10 2025-08-14T21:21:50.4672276Z * [new tag] v2.7.0-rc2 -> v2.7.0-rc2 2025-08-14T21:21:50.4672875Z * [new tag] v2.7.0-rc3 -> v2.7.0-rc3 2025-08-14T21:21:50.4674278Z * [new tag] v2.7.0-rc4 -> v2.7.0-rc4 2025-08-14T21:21:50.4674704Z * [new tag] v2.7.0-rc5 -> v2.7.0-rc5 2025-08-14T21:21:50.4674830Z * [new tag] v2.7.0-rc6 -> v2.7.0-rc6 2025-08-14T21:21:50.4675121Z * [new tag] v2.7.0-rc7 -> v2.7.0-rc7 2025-08-14T21:21:50.4679276Z * [new tag] v2.7.0-rc8 -> v2.7.0-rc8 2025-08-14T21:21:50.4679416Z * [new tag] v2.7.0-rc9 -> v2.7.0-rc9 2025-08-14T21:21:50.4679536Z * [new tag] v2.7.1 -> v2.7.1 2025-08-14T21:21:50.4679643Z * [new tag] v2.7.1-rc1 -> v2.7.1-rc1 2025-08-14T21:21:50.4679754Z * [new tag] v2.7.1-rc2 -> v2.7.1-rc2 2025-08-14T21:21:50.4679856Z * [new tag] v2.7.1-rc3 -> v2.7.1-rc3 2025-08-14T21:21:50.4680197Z * [new tag] v2.7.1-rc4 -> v2.7.1-rc4 2025-08-14T21:21:50.4681898Z * [new tag] v2.7.1-rc5 -> v2.7.1-rc5 
2025-08-14T21:21:50.4682033Z * [new tag] v2.8.0 -> v2.8.0 2025-08-14T21:21:50.4682162Z * [new tag] v2.8.0-rc1 -> v2.8.0-rc1 2025-08-14T21:21:50.4682702Z * [new tag] v2.8.0-rc2 -> v2.8.0-rc2 2025-08-14T21:21:50.4685735Z * [new tag] v2.8.0-rc3 -> v2.8.0-rc3 2025-08-14T21:21:50.4685859Z * [new tag] v2.8.0-rc4 -> v2.8.0-rc4 2025-08-14T21:21:50.4685966Z * [new tag] v2.8.0-rc5 -> v2.8.0-rc5 2025-08-14T21:21:50.4686080Z * [new tag] v2.8.0-rc6 -> v2.8.0-rc6 2025-08-14T21:21:50.4686183Z * [new tag] v2.8.0-rc7 -> v2.8.0-rc7 2025-08-14T21:21:50.4687173Z * [new tag] v2.8.0-rc8 -> v2.8.0-rc8 2025-08-14T21:21:50.4687486Z * [new tag] whc_flight_1 -> whc_flight_1 2025-08-14T21:21:50.4688025Z * [new tag] whc_flight_2 -> whc_flight_2 2025-08-14T21:21:50.4688443Z * [new tag] whc_flight_4 -> whc_flight_4 2025-08-14T21:21:50.5180355Z [command]/usr/bin/git rev-parse --verify --quiet 1fc683cf17c8c673044538d10266c00f92987be2^{object} 2025-08-14T21:21:50.5204466Z 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:21:50.5205140Z ##[endgroup] 2025-08-14T21:21:50.5205360Z ##[group]Determining the checkout info 2025-08-14T21:21:50.5205547Z ##[endgroup] 2025-08-14T21:21:50.5209068Z [command]/usr/bin/git sparse-checkout disable 2025-08-14T21:21:50.5257472Z [command]/usr/bin/git config --local --unset-all extensions.worktreeConfig 2025-08-14T21:21:50.5284098Z ##[group]Checking out the ref 2025-08-14T21:21:50.5284936Z [command]/usr/bin/git checkout --progress --force 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:21:51.5601647Z Updating files: 97% (18920/19474) 2025-08-14T21:21:51.5843912Z Updating files: 98% (19085/19474) 2025-08-14T21:21:51.5973456Z Updating files: 99% (19280/19474) 2025-08-14T21:21:51.5976631Z Updating files: 100% (19474/19474) 2025-08-14T21:21:51.5981730Z Updating files: 100% (19474/19474), done. 2025-08-14T21:21:51.6217729Z Note: switching to '1fc683cf17c8c673044538d10266c00f92987be2'. 2025-08-14T21:21:51.6218646Z 2025-08-14T21:21:51.6218843Z You are in 'detached HEAD' state. You can look around, make experimental 2025-08-14T21:21:51.6219236Z changes and commit them, and you can discard any commits you make in this 2025-08-14T21:21:51.6219622Z state without impacting any branches by switching back to a branch. 2025-08-14T21:21:51.6219841Z 2025-08-14T21:21:51.6219986Z If you want to create a new branch to retain commits you create, you may 2025-08-14T21:21:51.6220329Z do so (now or later) by using -c with the switch command. 
Example: 2025-08-14T21:21:51.6220838Z 2025-08-14T21:21:51.6220954Z git switch -c <new-branch-name> 2025-08-14T21:21:51.6221096Z 2025-08-14T21:21:51.6221182Z Or undo this operation with: 2025-08-14T21:21:51.6221314Z 2025-08-14T21:21:51.6221387Z git switch - 2025-08-14T21:21:51.6221487Z 2025-08-14T21:21:51.6221655Z Turn off this advice by setting config variable advice.detachedHead to false 2025-08-14T21:21:51.6221869Z 2025-08-14T21:21:51.6222128Z HEAD is now at 1fc683cf17c [Inductor] Allow indexing a flexible layout for extract_input_node_reduction_ranges (#160645) 2025-08-14T21:21:51.6268833Z ##[endgroup] 2025-08-14T21:21:51.6272660Z ##[group]Setting up auth for fetching submodules 2025-08-14T21:21:51.6280134Z [command]/usr/bin/git config --global http.https://github.com/.extraheader AUTHORIZATION: basic *** 2025-08-14T21:21:51.6334205Z [command]/usr/bin/git config --global --unset-all url.https://github.com/.insteadOf 2025-08-14T21:21:51.6362192Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf git@github.com: 2025-08-14T21:21:51.6391882Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf org-21003710@github.com: 2025-08-14T21:21:51.6424119Z ##[endgroup] 2025-08-14T21:21:51.6437135Z ##[group]Fetching submodules 2025-08-14T21:21:51.6437423Z [command]/usr/bin/git submodule sync --recursive 2025-08-14T21:21:51.6750534Z [command]/usr/bin/git -c protocol.version=2 submodule update --init --force --recursive 2025-08-14T21:21:51.7064755Z Submodule 'android/libs/fbjni' (https://github.com/facebookincubator/fbjni.git) registered for path 'android/libs/fbjni' 2025-08-14T21:21:51.7065453Z Submodule 'third_party/NNPACK_deps/FP16' (https://github.com/Maratyszcza/FP16.git) registered for path 'third_party/FP16' 2025-08-14T21:21:51.7066147Z Submodule 'third_party/NNPACK_deps/FXdiv' (https://github.com/Maratyszcza/FXdiv.git) registered for path 'third_party/FXdiv' 2025-08-14T21:21:51.7066739Z Submodule 'third_party/NNPACK' (https://github.com/Maratyszcza/NNPACK.git) registered for path 'third_party/NNPACK' 2025-08-14T21:21:51.7067345Z Submodule 'third_party/NVTX' (https://github.com/NVIDIA/NVTX.git) registered for path 'third_party/NVTX' 2025-08-14T21:21:51.7068105Z Submodule 'third_party/VulkanMemoryAllocator' (https://github.com/GPUOpen-LibrariesAndSDKs/VulkanMemoryAllocator.git) registered for path 'third_party/VulkanMemoryAllocator' 2025-08-14T21:21:51.7069340Z Submodule 'third_party/XNNPACK' (https://github.com/google/XNNPACK.git) registered for path 'third_party/XNNPACK' 2025-08-14T21:21:51.7070249Z Submodule 'third_party/aiter' (https://github.com/ROCm/aiter.git) registered for path 'third_party/aiter' 2025-08-14T21:21:51.7072379Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/benchmark' 2025-08-14T21:21:51.7075003Z Submodule 'third_party/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/composable_kernel' 2025-08-14T21:21:51.7077782Z Submodule 'third_party/cpp-httplib' (https://github.com/yhirose/cpp-httplib.git) registered for path 'third_party/cpp-httplib' 2025-08-14T21:21:51.7078633Z Submodule 'third_party/cpuinfo' (https://github.com/pytorch/cpuinfo.git) registered for path 'third_party/cpuinfo' 2025-08-14T21:21:51.7081180Z Submodule 'third_party/cudnn_frontend' (https://github.com/NVIDIA/cudnn-frontend.git) registered for path 'third_party/cudnn_frontend' 2025-08-14T21:21:51.7083304Z Submodule 'third_party/cutlass'
(https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/cutlass' 2025-08-14T21:21:51.7086113Z Submodule 'third_party/fbgemm' (https://github.com/pytorch/fbgemm) registered for path 'third_party/fbgemm' 2025-08-14T21:21:51.7090297Z Submodule 'third_party/flash-attention' (https://github.com/Dao-AILab/flash-attention.git) registered for path 'third_party/flash-attention' 2025-08-14T21:21:51.7094203Z Submodule 'third_party/flatbuffers' (https://github.com/google/flatbuffers.git) registered for path 'third_party/flatbuffers' 2025-08-14T21:21:51.7096089Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/fmt' 2025-08-14T21:21:51.7480655Z Submodule 'third_party/gemmlowp/gemmlowp' (https://github.com/google/gemmlowp.git) registered for path 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:21:51.7481970Z Submodule 'third_party/gloo' (https://github.com/pytorch/gloo) registered for path 'third_party/gloo' 2025-08-14T21:21:51.7484258Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/googletest' 2025-08-14T21:21:51.7486982Z Submodule 'third_party/ideep' (https://github.com/intel/ideep) registered for path 'third_party/ideep' 2025-08-14T21:21:51.7490112Z Submodule 'third_party/ittapi' (https://github.com/intel/ittapi.git) registered for path 'third_party/ittapi' 2025-08-14T21:21:51.7493361Z Submodule 'third_party/kineto' (https://github.com/pytorch/kineto) registered for path 'third_party/kineto' 2025-08-14T21:21:51.7495977Z Submodule 'third_party/kleidiai' (https://github.com/ARM-software/kleidiai.git) registered for path 'third_party/kleidiai' 2025-08-14T21:21:51.7500641Z Submodule 'third_party/mimalloc' (https://github.com/microsoft/mimalloc.git) registered for path 'third_party/mimalloc' 2025-08-14T21:21:51.7501744Z Submodule 'third_party/nlohmann' (https://github.com/nlohmann/json.git) registered for path 'third_party/nlohmann' 2025-08-14T21:21:51.7510192Z Submodule 'third_party/onnx' (https://github.com/onnx/onnx.git) registered for path 'third_party/onnx' 2025-08-14T21:21:51.7510909Z Submodule 'third_party/opentelemetry-cpp' (https://github.com/open-telemetry/opentelemetry-cpp.git) registered for path 'third_party/opentelemetry-cpp' 2025-08-14T21:21:51.7512720Z Submodule 'third_party/pocketfft' (https://github.com/mreineck/pocketfft) registered for path 'third_party/pocketfft' 2025-08-14T21:21:51.7518448Z Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/protobuf' 2025-08-14T21:21:51.7519222Z Submodule 'third_party/NNPACK_deps/psimd' (https://github.com/Maratyszcza/psimd.git) registered for path 'third_party/psimd' 2025-08-14T21:21:51.7520305Z Submodule 'third_party/NNPACK_deps/pthreadpool' (https://github.com/Maratyszcza/pthreadpool.git) registered for path 'third_party/pthreadpool' 2025-08-14T21:21:51.7523572Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/pybind11' 2025-08-14T21:21:51.7528210Z Submodule 'third_party/python-peachpy' (https://github.com/malfet/PeachPy.git) registered for path 'third_party/python-peachpy' 2025-08-14T21:21:51.7537037Z Submodule 'third_party/sleef' (https://github.com/shibatch/sleef) registered for path 'third_party/sleef' 2025-08-14T21:21:51.7537685Z Submodule 'third_party/tensorpipe' (https://github.com/pytorch/tensorpipe.git) registered for path 'third_party/tensorpipe' 2025-08-14T21:21:51.7570206Z Cloning into 
'/home/ec2-user/actions-runner/_work/pytorch/pytorch/android/libs/fbjni'... 2025-08-14T21:21:51.9916532Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FP16'... 2025-08-14T21:21:51.9917054Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FXdiv'... 2025-08-14T21:21:51.9917516Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/psimd'... 2025-08-14T21:21:51.9945746Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pthreadpool'... 2025-08-14T21:21:52.1828711Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NNPACK'... 2025-08-14T21:21:52.1829842Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pocketfft'... 2025-08-14T21:21:52.1830804Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gloo'... 2025-08-14T21:21:52.1918257Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pybind11'... 2025-08-14T21:21:53.3418059Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gemmlowp/gemmlowp'... 2025-08-14T21:21:53.3418968Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/benchmark'... 2025-08-14T21:21:53.3419699Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ittapi'... 2025-08-14T21:21:53.3420445Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep'... 2025-08-14T21:21:53.3421215Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kleidiai'... 2025-08-14T21:21:53.3421952Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NVTX'... 2025-08-14T21:21:53.3423286Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/python-peachpy'... 2025-08-14T21:21:53.3424118Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention'... 2025-08-14T21:21:53.3424936Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpp-httplib'... 2025-08-14T21:21:53.3425879Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe'... 2025-08-14T21:21:53.3426621Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpuinfo'... 2025-08-14T21:21:53.3427295Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/mimalloc'... 2025-08-14T21:21:53.3427857Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/googletest'... 2025-08-14T21:21:53.3428612Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/sleef'... 2025-08-14T21:21:53.4417148Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/VulkanMemoryAllocator'... 2025-08-14T21:21:53.6334377Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cudnn_frontend'... 2025-08-14T21:21:53.6335308Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto'... 2025-08-14T21:21:53.7057586Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/XNNPACK'... 2025-08-14T21:22:05.7295378Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fmt'... 2025-08-14T21:22:05.7295928Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flatbuffers'... 
2025-08-14T21:22:05.7296390Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm'... 2025-08-14T21:22:05.7296849Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cutlass'... 2025-08-14T21:22:05.7297282Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx'... 2025-08-14T21:22:05.7297758Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/composable_kernel'... 2025-08-14T21:22:05.7298219Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/aiter'... 2025-08-14T21:22:05.7298769Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp'... 2025-08-14T21:22:05.7299575Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/nlohmann'... 2025-08-14T21:22:05.7300029Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf'... 2025-08-14T21:22:05.7443675Z Submodule path 'android/libs/fbjni': checked out '7e1e1fe3858c63c251c637ae41a20de425dde96f' 2025-08-14T21:22:05.7561500Z Submodule path 'third_party/FP16': checked out '4dfe081cf6bcd15db339cf2680b9281b8451eeb3' 2025-08-14T21:22:05.7660365Z Submodule path 'third_party/FXdiv': checked out 'b408327ac2a15ec3e43352421954f5b1967701d1' 2025-08-14T21:22:05.7879893Z Submodule path 'third_party/NNPACK': checked out 'c07e3a0400713d546e0dea2d5466dd22ea389c73' 2025-08-14T21:22:05.8621371Z Submodule path 'third_party/NVTX': checked out '2942f167cc30c5e3a44a2aecd5b0d9c07ff61a07' 2025-08-14T21:22:05.9082037Z Submodule path 'third_party/VulkanMemoryAllocator': checked out '1d8f600fd424278486eade7ed3e877c99f0846b1' 2025-08-14T21:22:06.4863192Z Submodule path 'third_party/XNNPACK': checked out '51a0103656eff6fc9bfd39a4597923c4b542c883' 2025-08-14T21:22:06.6185547Z Submodule path 'third_party/aiter': checked out '01aae101b9e5e94d6c16a9514c9fb8df99c93150' 2025-08-14T21:22:06.6204030Z Submodule '3rdparty/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:22:06.6235398Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/aiter/3rdparty/composable_kernel'... 
2025-08-14T21:22:10.0773438Z Submodule path 'third_party/aiter/3rdparty/composable_kernel': checked out 'cffe8fa2a442ac8e80dd236a1a5d24fe3d7e0cbf' 2025-08-14T21:22:10.0994333Z Submodule path 'third_party/benchmark': checked out '299e5928955cc62af9968370293b916f5130916f' 2025-08-14T21:22:10.3622268Z Submodule path 'third_party/composable_kernel': checked out '7fe50dc3da2069d6645d9deb8c017a876472a977' 2025-08-14T21:22:10.4097472Z Submodule path 'third_party/cpp-httplib': checked out '3af7f2c16147f3fbc6e4d717032daf505dc1652c' 2025-08-14T21:22:10.4994816Z Submodule path 'third_party/cpuinfo': checked out '5e3d2445e6a84d9599bee2bf78edbb4d80865e1d' 2025-08-14T21:22:10.5409899Z Submodule path 'third_party/cudnn_frontend': checked out 'f937055efc6d414d11f4c6577e3977fe74f35fb6' 2025-08-14T21:22:11.0919670Z Submodule path 'third_party/cutlass': checked out 'e51efbfe18fe4f4cbb66ab814c55bf4aa0185491' 2025-08-14T21:22:11.2107577Z Submodule path 'third_party/fbgemm': checked out '21c7d30c526c0f1ad873ecc632dca6cfa8a69067' 2025-08-14T21:22:11.2124274Z Submodule 'external/asmjit' (https://github.com/asmjit/asmjit.git) registered for path 'third_party/fbgemm/external/asmjit' 2025-08-14T21:22:11.2125040Z Submodule 'external/composable_kernel' (https://github.com/jwfromm/composable_kernel.git) registered for path 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:22:11.2125898Z Submodule 'external/cpuinfo' (https://github.com/pytorch/cpuinfo) registered for path 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:22:11.2127884Z Submodule 'external/cutlass' (https://github.com/jwfromm/cutlass) registered for path 'third_party/fbgemm/external/cutlass' 2025-08-14T21:22:11.2133949Z Submodule 'external/googletest' (https://github.com/google/googletest) registered for path 'third_party/fbgemm/external/googletest' 2025-08-14T21:22:11.2134894Z Submodule 'external/hipify_torch' (https://github.com/ROCmSoftwarePlatform/hipify_torch.git) registered for path 'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:22:11.2141527Z Submodule 'external/json' (https://github.com/nlohmann/json.git) registered for path 'third_party/fbgemm/external/json' 2025-08-14T21:22:11.2168200Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/asmjit'... 2025-08-14T21:22:12.3831080Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/hipify_torch'... 2025-08-14T21:22:12.3831692Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/cpuinfo'... 2025-08-14T21:22:12.3832534Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/googletest'... 2025-08-14T21:22:12.4831256Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/composable_kernel'... 2025-08-14T21:22:12.6186200Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/cutlass'... 2025-08-14T21:22:13.4965447Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/json'... 
2025-08-14T21:22:17.8207536Z Submodule path 'third_party/fbgemm/external/asmjit': checked out 'a3199e8857792cd10b7589ff5d58343d2c9008ea' 2025-08-14T21:22:18.0315892Z Submodule path 'third_party/fbgemm/external/composable_kernel': checked out 'b1281b8b08d973a7064f864f47eeb30f3e2596e9' 2025-08-14T21:22:18.1259694Z Submodule path 'third_party/fbgemm/external/cpuinfo': checked out '6543fec09b2f04ac4a666882998b534afc9c1349' 2025-08-14T21:22:18.6734603Z Submodule path 'third_party/fbgemm/external/cutlass': checked out 'b40777404c174b9694a870bff5c13ce6b7f656ad' 2025-08-14T21:22:18.7163997Z Submodule path 'third_party/fbgemm/external/googletest': checked out '52eb8108c5bdec04579160ae17225d66034bd723' 2025-08-14T21:22:18.7282998Z Submodule path 'third_party/fbgemm/external/hipify_torch': checked out 'a4337c69fe0e2552a7b7b0669178926beeed828c' 2025-08-14T21:22:18.8283855Z Submodule path 'third_party/fbgemm/external/json': checked out '9cca280a4d0ccf0c08f47a99aa71d1b0e52f8d03' 2025-08-14T21:22:18.8904519Z Submodule path 'third_party/flash-attention': checked out '979702c87a8713a8e0a5e9fee122b90d2ef13be5' 2025-08-14T21:22:18.8923150Z Submodule 'csrc/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:22:18.8923956Z Submodule 'csrc/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:22:18.8949709Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/composable_kernel'... 2025-08-14T21:22:22.1204694Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/cutlass'... 2025-08-14T21:22:22.3167722Z Submodule path 'third_party/flash-attention/csrc/composable_kernel': checked out '888317e698e9803c62bd38568abc9e05d7709f33' 2025-08-14T21:22:22.8206472Z Submodule path 'third_party/flash-attention/csrc/cutlass': checked out 'c506e16788cb08416a4a57e11a9067beeee29420' 2025-08-14T21:22:22.9362203Z Submodule path 'third_party/flatbuffers': checked out 'a2cd1ea3b6d3fee220106b5fed3f7ce8da9eb757' 2025-08-14T21:22:22.9683417Z Submodule path 'third_party/fmt': checked out '40626af88bd7df9a5fb80be7b25ac85b122d6c21' 2025-08-14T21:22:23.0057694Z Submodule path 'third_party/gemmlowp/gemmlowp': checked out '3fb5c176c17c765a3492cd2f0321b0dab712f350' 2025-08-14T21:22:23.0287069Z Submodule path 'third_party/gloo': checked out 'c7b7b022c124d9643957d9bd55f57ac59fce8fa2' 2025-08-14T21:22:23.0712021Z Submodule path 'third_party/googletest': checked out '52eb8108c5bdec04579160ae17225d66034bd723' 2025-08-14T21:22:23.0845856Z Submodule path 'third_party/ideep': checked out '719d8e6cd7f7a0e01b155657526d693acf97c2b3' 2025-08-14T21:22:23.0869109Z Submodule 'mkl-dnn' (https://github.com/intel/mkl-dnn.git) registered for path 'third_party/ideep/mkl-dnn' 2025-08-14T21:22:23.0893950Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep/mkl-dnn'... 
2025-08-14T21:22:34.7196722Z Submodule path 'third_party/ideep/mkl-dnn': checked out '8d263e693366ef8db40acc569cc7d8edf644556d' 2025-08-14T21:22:34.7384503Z Submodule path 'third_party/ittapi': checked out 'dec1d23ca65ab069d225dfe40dea14f455170959' 2025-08-14T21:22:34.8368209Z Submodule path 'third_party/kineto': checked out '5e7501833f1021ce6f618572d3baf657b6319658' 2025-08-14T21:22:34.8386812Z Submodule 'libkineto/third_party/dynolog' (https://github.com/facebookincubator/dynolog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:22:34.8387690Z Submodule 'libkineto/third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:22:34.8388491Z Submodule 'libkineto/third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:22:34.8421627Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog'... 2025-08-14T21:22:35.4818043Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/fmt'... 2025-08-14T21:22:36.1089100Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/googletest'... 2025-08-14T21:22:36.1826795Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog': checked out '7d04a0053a845370ae06ce317a22a48e9edcc74e' 2025-08-14T21:22:36.1843192Z Submodule 'third_party/DCGM' (https://github.com/NVIDIA/DCGM.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:22:36.1844153Z Submodule 'third_party/cpr' (https://github.com/libcpr/cpr.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:22:36.1847119Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:22:36.1848129Z Submodule 'third_party/gflags' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:22:36.1854369Z Submodule 'third_party/glog' (https://github.com/google/glog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:22:36.1855243Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:22:36.1856050Z Submodule 'third_party/json' (https://github.com/nlohmann/json.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:22:36.1858583Z Submodule 'third_party/pfs' (https://github.com/dtrugman/pfs.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:22:36.1885301Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM'... 2025-08-14T21:22:37.4678765Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/pfs'... 2025-08-14T21:22:37.4679508Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags'... 
2025-08-14T21:22:37.4680239Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/cpr'... 2025-08-14T21:22:37.4680909Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/glog'... 2025-08-14T21:22:37.4681613Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/googletest'... 2025-08-14T21:22:37.4748776Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/fmt'... 2025-08-14T21:22:37.5752854Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/json'... 2025-08-14T21:22:42.8622913Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM': checked out 'ffde4e54bc7249a6039a5e6b45b395141e1217f9' 2025-08-14T21:22:42.8783463Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr': checked out '871ed52d350214a034f6ef8a3b8f51c5ce1bd400' 2025-08-14T21:22:42.9109446Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt': checked out 'cd4af11efc9c622896a3e4cb599fa28668ca3d05' 2025-08-14T21:22:42.9246176Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags': checked out 'e171aa2d15ed9eb17054558e0b3a6a413bb01067' 2025-08-14T21:22:42.9265375Z Submodule 'doc' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:22:42.9285313Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc'... 
2025-08-14T21:22:43.2041070Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc': checked out '8411df715cf522606e3b1aca386ddfc0b63d34b4' 2025-08-14T21:22:43.2218089Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog': checked out 'b33e3bad4c46c8a6345525fd822af355e5ef9446' 2025-08-14T21:22:43.2596317Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest': checked out '58d77fa8070e8cec2dc1ed015d66b454c8d78850' 2025-08-14T21:22:43.3511567Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json': checked out '4f8fba14066156b73f1189a2b8bd568bde5284c5' 2025-08-14T21:22:43.3669769Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs': checked out 'f68a2fa8ea36c783bdd760371411fcb495aa3150' 2025-08-14T21:22:43.4083023Z Submodule path 'third_party/kineto/libkineto/third_party/fmt': checked out '0041a40c1350ba702d475b9c4ad62da77caea164' 2025-08-14T21:22:43.4661279Z Submodule path 'third_party/kineto/libkineto/third_party/googletest': checked out '7aca84427f224eeed3144123d5230d5871e93347' 2025-08-14T21:22:43.5054281Z Submodule path 'third_party/kleidiai': checked out 'cca02c2f69dd18e1f12647c1c0bdc8cf90e680c7' 2025-08-14T21:22:43.5410388Z Submodule path 'third_party/mimalloc': checked out 'fbd8b99c2b828428947d70fdc046bb55609be93e' 2025-08-14T21:22:43.6450603Z Submodule path 'third_party/nlohmann': checked out '55f93686c01528224f448c19128836e7df245f72' 2025-08-14T21:22:43.9546963Z Submodule path 'third_party/onnx': checked out 'e709452ef2bbc1d113faf678c24e6d3467696e83' 2025-08-14T21:22:43.9577993Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/onnx/third_party/pybind11' 2025-08-14T21:22:43.9607900Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx/third_party/pybind11'... 
2025-08-14T21:22:45.1756268Z Submodule path 'third_party/onnx/third_party/pybind11': checked out 'a2e59f0e7065404b44dfe92a28aca47ba1378dc4' 2025-08-14T21:22:45.2342296Z Submodule path 'third_party/opentelemetry-cpp': checked out 'a799f4aed9c94b765dcdaabaeab7d5e7e2310878' 2025-08-14T21:22:45.2359508Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark) registered for path 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:22:45.2360292Z Submodule 'third_party/googletest' (https://github.com/google/googletest) registered for path 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:22:45.2362324Z Submodule 'third_party/ms-gsl' (https://github.com/microsoft/GSL) registered for path 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:22:45.2367972Z Submodule 'third_party/nlohmann-json' (https://github.com/nlohmann/json) registered for path 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:22:45.2369142Z Submodule 'third_party/opentelemetry-proto' (https://github.com/open-telemetry/opentelemetry-proto) registered for path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:22:45.2370088Z Submodule 'third_party/opentracing-cpp' (https://github.com/opentracing/opentracing-cpp.git) registered for path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:22:45.2371181Z Submodule 'third_party/prometheus-cpp' (https://github.com/jupp0r/prometheus-cpp) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:22:45.2371878Z Submodule 'tools/vcpkg' (https://github.com/Microsoft/vcpkg) registered for path 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:22:45.2397751Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/benchmark'... 2025-08-14T21:22:45.6399168Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentracing-cpp'... 2025-08-14T21:22:45.6399932Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentelemetry-proto'... 2025-08-14T21:22:45.6400589Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/ms-gsl'... 2025-08-14T21:22:45.6401533Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp'... 2025-08-14T21:22:45.7404213Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/googletest'... 2025-08-14T21:22:46.3070226Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/nlohmann-json'... 2025-08-14T21:22:53.4226363Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/tools/vcpkg'... 
2025-08-14T21:22:53.9404235Z Submodule path 'third_party/opentelemetry-cpp/third_party/benchmark': checked out 'd572f4777349d43653b21d6c2fc63020ab326db2' 2025-08-14T21:22:53.9791850Z Submodule path 'third_party/opentelemetry-cpp/third_party/googletest': checked out 'b796f7d44681514f58a683a3a71ff17c94edb0c1' 2025-08-14T21:22:53.9962402Z Submodule path 'third_party/opentelemetry-cpp/third_party/ms-gsl': checked out '6f4529395c5b7c2d661812257cd6780c67e54afa' 2025-08-14T21:22:54.0934864Z Submodule path 'third_party/opentelemetry-cpp/third_party/nlohmann-json': checked out 'bc889afb4c5bf1c0d8ee29ef35eaaf4c8bef8a5d' 2025-08-14T21:22:54.1066457Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto': checked out '4ca4f0335c63cda7ab31ea7ed70d6553aee14dce' 2025-08-14T21:22:54.1206294Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp': checked out '06b57f48ded1fa3bdd3d4346f6ef29e40e08eaf5' 2025-08-14T21:22:54.1352770Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp': checked out 'c9ffcdda9086ffd9e1283ea7a0276d831f3c8a8d' 2025-08-14T21:22:54.1364754Z Submodule 'civetweb' (https://github.com/civetweb/civetweb.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:22:54.1365726Z Submodule 'googletest' (https://github.com/google/googletest.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:22:54.1393182Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'... 2025-08-14T21:22:55.9730408Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'... 2025-08-14T21:22:56.2021014Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb': checked out 'eefb26f82b233268fc98577d265352720d477ba4' 2025-08-14T21:22:56.2449879Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest': checked out 'e2239ee6043f73722e7aa812a459f54a28552929' 2025-08-14T21:22:56.5948937Z Submodule path 'third_party/opentelemetry-cpp/tools/vcpkg': checked out '8eb57355a4ffb410a2e94c07b4dca2dffbee8e50' 2025-08-14T21:22:56.6058505Z Submodule path 'third_party/pocketfft': checked out '0fa0ef591e38c2758e3184c6c23e497b9f732ffa' 2025-08-14T21:22:56.8394340Z Submodule path 'third_party/protobuf': checked out 'd1eca4e4b421cd2997495c4b4e65cea6be4e9b8a' 2025-08-14T21:22:56.8419821Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:22:56.8420671Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/protobuf/third_party/googletest' 2025-08-14T21:22:56.8446051Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/benchmark'... 2025-08-14T21:22:57.4087739Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/googletest'... 
2025-08-14T21:22:57.7913025Z Submodule path 'third_party/protobuf/third_party/benchmark': checked out '5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8' 2025-08-14T21:22:57.8564177Z Submodule path 'third_party/protobuf/third_party/googletest': checked out '5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081' 2025-08-14T21:22:57.8657832Z Submodule path 'third_party/psimd': checked out '072586a71b55b7f8c584153d223e95687148a900' 2025-08-14T21:22:57.8777201Z Submodule path 'third_party/pthreadpool': checked out '4fe0e1e183925bf8cfa6aae24237e724a96479b8' 2025-08-14T21:22:57.9120839Z Submodule path 'third_party/pybind11': checked out 'a2e59f0e7065404b44dfe92a28aca47ba1378dc4' 2025-08-14T21:22:57.9371361Z Submodule path 'third_party/python-peachpy': checked out 'f45429b087dd7d5bc78bb40dc7cf06425c252d67' 2025-08-14T21:22:57.9770626Z Submodule path 'third_party/sleef': checked out '5a1d179df9cf652951b59010a2d2075372d67f68' 2025-08-14T21:22:58.0007086Z Submodule path 'third_party/tensorpipe': checked out 'dacda0567d9f23d4bc503e1c4f84aa65f33ac38a' 2025-08-14T21:22:58.0030494Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:22:58.0035368Z Submodule 'third_party/libnop' (https://github.com/google/libnop.git) registered for path 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:22:58.0036354Z Submodule 'third_party/libuv' (https://github.com/libuv/libuv.git) registered for path 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:22:58.0037222Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:22:58.0055728Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/googletest'... 2025-08-14T21:22:58.9848016Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libnop'... 2025-08-14T21:22:58.9848874Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11'... 2025-08-14T21:22:59.0849524Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libuv'... 2025-08-14T21:22:59.2585900Z Submodule path 'third_party/tensorpipe/third_party/googletest': checked out 'aee0f9d9b5b87796ee8a0ab26b7587ec30e8858e' 2025-08-14T21:22:59.2727487Z Submodule path 'third_party/tensorpipe/third_party/libnop': checked out '910b55815be16109f04f4180e9adee14fb4ce281' 2025-08-14T21:22:59.3395502Z Submodule path 'third_party/tensorpipe/third_party/libuv': checked out '5152db2cbfeb5582e9c27c5ea1dba2cd9e10759b' 2025-08-14T21:22:59.3669215Z Submodule path 'third_party/tensorpipe/third_party/pybind11': checked out 'a23996fce38ff6ccfbcdc09f1e63f2c4be5ea2ef' 2025-08-14T21:22:59.3685654Z Submodule 'tools/clang' (https://github.com/wjakob/clang-cindex-python3) registered for path 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:22:59.3712223Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11/tools/clang'... 
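Note: the clone and checkout records above correspond to a recursive submodule initialization of the pytorch/pytorch workspace. The exact invocation is not reproduced in this excerpt, so the following is only an illustrative sketch of the equivalent manual sequence (flags are assumptions, not copied from this log):

    # Illustrative only: sync submodule URLs recorded in .gitmodules for the checked-out workspace
    git -C /home/ec2-user/actions-runner/_work/pytorch/pytorch submodule sync --recursive
    # Illustrative only: clone each submodule and check out its pinned commit, recursing into nested submodules
    git -C /home/ec2-user/actions-runner/_work/pytorch/pytorch submodule update --init --force --recursive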
2025-08-14T21:22:59.5785392Z Submodule path 'third_party/tensorpipe/third_party/pybind11/tools/clang': checked out '6a00cbc4a9b8e68b71caf7f774b3f9c753ae84d5' 2025-08-14T21:22:59.5827700Z [command]/usr/bin/git submodule foreach --recursive git config --local gc.auto 0 2025-08-14T21:22:59.6146849Z Entering 'android/libs/fbjni' 2025-08-14T21:22:59.6185659Z Entering 'third_party/FP16' 2025-08-14T21:22:59.6228562Z Entering 'third_party/FXdiv' 2025-08-14T21:22:59.6273526Z Entering 'third_party/NNPACK' 2025-08-14T21:22:59.6309858Z Entering 'third_party/NVTX' 2025-08-14T21:22:59.6357278Z Entering 'third_party/VulkanMemoryAllocator' 2025-08-14T21:22:59.6402415Z Entering 'third_party/XNNPACK' 2025-08-14T21:22:59.6460502Z Entering 'third_party/aiter' 2025-08-14T21:22:59.6501284Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:22:59.6552795Z Entering 'third_party/benchmark' 2025-08-14T21:22:59.6591107Z Entering 'third_party/composable_kernel' 2025-08-14T21:22:59.6649437Z Entering 'third_party/cpp-httplib' 2025-08-14T21:22:59.6687571Z Entering 'third_party/cpuinfo' 2025-08-14T21:22:59.6728926Z Entering 'third_party/cudnn_frontend' 2025-08-14T21:22:59.6771128Z Entering 'third_party/cutlass' 2025-08-14T21:22:59.6824198Z Entering 'third_party/fbgemm' 2025-08-14T21:22:59.6862447Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T21:22:59.6900534Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:22:59.6953380Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:22:59.6995726Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T21:22:59.7044252Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T21:22:59.7087599Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:22:59.7127272Z Entering 'third_party/fbgemm/external/json' 2025-08-14T21:22:59.7170667Z Entering 'third_party/flash-attention' 2025-08-14T21:22:59.7214850Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:22:59.7261424Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:22:59.7312379Z Entering 'third_party/flatbuffers' 2025-08-14T21:22:59.7355468Z Entering 'third_party/fmt' 2025-08-14T21:22:59.7398572Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:22:59.7440551Z Entering 'third_party/gloo' 2025-08-14T21:22:59.7480667Z Entering 'third_party/googletest' 2025-08-14T21:22:59.7520225Z Entering 'third_party/ideep' 2025-08-14T21:22:59.7560317Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T21:22:59.7607732Z Entering 'third_party/ittapi' 2025-08-14T21:22:59.7647176Z Entering 'third_party/kineto' 2025-08-14T21:22:59.7689296Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:22:59.7735106Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:22:59.7772472Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:22:59.7820416Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:22:59.7865064Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:22:59.7904278Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:22:59.7952276Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:22:59.7990335Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:22:59.8027922Z Entering 
'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:22:59.8065371Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:22:59.8107576Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:22:59.8151585Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:22:59.8194574Z Entering 'third_party/kleidiai' 2025-08-14T21:22:59.8241625Z Entering 'third_party/mimalloc' 2025-08-14T21:22:59.8281934Z Entering 'third_party/nlohmann' 2025-08-14T21:22:59.8324626Z Entering 'third_party/onnx' 2025-08-14T21:22:59.8381806Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T21:22:59.8427165Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T21:22:59.8465088Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:22:59.8507937Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:22:59.8549738Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:22:59.8583492Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:22:59.8631131Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:22:59.8673834Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:22:59.8721510Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:22:59.8757225Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:22:59.8802097Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:22:59.8845599Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:22:59.8902748Z Entering 'third_party/pocketfft' 2025-08-14T21:22:59.8949222Z Entering 'third_party/protobuf' 2025-08-14T21:22:59.8993694Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:22:59.9029513Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T21:22:59.9074680Z Entering 'third_party/psimd' 2025-08-14T21:22:59.9115758Z Entering 'third_party/pthreadpool' 2025-08-14T21:22:59.9160443Z Entering 'third_party/pybind11' 2025-08-14T21:22:59.9200478Z Entering 'third_party/python-peachpy' 2025-08-14T21:22:59.9247738Z Entering 'third_party/sleef' 2025-08-14T21:22:59.9283947Z Entering 'third_party/tensorpipe' 2025-08-14T21:22:59.9327499Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:22:59.9369433Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:22:59.9409290Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:22:59.9460802Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:22:59.9495839Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:22:59.9554803Z ##[endgroup] 2025-08-14T21:22:59.9555198Z ##[group]Persisting credentials for submodules 2025-08-14T21:22:59.9560892Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'url\.https\:\/\/github\.com\/\.insteadOf' && git config --local --unset-all 'url.https://github.com/.insteadOf' || :" 2025-08-14T21:22:59.9893190Z Entering 'android/libs/fbjni' 2025-08-14T21:22:59.9952348Z Entering 'third_party/FP16' 2025-08-14T21:23:00.0007786Z Entering 'third_party/FXdiv' 2025-08-14T21:23:00.0061501Z Entering 'third_party/NNPACK' 2025-08-14T21:23:00.0117493Z Entering 'third_party/NVTX' 2025-08-14T21:23:00.0174063Z Entering 'third_party/VulkanMemoryAllocator' 
2025-08-14T21:23:00.0233375Z Entering 'third_party/XNNPACK' 2025-08-14T21:23:00.0302117Z Entering 'third_party/aiter' 2025-08-14T21:23:00.0357359Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:23:00.0420660Z Entering 'third_party/benchmark' 2025-08-14T21:23:00.0476030Z Entering 'third_party/composable_kernel' 2025-08-14T21:23:00.0543112Z Entering 'third_party/cpp-httplib' 2025-08-14T21:23:00.0597136Z Entering 'third_party/cpuinfo' 2025-08-14T21:23:00.0661764Z Entering 'third_party/cudnn_frontend' 2025-08-14T21:23:00.0716647Z Entering 'third_party/cutlass' 2025-08-14T21:23:00.0785715Z Entering 'third_party/fbgemm' 2025-08-14T21:23:00.0839610Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T21:23:00.0899265Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:23:00.0962083Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:23:00.1016202Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T21:23:00.1079657Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T21:23:00.1134264Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:23:00.1186127Z Entering 'third_party/fbgemm/external/json' 2025-08-14T21:23:00.1243996Z Entering 'third_party/flash-attention' 2025-08-14T21:23:00.1300491Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:23:00.1357563Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:23:00.1423703Z Entering 'third_party/flatbuffers' 2025-08-14T21:23:00.1479495Z Entering 'third_party/fmt' 2025-08-14T21:23:00.1539386Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:23:00.1592603Z Entering 'third_party/gloo' 2025-08-14T21:23:00.1653398Z Entering 'third_party/googletest' 2025-08-14T21:23:00.1704834Z Entering 'third_party/ideep' 2025-08-14T21:23:00.1758997Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T21:23:00.1821357Z Entering 'third_party/ittapi' 2025-08-14T21:23:00.1878101Z Entering 'third_party/kineto' 2025-08-14T21:23:00.1934774Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:23:00.1985991Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:23:00.2045387Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:23:00.2099101Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:23:00.2161340Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:23:00.2212621Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:23:00.2272971Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:23:00.2330731Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:23:00.2386409Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:23:00.2442192Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:23:00.2500208Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:23:00.2555906Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:23:00.2613434Z Entering 'third_party/kleidiai' 2025-08-14T21:23:00.2669272Z Entering 'third_party/mimalloc' 2025-08-14T21:23:00.2728189Z Entering 'third_party/nlohmann' 2025-08-14T21:23:00.2783338Z Entering 'third_party/onnx' 2025-08-14T21:23:00.2848984Z Entering 'third_party/onnx/third_party/pybind11' 
2025-08-14T21:23:00.2903974Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T21:23:00.2965244Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:23:00.3019183Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:23:00.3071911Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:23:00.3124147Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:23:00.3179554Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:23:00.3238026Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:23:00.3287408Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:23:00.3345105Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:23:00.3397439Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:23:00.3458108Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:23:00.3530153Z Entering 'third_party/pocketfft' 2025-08-14T21:23:00.3582582Z Entering 'third_party/protobuf' 2025-08-14T21:23:00.3643822Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:23:00.3692736Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T21:23:00.3750775Z Entering 'third_party/psimd' 2025-08-14T21:23:00.3802593Z Entering 'third_party/pthreadpool' 2025-08-14T21:23:00.3867986Z Entering 'third_party/pybind11' 2025-08-14T21:23:00.3917940Z Entering 'third_party/python-peachpy' 2025-08-14T21:23:00.3976110Z Entering 'third_party/sleef' 2025-08-14T21:23:00.4033459Z Entering 'third_party/tensorpipe' 2025-08-14T21:23:00.4086228Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:23:00.4143417Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:23:00.4196584Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:23:00.4248329Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:23:00.4301968Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:23:00.4374905Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local 'http.https://github.com/.extraheader' 'AUTHORIZATION: basic ***' && git config --local --show-origin --name-only --get-regexp remote.origin.url" 2025-08-14T21:23:00.4685549Z Entering 'android/libs/fbjni' 2025-08-14T21:23:00.4739739Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/android/libs/fbjni/config remote.origin.url 2025-08-14T21:23:00.4752447Z Entering 'third_party/FP16' 2025-08-14T21:23:00.4804127Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FP16/config remote.origin.url 2025-08-14T21:23:00.4823253Z Entering 'third_party/FXdiv' 2025-08-14T21:23:00.4873995Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FXdiv/config remote.origin.url 2025-08-14T21:23:00.4887366Z Entering 'third_party/NNPACK' 2025-08-14T21:23:00.4941076Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK/config remote.origin.url 2025-08-14T21:23:00.4955811Z Entering 'third_party/NVTX' 2025-08-14T21:23:00.5007764Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NVTX/config remote.origin.url 2025-08-14T21:23:00.5027270Z Entering 'third_party/VulkanMemoryAllocator' 2025-08-14T21:23:00.5080278Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/VulkanMemoryAllocator/config remote.origin.url 2025-08-14T21:23:00.5098519Z Entering 'third_party/XNNPACK' 2025-08-14T21:23:00.5153577Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/XNNPACK/config remote.origin.url 2025-08-14T21:23:00.5177525Z Entering 'third_party/aiter' 2025-08-14T21:23:00.5230580Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/aiter/config remote.origin.url 2025-08-14T21:23:00.5246031Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:23:00.5297982Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/aiter/modules/3rdparty/composable_kernel/config remote.origin.url 2025-08-14T21:23:00.5322356Z Entering 'third_party/benchmark' 2025-08-14T21:23:00.5374363Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/benchmark/config remote.origin.url 2025-08-14T21:23:00.5389660Z Entering 'third_party/composable_kernel' 2025-08-14T21:23:00.5440023Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/composable_kernel/config remote.origin.url 2025-08-14T21:23:00.5466399Z Entering 'third_party/cpp-httplib' 2025-08-14T21:23:00.5516442Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpp-httplib/config remote.origin.url 2025-08-14T21:23:00.5535748Z Entering 'third_party/cpuinfo' 2025-08-14T21:23:00.5585289Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpuinfo/config remote.origin.url 2025-08-14T21:23:00.5601567Z Entering 'third_party/cudnn_frontend' 2025-08-14T21:23:00.5657962Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cudnn_frontend/config remote.origin.url 2025-08-14T21:23:00.5668740Z Entering 'third_party/cutlass' 2025-08-14T21:23:00.5720566Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cutlass/config remote.origin.url 2025-08-14T21:23:00.5750046Z Entering 'third_party/fbgemm' 2025-08-14T21:23:00.5800751Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/config remote.origin.url 2025-08-14T21:23:00.5819821Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T21:23:00.5868372Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/asmjit/config remote.origin.url 2025-08-14T21:23:00.5883450Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:23:00.5935297Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/composable_kernel/config remote.origin.url 2025-08-14T21:23:00.5957672Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:23:00.6007848Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/cpuinfo/config remote.origin.url 2025-08-14T21:23:00.6024395Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T21:23:00.6075914Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/cutlass/config remote.origin.url 2025-08-14T21:23:00.6099658Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T21:23:00.6152210Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/googletest/config remote.origin.url 2025-08-14T21:23:00.6168773Z Entering 
'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:23:00.6225070Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/hipify_torch/config remote.origin.url 2025-08-14T21:23:00.6241027Z Entering 'third_party/fbgemm/external/json' 2025-08-14T21:23:00.6288280Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/json/config remote.origin.url 2025-08-14T21:23:00.6312989Z Entering 'third_party/flash-attention' 2025-08-14T21:23:00.6359823Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/config remote.origin.url 2025-08-14T21:23:00.6380050Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:23:00.6431195Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/composable_kernel/config remote.origin.url 2025-08-14T21:23:00.6453519Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:23:00.6501379Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/cutlass/config remote.origin.url 2025-08-14T21:23:00.6524024Z Entering 'third_party/flatbuffers' 2025-08-14T21:23:00.6577609Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flatbuffers/config remote.origin.url 2025-08-14T21:23:00.6595312Z Entering 'third_party/fmt' 2025-08-14T21:23:00.6647959Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fmt/config remote.origin.url 2025-08-14T21:23:00.6660671Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:23:00.6711867Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gemmlowp/gemmlowp/config remote.origin.url 2025-08-14T21:23:00.6725719Z Entering 'third_party/gloo' 2025-08-14T21:23:00.6776354Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gloo/config remote.origin.url 2025-08-14T21:23:00.6795287Z Entering 'third_party/googletest' 2025-08-14T21:23:00.6846330Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.6864345Z Entering 'third_party/ideep' 2025-08-14T21:23:00.6915268Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/config remote.origin.url 2025-08-14T21:23:00.6929594Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T21:23:00.6975921Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/modules/mkl-dnn/config remote.origin.url 2025-08-14T21:23:00.6998742Z Entering 'third_party/ittapi' 2025-08-14T21:23:00.7048187Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ittapi/config remote.origin.url 2025-08-14T21:23:00.7067616Z Entering 'third_party/kineto' 2025-08-14T21:23:00.7114648Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/config remote.origin.url 2025-08-14T21:23:00.7131022Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:23:00.7182889Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/config remote.origin.url 2025-08-14T21:23:00.7195234Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:23:00.7248612Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/DCGM/config remote.origin.url 2025-08-14T21:23:00.7268857Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:23:00.7317261Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/cpr/config remote.origin.url 2025-08-14T21:23:00.7335873Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:23:00.7382263Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/fmt/config remote.origin.url 2025-08-14T21:23:00.7396988Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:23:00.7452468Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/config remote.origin.url 2025-08-14T21:23:00.7469263Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:23:00.7522054Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/modules/doc/config remote.origin.url 2025-08-14T21:23:00.7544314Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:23:00.7590442Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/glog/config remote.origin.url 2025-08-14T21:23:00.7606309Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:23:00.7663191Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.7676867Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:23:00.7728871Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/json/config remote.origin.url 2025-08-14T21:23:00.7744850Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:23:00.7794092Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/pfs/config remote.origin.url 2025-08-14T21:23:00.7812196Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:23:00.7864287Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/fmt/config remote.origin.url 2025-08-14T21:23:00.7881014Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:23:00.7930374Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.7949739Z Entering 'third_party/kleidiai' 2025-08-14T21:23:00.7999373Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kleidiai/config remote.origin.url 2025-08-14T21:23:00.8025422Z Entering 'third_party/mimalloc' 2025-08-14T21:23:00.8074732Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/mimalloc/config remote.origin.url 2025-08-14T21:23:00.8093430Z Entering 'third_party/nlohmann' 2025-08-14T21:23:00.8144342Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/nlohmann/config remote.origin.url 2025-08-14T21:23:00.8159605Z Entering 'third_party/onnx' 2025-08-14T21:23:00.8230654Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/config remote.origin.url 2025-08-14T21:23:00.8261804Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T21:23:00.8311439Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/modules/third_party/pybind11/config remote.origin.url 2025-08-14T21:23:00.8333857Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T21:23:00.8384625Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/config remote.origin.url 2025-08-14T21:23:00.8401140Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:23:00.8455457Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/benchmark/config remote.origin.url 2025-08-14T21:23:00.8470088Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:23:00.8522199Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.8538423Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:23:00.8583253Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/ms-gsl/config remote.origin.url 2025-08-14T21:23:00.8601291Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:23:00.8650968Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/nlohmann-json/config remote.origin.url 2025-08-14T21:23:00.8672043Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:23:00.8721218Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentelemetry-proto/config remote.origin.url 2025-08-14T21:23:00.8736117Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:23:00.8785592Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentracing-cpp/config remote.origin.url 2025-08-14T21:23:00.8800418Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:23:00.8851402Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/config remote.origin.url 2025-08-14T21:23:00.8867840Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:23:00.8922101Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/civetweb/config remote.origin.url 2025-08-14T21:23:00.8942324Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:23:00.8992948Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/googletest/config remote.origin.url 2025-08-14T21:23:00.9009190Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:23:00.9060106Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/tools/vcpkg/config remote.origin.url 2025-08-14T21:23:00.9096600Z Entering 'third_party/pocketfft' 2025-08-14T21:23:00.9146997Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pocketfft/config remote.origin.url 2025-08-14T21:23:00.9164888Z Entering 'third_party/protobuf' 2025-08-14T21:23:00.9220556Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/config remote.origin.url 2025-08-14T21:23:00.9237465Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:23:00.9290437Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/benchmark/config remote.origin.url 2025-08-14T21:23:00.9312277Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T21:23:00.9358563Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.9379235Z Entering 'third_party/psimd' 2025-08-14T21:23:00.9433091Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/psimd/config remote.origin.url 2025-08-14T21:23:00.9450293Z Entering 'third_party/pthreadpool' 2025-08-14T21:23:00.9495827Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/pthreadpool/config remote.origin.url 2025-08-14T21:23:00.9517521Z Entering 'third_party/pybind11' 2025-08-14T21:23:00.9564915Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pybind11/config remote.origin.url 2025-08-14T21:23:00.9578389Z Entering 'third_party/python-peachpy' 2025-08-14T21:23:00.9632561Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/python-peachpy/config remote.origin.url 2025-08-14T21:23:00.9653811Z Entering 'third_party/sleef' 2025-08-14T21:23:00.9700078Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/sleef/config remote.origin.url 2025-08-14T21:23:00.9716936Z Entering 'third_party/tensorpipe' 2025-08-14T21:23:00.9767550Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/config remote.origin.url 2025-08-14T21:23:00.9783405Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:23:00.9834107Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/googletest/config remote.origin.url 2025-08-14T21:23:00.9855196Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:23:00.9900454Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libnop/config remote.origin.url 2025-08-14T21:23:00.9917091Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:23:00.9967400Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libuv/config remote.origin.url 2025-08-14T21:23:00.9979926Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:23:01.0032476Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/config remote.origin.url 2025-08-14T21:23:01.0044518Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:23:01.0096373Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/modules/tools/clang/config remote.origin.url 2025-08-14T21:23:01.1165849Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'git@github.com:' 2025-08-14T21:23:01.1506978Z Entering 'android/libs/fbjni' 2025-08-14T21:23:01.1551035Z Entering 'third_party/FP16' 2025-08-14T21:23:01.1592302Z Entering 'third_party/FXdiv' 2025-08-14T21:23:01.1635803Z Entering 'third_party/NNPACK' 2025-08-14T21:23:01.1675009Z Entering 'third_party/NVTX' 2025-08-14T21:23:01.1718514Z Entering 'third_party/VulkanMemoryAllocator' 2025-08-14T21:23:01.1764144Z Entering 'third_party/XNNPACK' 2025-08-14T21:23:01.1841545Z Entering 'third_party/aiter' 2025-08-14T21:23:01.1860400Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:23:01.1910877Z Entering 'third_party/benchmark' 2025-08-14T21:23:01.1950743Z Entering 'third_party/composable_kernel' 2025-08-14T21:23:01.1998784Z Entering 'third_party/cpp-httplib' 2025-08-14T21:23:01.2044693Z Entering 'third_party/cpuinfo' 2025-08-14T21:23:01.2085420Z Entering 'third_party/cudnn_frontend' 2025-08-14T21:23:01.2125883Z Entering 'third_party/cutlass' 2025-08-14T21:23:01.2175936Z Entering 'third_party/fbgemm' 2025-08-14T21:23:01.2223924Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T21:23:01.2257344Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:23:01.2310740Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:23:01.2347076Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T21:23:01.2394278Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T21:23:01.2442847Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:23:01.2483334Z Entering 'third_party/fbgemm/external/json' 2025-08-14T21:23:01.2525631Z Entering 'third_party/flash-attention' 2025-08-14T21:23:01.2565761Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:23:01.2613063Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:23:01.2665313Z Entering 'third_party/flatbuffers' 2025-08-14T21:23:01.2708999Z Entering 'third_party/fmt' 2025-08-14T21:23:01.2753706Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:23:01.2790678Z Entering 'third_party/gloo' 2025-08-14T21:23:01.2836329Z Entering 'third_party/googletest' 2025-08-14T21:23:01.2878061Z Entering 'third_party/ideep' 2025-08-14T21:23:01.2919101Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T21:23:01.2972715Z Entering 'third_party/ittapi' 2025-08-14T21:23:01.3016317Z Entering 'third_party/kineto' 2025-08-14T21:23:01.3058785Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:23:01.3100618Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:23:01.3147617Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:23:01.3183343Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:23:01.3224805Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:23:01.3272103Z Entering 
'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:23:01.3315838Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:23:01.3355190Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:23:01.3393270Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:23:01.3435811Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:23:01.3477598Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:23:01.3515769Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:23:01.3561231Z Entering 'third_party/kleidiai' 2025-08-14T21:23:01.3604333Z Entering 'third_party/mimalloc' 2025-08-14T21:23:01.3647321Z Entering 'third_party/nlohmann' 2025-08-14T21:23:01.3689682Z Entering 'third_party/onnx' 2025-08-14T21:23:01.3747460Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T21:23:01.3791286Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T21:23:01.3841433Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:23:01.3881349Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:23:01.3921801Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:23:01.3961648Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:23:01.3999559Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:23:01.4048424Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:23:01.4087138Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:23:01.4138062Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:23:01.4175205Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:23:01.4217202Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:23:01.4270402Z Entering 'third_party/pocketfft' 2025-08-14T21:23:01.4315385Z Entering 'third_party/protobuf' 2025-08-14T21:23:01.4357025Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:23:01.4399335Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T21:23:01.4441344Z Entering 'third_party/psimd' 2025-08-14T21:23:01.4484220Z Entering 'third_party/pthreadpool' 2025-08-14T21:23:01.4527877Z Entering 'third_party/pybind11' 2025-08-14T21:23:01.4568367Z Entering 'third_party/python-peachpy' 2025-08-14T21:23:01.4618238Z Entering 'third_party/sleef' 2025-08-14T21:23:01.4656869Z Entering 'third_party/tensorpipe' 2025-08-14T21:23:01.4692108Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:23:01.4733826Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:23:01.4769046Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:23:01.4809627Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:23:01.4853660Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:23:01.4911769Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'org-21003710@github.com:' 2025-08-14T21:23:01.5224827Z Entering 'android/libs/fbjni' 2025-08-14T21:23:01.5260261Z Entering 'third_party/FP16' 2025-08-14T21:23:01.5302532Z Entering 'third_party/FXdiv' 2025-08-14T21:23:01.5346742Z Entering 'third_party/NNPACK' 
2025-08-14T21:23:01.5390334Z Entering 'third_party/NVTX' 2025-08-14T21:23:01.5436685Z Entering 'third_party/VulkanMemoryAllocator' 2025-08-14T21:23:01.5474793Z Entering 'third_party/XNNPACK' 2025-08-14T21:23:01.5528996Z Entering 'third_party/aiter' 2025-08-14T21:23:01.5569873Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T21:23:01.5624086Z Entering 'third_party/benchmark' 2025-08-14T21:23:01.5668889Z Entering 'third_party/composable_kernel' 2025-08-14T21:23:01.5719639Z Entering 'third_party/cpp-httplib' 2025-08-14T21:23:01.5760857Z Entering 'third_party/cpuinfo' 2025-08-14T21:23:01.5804144Z Entering 'third_party/cudnn_frontend' 2025-08-14T21:23:01.5846180Z Entering 'third_party/cutlass' 2025-08-14T21:23:01.5893324Z Entering 'third_party/fbgemm' 2025-08-14T21:23:01.5966871Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T21:23:01.6009031Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T21:23:01.6055309Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T21:23:01.6096122Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T21:23:01.6144320Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T21:23:01.6183192Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T21:23:01.6225352Z Entering 'third_party/fbgemm/external/json' 2025-08-14T21:23:01.6268684Z Entering 'third_party/flash-attention' 2025-08-14T21:23:01.6314451Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T21:23:01.6359653Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T21:23:01.6416020Z Entering 'third_party/flatbuffers' 2025-08-14T21:23:01.6457018Z Entering 'third_party/fmt' 2025-08-14T21:23:01.6498579Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T21:23:01.6543085Z Entering 'third_party/gloo' 2025-08-14T21:23:01.6580792Z Entering 'third_party/googletest' 2025-08-14T21:23:01.6624297Z Entering 'third_party/ideep' 2025-08-14T21:23:01.6663771Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T21:23:01.6715013Z Entering 'third_party/ittapi' 2025-08-14T21:23:01.6759717Z Entering 'third_party/kineto' 2025-08-14T21:23:01.6801202Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T21:23:01.6842871Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T21:23:01.6888351Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T21:23:01.6925660Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T21:23:01.6968170Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T21:23:01.7007939Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T21:23:01.7054989Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T21:23:01.7092062Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T21:23:01.7136010Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T21:23:01.7178054Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T21:23:01.7226554Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T21:23:01.7264986Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T21:23:01.7310924Z Entering 'third_party/kleidiai' 2025-08-14T21:23:01.7356076Z Entering 'third_party/mimalloc' 2025-08-14T21:23:01.7397156Z Entering 'third_party/nlohmann' 
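For reference, the remote-URL rewriting that the two `git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' ...` invocations above perform can be condensed into the standalone sketch below. This is only a sketch of the same idea, not part of the workflow: the checkout path REPO_DIR is a hypothetical placeholder, while the two SSH-style prefixes are the ones that appear in the log.

#!/usr/bin/env bash
# Sketch only: mirror the insteadOf rewrites applied by the workflow above.
# REPO_DIR is an assumed checkout location, not taken from the log.
set -euo pipefail
cd "${REPO_DIR:-$HOME/pytorch}"

for prefix in 'git@github.com:' 'org-21003710@github.com:'; do
  # Rewrite SSH-style remotes to anonymous HTTPS in the superproject...
  git config --local --add 'url.https://github.com/.insteadOf' "${prefix}"
  # ...and in every submodule, recursively, matching the foreach calls above.
  git submodule foreach --recursive \
    git config --local --add 'url.https://github.com/.insteadOf' "${prefix}"
done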
2025-08-14T21:23:01.7444176Z Entering 'third_party/onnx' 2025-08-14T21:23:01.7501249Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T21:23:01.7547717Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T21:23:01.7586813Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T21:23:01.7634335Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T21:23:01.7675060Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T21:23:01.7713446Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T21:23:01.7757244Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T21:23:01.7799283Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T21:23:01.7845059Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T21:23:01.7883119Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T21:23:01.7927692Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T21:23:01.7969677Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T21:23:01.8030841Z Entering 'third_party/pocketfft' 2025-08-14T21:23:01.8083217Z Entering 'third_party/protobuf' 2025-08-14T21:23:01.8124516Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T21:23:01.8171465Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T21:23:01.8217277Z Entering 'third_party/psimd' 2025-08-14T21:23:01.8255585Z Entering 'third_party/pthreadpool' 2025-08-14T21:23:01.8296572Z Entering 'third_party/pybind11' 2025-08-14T21:23:01.8340821Z Entering 'third_party/python-peachpy' 2025-08-14T21:23:01.8384819Z Entering 'third_party/sleef' 2025-08-14T21:23:01.8426614Z Entering 'third_party/tensorpipe' 2025-08-14T21:23:01.8469332Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T21:23:01.8511673Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T21:23:01.8555595Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T21:23:01.8593472Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T21:23:01.8634191Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T21:23:01.8691400Z ##[endgroup] 2025-08-14T21:23:01.8726511Z [command]/usr/bin/git log -1 --format=%H 2025-08-14T21:23:01.8751324Z 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:23:01.8917808Z Prepare all required actions 2025-08-14T21:23:01.8918346Z Getting action download info 2025-08-14T21:23:02.0118928Z ##[group]Run ./.github/actions/setup-linux 2025-08-14T21:23:02.0119214Z env: 2025-08-14T21:23:02.0119416Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:02.0119635Z ##[endgroup] 2025-08-14T21:23:02.0155694Z ##[group]Run set -euo pipefail 2025-08-14T21:23:02.0155978Z set -euo pipefail 2025-08-14T21:23:02.0156196Z function get_ec2_metadata() { 2025-08-14T21:23:02.0156458Z  # Pulled from instance metadata endpoint for EC2 2025-08-14T21:23:02.0156907Z  # see https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-retrieval.html 2025-08-14T21:23:02.0157276Z  category=$1 2025-08-14T21:23:02.0157529Z  # If it is GCP runner (runner name contains gcp), do not run this 2025-08-14T21:23:02.0157829Z  runner_name_str=i-0115c72a6ef255e70 2025-08-14T21:23:02.0158125Z  if [[ -f /.inarc ]]; then 2025-08-14T21:23:02.0158367Z  echo "ARC Runner, no info on ec2 metadata" 2025-08-14T21:23:02.0158635Z  elif [[ $runner_name_str == *"gcp"* ]]; then 
2025-08-14T21:23:02.0158948Z  echo "Runner is from Google Cloud Platform, No info on ec2 metadata" 2025-08-14T21:23:02.0159223Z  else 2025-08-14T21:23:02.0159783Z  curl -H "X-aws-ec2-metadata-token: $(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")" -fsSL "http://169.254.169.254/latest/meta-data/${category}" 2025-08-14T21:23:02.0160483Z  fi 2025-08-14T21:23:02.0160650Z } 2025-08-14T21:23:02.0160846Z echo "ami-id: $(get_ec2_metadata ami-id)" 2025-08-14T21:23:02.0161140Z echo "instance-id: $(get_ec2_metadata instance-id)" 2025-08-14T21:23:02.0161460Z echo "instance-type: $(get_ec2_metadata instance-type)" 2025-08-14T21:23:02.0161742Z echo "system info $(uname -a)" 2025-08-14T21:23:02.0170621Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:02.0170890Z env: 2025-08-14T21:23:02.0171065Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:02.0171254Z ##[endgroup] 2025-08-14T21:23:02.0326518Z ami-id: ami-05ffe3c48a9991133 2025-08-14T21:23:02.0428149Z instance-id: i-0115c72a6ef255e70 2025-08-14T21:23:02.0519228Z instance-type: m7i-flex.8xlarge 2025-08-14T21:23:02.0533605Z system info Linux ip-10-0-39-154.ec2.internal 6.1.141-155.222.amzn2023.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Jun 17 10:29:47 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux 2025-08-14T21:23:02.0550739Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-08-14T21:23:02.0551349Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-08-14T21:23:02.0556489Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:02.0556768Z env: 2025-08-14T21:23:02.0556937Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:02.0557126Z ##[endgroup] 2025-08-14T21:23:02.0604691Z ##[group]Run if systemctl is-active --quiet docker; then 2025-08-14T21:23:02.0605024Z if systemctl is-active --quiet docker; then 2025-08-14T21:23:02.0605292Z  echo "Docker daemon is running..."; 2025-08-14T21:23:02.0605509Z else 2025-08-14T21:23:02.0605755Z  echo "Starting docker daemon..." && sudo systemctl start docker; 2025-08-14T21:23:02.0606036Z fi 2025-08-14T21:23:02.0611694Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:02.0612074Z env: 2025-08-14T21:23:02.0612308Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:02.0612492Z ##[endgroup] 2025-08-14T21:23:02.0737364Z Docker daemon is running... 2025-08-14T21:23:02.0770907Z ##[group]Run nick-fields/retry@v3.0.0 2025-08-14T21:23:02.0771143Z with: 2025-08-14T21:23:02.0771310Z shell: bash 2025-08-14T21:23:02.0771644Z timeout_minutes: 5 2025-08-14T21:23:02.0771835Z max_attempts: 3 2025-08-14T21:23:02.0772011Z retry_wait_seconds: 30 2025-08-14T21:23:02.0773474Z command: AWS_ACCOUNT_ID=$(aws sts get-caller-identity|grep Account|cut -f4 -d\") aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" # For LF Runners we need to make sure we also login to Meta's ECR docker registry too. 
META_AWS_ACCOUNT_ID=308535385114 if [ "$AWS_ACCOUNT_ID" != "$META_AWS_ACCOUNT_ID" ] ; then aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$META_AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" fi 2025-08-14T21:23:02.0774897Z polling_interval_seconds: 1 2025-08-14T21:23:02.0775106Z warning_on_retry: true 2025-08-14T21:23:02.0775301Z continue_on_error: false 2025-08-14T21:23:02.0775493Z env: 2025-08-14T21:23:02.0775664Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:02.0775863Z AWS_RETRY_MODE: standard 2025-08-14T21:23:02.0776048Z AWS_MAX_ATTEMPTS: 5 2025-08-14T21:23:02.0776245Z AWS_DEFAULT_REGION: us-east-1 2025-08-14T21:23:02.0776452Z ##[endgroup] 2025-08-14T21:23:03.1639320Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-08-14T21:23:03.1642368Z Configure a credential helper to remove this warning. See 2025-08-14T21:23:03.1642830Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-08-14T21:23:03.1643084Z 2025-08-14T21:23:03.1643517Z Login Succeeded 2025-08-14T21:23:03.2207031Z Command completed after 1 attempt(s). 2025-08-14T21:23:03.2262113Z ##[group]Run env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:23:03.2262493Z env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:23:03.2262808Z env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:23:03.2269367Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:03.2269641Z env: 2025-08-14T21:23:03.2269819Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.2270006Z ##[endgroup] 2025-08-14T21:23:03.2344185Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-08-14T21:23:03.2344596Z # ignore expansion of "docker ps -q" since it could be empty 2025-08-14T21:23:03.2344897Z # shellcheck disable=SC2046 2025-08-14T21:23:03.2345148Z docker stop $(docker ps -q) || true 2025-08-14T21:23:03.2345391Z # Prune all of the docker images 2025-08-14T21:23:03.2345645Z docker system prune -af 2025-08-14T21:23:03.2350341Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:03.2350593Z env: 2025-08-14T21:23:03.2350756Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.2350943Z ##[endgroup] 2025-08-14T21:23:03.2861708Z "docker stop" requires at least 1 argument. 2025-08-14T21:23:03.2862061Z See 'docker stop --help'. 2025-08-14T21:23:03.2862226Z 2025-08-14T21:23:03.2862367Z Usage: docker stop [OPTIONS] CONTAINER [CONTAINER...] 2025-08-14T21:23:03.2862554Z 2025-08-14T21:23:03.2862640Z Stop one or more running containers 2025-08-14T21:23:03.3074474Z Total reclaimed space: 0B 2025-08-14T21:23:03.3103001Z ##[group]Run set +e 2025-08-14T21:23:03.3103240Z set +e 2025-08-14T21:23:03.3103416Z set -x 2025-08-14T21:23:03.3103586Z  2025-08-14T21:23:03.3103769Z PT_DOMAIN=download.pytorch.org 2025-08-14T21:23:03.3104173Z # TODO: Flaky access to download.pytorch.org https://github.com/pytorch/pytorch/issues/100400, 2025-08-14T21:23:03.3104655Z # cleaning this up once the issue is fixed. There are more than one resolved IP here, the last 2025-08-14T21:23:03.3105007Z # one is returned at random 2025-08-14T21:23:03.3105310Z RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" | tail -n1) 2025-08-14T21:23:03.3105567Z  2025-08-14T21:23:03.3105882Z if [ -z "${RESOLVED_IP}" ]; then 2025-08-14T21:23:03.3106176Z  echo "Couldn't resolve ${PT_DOMAIN}, retrying with Google DNS..." 
2025-08-14T21:23:03.3106526Z  RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" @8.8.8.8 | tail -n1) 2025-08-14T21:23:03.3106792Z  2025-08-14T21:23:03.3106974Z  if [ -z "${RESOLVED_IP}" ]; then 2025-08-14T21:23:03.3107236Z  echo "Couldn't resolve ${PT_DOMAIN}, exiting..." 2025-08-14T21:23:03.3107491Z  exit 1 2025-08-14T21:23:03.3107668Z  fi 2025-08-14T21:23:03.3107827Z fi 2025-08-14T21:23:03.3107990Z  2025-08-14T21:23:03.3108183Z if grep -r "${PT_DOMAIN}" /etc/hosts; then 2025-08-14T21:23:03.3108432Z  # Clean up any old records first 2025-08-14T21:23:03.3108681Z  sudo sed -i "/${PT_DOMAIN}/d" /etc/hosts 2025-08-14T21:23:03.3108907Z fi 2025-08-14T21:23:03.3109061Z  2025-08-14T21:23:03.3109279Z echo "${RESOLVED_IP} ${PT_DOMAIN}" | sudo tee -a /etc/hosts 2025-08-14T21:23:03.3109546Z cat /etc/hosts 2025-08-14T21:23:03.3114592Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:03.3114841Z env: 2025-08-14T21:23:03.3115009Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.3115197Z ##[endgroup] 2025-08-14T21:23:03.3139745Z + PT_DOMAIN=download.pytorch.org 2025-08-14T21:23:03.3146912Z ++ tail -n1 2025-08-14T21:23:03.3147174Z ++ dig -4 +short download.pytorch.org 2025-08-14T21:23:03.3633547Z + RESOLVED_IP=18.160.10.28 2025-08-14T21:23:03.3634312Z + '[' -z 18.160.10.28 ']' 2025-08-14T21:23:03.3635000Z + grep -r download.pytorch.org /etc/hosts 2025-08-14T21:23:03.3649748Z + echo '18.160.10.28 download.pytorch.org' 2025-08-14T21:23:03.3650325Z + sudo tee -a /etc/hosts 2025-08-14T21:23:03.7341608Z 18.160.10.28 download.pytorch.org 2025-08-14T21:23:03.7359146Z + cat /etc/hosts 2025-08-14T21:23:03.7368245Z 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4 2025-08-14T21:23:03.7374938Z ::1 localhost6 localhost6.localdomain6 2025-08-14T21:23:03.7375248Z 18.160.10.28 download.pytorch.org 2025-08-14T21:23:03.7485119Z ##[group]Run pytorch/test-infra/.github/actions/calculate-docker-image@main 2025-08-14T21:23:03.7485473Z with: 2025-08-14T21:23:03.7486073Z docker-image-name: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7486740Z use-custom-docker-registry: true 2025-08-14T21:23:03.7486997Z docker-build-dir: .ci/docker 2025-08-14T21:23:03.7487242Z docker-build-script: ./build.sh 2025-08-14T21:23:03.7487457Z working-directory: . 2025-08-14T21:23:03.7487719Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:03.7488013Z force-push: false 2025-08-14T21:23:03.7488186Z env: 2025-08-14T21:23:03.7488357Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.7488555Z ##[endgroup] 2025-08-14T21:23:03.7503610Z ##[group]Run set -ex 2025-08-14T21:23:03.7503846Z set -ex 2025-08-14T21:23:03.7504019Z  2025-08-14T21:23:03.7504359Z # If the docker build directory or the build script doesn't exist, the action will 2025-08-14T21:23:03.7504786Z # gracefully return the docker image name as it is. 
Pulling docker image in Linux 2025-08-14T21:23:03.7505167Z # job could then download the pre-built image as usual 2025-08-14T21:23:03.7505623Z if [[ -d "${DOCKER_BUILD_DIR}" ]] && [[ -f "${DOCKER_BUILD_DIR}/${DOCKER_BUILD_SCRIPT}" ]] && [[ "${USE_CUSTOM_DOCKER_REGISTRY}" == "true" ]]; then 2025-08-14T21:23:03.7506053Z  echo "skip=false" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7506281Z else 2025-08-14T21:23:03.7506485Z  echo "skip=true" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7506799Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7507067Z  2025-08-14T21:23:03.7507436Z  echo "Not using custom ECR registry. Either it was not requested or there is no Docker build script in the ${REPO_NAME} repo..." 2025-08-14T21:23:03.7507835Z  exit 0 2025-08-14T21:23:03.7508001Z fi 2025-08-14T21:23:03.7508153Z  2025-08-14T21:23:03.7508396Z if [[ "${DOCKER_IMAGE_NAME}" == *"${DOCKER_REGISTRY}/${REPO_NAME}"* ]]; then 2025-08-14T21:23:03.7508783Z  # The docker image name already includes the ECR prefix and tag, so we can just 2025-08-14T21:23:03.7509132Z  # use it as it is, but first let's extract the tag 2025-08-14T21:23:03.7509463Z  DOCKER_TAG=$(echo "${DOCKER_IMAGE_NAME}" | awk -F '[:,]' '{print $2}') 2025-08-14T21:23:03.7509802Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7510127Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7510387Z else 2025-08-14T21:23:03.7510587Z  if [[ "${DOCKER_IMAGE_NAME}" == *:* ]]; then 2025-08-14T21:23:03.7510855Z  CUSTOM_TAG_PREFIX=${DOCKER_IMAGE_NAME#*:} 2025-08-14T21:23:03.7511117Z  DOCKER_IMAGE_NAME=${DOCKER_IMAGE_NAME%%:*} 2025-08-14T21:23:03.7511350Z  fi 2025-08-14T21:23:03.7511655Z  DOCKER_TAG=${CUSTOM_TAG_PREFIX:+${CUSTOM_TAG_PREFIX}-}$(git rev-parse HEAD:"${DOCKER_BUILD_DIR}") 2025-08-14T21:23:03.7512053Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7512455Z  echo "docker-image=${DOCKER_REGISTRY}/${REPO_NAME}/${DOCKER_IMAGE_NAME}:${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7513021Z  echo "custom-tag-prefix=${CUSTOM_TAG_PREFIX}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7513311Z fi 2025-08-14T21:23:03.7521359Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:03.7521615Z env: 2025-08-14T21:23:03.7521786Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.7521975Z REPO_NAME: pytorch 2025-08-14T21:23:03.7522653Z DOCKER_IMAGE_NAME: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7523225Z DOCKER_BUILD_DIR: .ci/docker 2025-08-14T21:23:03.7523441Z DOCKER_BUILD_SCRIPT: ./build.sh 2025-08-14T21:23:03.7523708Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:03.7523989Z USE_CUSTOM_DOCKER_REGISTRY: true 2025-08-14T21:23:03.7524205Z CUSTOM_TAG_PREFIX: 2025-08-14T21:23:03.7524384Z ##[endgroup] 2025-08-14T21:23:03.7555515Z + [[ -d .ci/docker ]] 2025-08-14T21:23:03.7555777Z + [[ -f .ci/docker/./build.sh ]] 2025-08-14T21:23:03.7555990Z + [[ true == \t\r\u\e ]] 2025-08-14T21:23:03.7556181Z + echo skip=false 2025-08-14T21:23:03.7556892Z + [[ 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe == *\3\0\8\5\3\5\3\8\5\1\1\4\.\d\k\r\.\e\c\r\.\u\s\-\e\a\s\t\-\1\.\a\m\a\z\o\n\a\w\s\.\c\o\m\/\p\y\t\o\r\c\h* ]] 2025-08-14T21:23:03.7562430Z ++ echo 
308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7563051Z ++ awk -F '[:,]' '{print $2}' 2025-08-14T21:23:03.7589753Z + DOCKER_TAG=pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7590959Z + echo docker-tag=pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7594608Z + echo docker-image=308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7616752Z ##[group]Run set +e 2025-08-14T21:23:03.7616991Z set +e 2025-08-14T21:23:03.7617165Z set -x 2025-08-14T21:23:03.7617329Z  2025-08-14T21:23:03.7617479Z login() { 2025-08-14T21:23:03.7617803Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-08-14T21:23:03.7618153Z } 2025-08-14T21:23:03.7618305Z  2025-08-14T21:23:03.7618461Z retry () { 2025-08-14T21:23:03.7618659Z  $* || (sleep 1 && $*) || (sleep 2 && $*) 2025-08-14T21:23:03.7618870Z } 2025-08-14T21:23:03.7619024Z  2025-08-14T21:23:03.7619194Z retry login "${DOCKER_REGISTRY}" 2025-08-14T21:23:03.7619407Z  2025-08-14T21:23:03.7619559Z START_TIME=$(date +%s) 2025-08-14T21:23:03.7619781Z # Wait up to 120 minutes 2025-08-14T21:23:03.7620043Z while [[ $(( $(date +%s) - 7200 )) -lt $START_TIME ]]; do 2025-08-14T21:23:03.7620353Z  # Check if image already exists, if it does then skip building it 2025-08-14T21:23:03.7620674Z  if docker manifest inspect "${DOCKER_IMAGE}"; then 2025-08-14T21:23:03.7620913Z  exit 0 2025-08-14T21:23:03.7621072Z  fi 2025-08-14T21:23:03.7621228Z  2025-08-14T21:23:03.7621492Z  # NB: This flag is used by Docker build workflow to push the image to ECR, so we can 2025-08-14T21:23:03.7621905Z  # use this to differentiate between the Docker build and regular build jobs. For the 2025-08-14T21:23:03.7622314Z  # latter, it will wait for the Docker images to become available before continuing 2025-08-14T21:23:03.7622666Z  if [ "${DOCKER_PUSH:-false}" == "true" ]; then 2025-08-14T21:23:03.7622942Z  # It's a Docker build job, let's build the image 2025-08-14T21:23:03.7623282Z  break 2025-08-14T21:23:03.7623451Z  else 2025-08-14T21:23:03.7623697Z  # It's a regular build job, wait for the image to become available 2025-08-14T21:23:03.7623966Z  sleep 300 2025-08-14T21:23:03.7624137Z  fi 2025-08-14T21:23:03.7624297Z done 2025-08-14T21:23:03.7624454Z  2025-08-14T21:23:03.7624689Z # NB: This part requires a full checkout. Otherwise, the merge base will 2025-08-14T21:23:03.7625152Z # be empty. 
The default action would be to continue rebuild the image 2025-08-14T21:23:03.7625485Z if [[ "$BASE_REVISION" = "$(git rev-parse HEAD)" ]]; then 2025-08-14T21:23:03.7625784Z  # if we're on the base branch then use the parent commit 2025-08-14T21:23:03.7626047Z  MERGE_BASE=$(git rev-parse HEAD~) 2025-08-14T21:23:03.7626260Z else 2025-08-14T21:23:03.7626487Z  # otherwise we're on a PR, so use the most recent base commit 2025-08-14T21:23:03.7626797Z  MERGE_BASE=$(git merge-base HEAD "$BASE_REVISION") 2025-08-14T21:23:03.7627034Z fi 2025-08-14T21:23:03.7627188Z  2025-08-14T21:23:03.7627358Z if [[ -z "${MERGE_BASE}" ]]; then 2025-08-14T21:23:03.7627599Z  echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7627827Z  2025-08-14T21:23:03.7628138Z  echo "Finding merge base only works with full checkout, please set fetch-depth to 0, continuing ..." 2025-08-14T21:23:03.7628473Z  exit 0 2025-08-14T21:23:03.7628640Z fi 2025-08-14T21:23:03.7628793Z  2025-08-14T21:23:03.7629002Z if ! git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}"; then 2025-08-14T21:23:03.7629427Z  echo "Directory '${DOCKER_BUILD_DIR}' not found in commit $MERGE_BASE, you should rebase onto a more recent commit" 2025-08-14T21:23:03.7629789Z  exit 1 2025-08-14T21:23:03.7629956Z fi 2025-08-14T21:23:03.7630101Z  2025-08-14T21:23:03.7630346Z PREVIOUS_DOCKER_TAG=$(git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}") 2025-08-14T21:23:03.7630756Z # If no image exists but the hash is the same as the previous hash then we should error out here 2025-08-14T21:23:03.7631128Z if [[ "${PREVIOUS_DOCKER_TAG}" == "${DOCKER_TAG}" ]]; then 2025-08-14T21:23:03.7631545Z  echo "WARNING: Something has gone wrong and the previous image isn't available for the merge-base of your branch" 2025-08-14T21:23:03.7632014Z  echo " Will re-build docker image to store in local cache, TTS may be longer" 2025-08-14T21:23:03.7632307Z fi 2025-08-14T21:23:03.7632457Z  2025-08-14T21:23:03.7632653Z echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-08-14T21:23:03.7637892Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:03.7638158Z env: 2025-08-14T21:23:03.7638329Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:03.7638641Z DOCKER_BUILD_DIR: .ci/docker 2025-08-14T21:23:03.7638905Z BASE_REVISION: 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:23:03.7639527Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7640297Z DOCKER_TAG: pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:03.7640771Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:03.7641036Z DOCKER_PUSH: 2025-08-14T21:23:03.7641213Z ##[endgroup] 2025-08-14T21:23:03.7662975Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:03.7663313Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:03.7669369Z + aws ecr get-login-password --region us-east-1 2025-08-14T21:23:03.7669779Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:04.2422840Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-08-14T21:23:04.2423309Z Login Succeeded 2025-08-14T21:23:04.2423565Z Configure a credential helper to remove this warning. 
See 2025-08-14T21:23:04.2423956Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-08-14T21:23:04.2424221Z 2025-08-14T21:23:04.2446240Z ++ date +%s 2025-08-14T21:23:04.2457017Z + START_TIME=1755206584 2025-08-14T21:23:04.2459169Z ++ date +%s 2025-08-14T21:23:04.2467961Z + [[ 1755199384 -lt 1755206584 ]] 2025-08-14T21:23:04.2468658Z + docker manifest inspect 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:04.4853900Z { 2025-08-14T21:23:04.4854552Z "schemaVersion": 2, 2025-08-14T21:23:04.4854906Z "mediaType": "application/vnd.docker.distribution.manifest.v2+json", 2025-08-14T21:23:04.4855223Z "config": { 2025-08-14T21:23:04.4855509Z "mediaType": "application/vnd.docker.container.image.v1+json", 2025-08-14T21:23:04.4857943Z "size": 30151, 2025-08-14T21:23:04.4858277Z "digest": "sha256:0899ae453036ee7a91795ea95b1db61000579eeb74b140edab5976919ee64bbe" 2025-08-14T21:23:04.4858592Z }, 2025-08-14T21:23:04.4858754Z "layers": [ 2025-08-14T21:23:04.4858915Z { 2025-08-14T21:23:04.4859143Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4859424Z "size": 30448173, 2025-08-14T21:23:04.4859731Z "digest": "sha256:660ffc76f83b006444a5731b215acc2e35138d8be5cac8ed1ffd40f947117495" 2025-08-14T21:23:04.4860031Z }, 2025-08-14T21:23:04.4860174Z { 2025-08-14T21:23:04.4860395Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4860675Z "size": 1554, 2025-08-14T21:23:04.4860946Z "digest": "sha256:c7b4a852a45516e27a9256df90878663d770f96d271d6155d43be78cc5225eef" 2025-08-14T21:23:04.4861243Z }, 2025-08-14T21:23:04.4861380Z { 2025-08-14T21:23:04.4861604Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4861882Z "size": 313280151, 2025-08-14T21:23:04.4862160Z "digest": "sha256:e5a28988c8932eb5797557621582a064ce48651dbb5eaed379e9978535daccb9" 2025-08-14T21:23:04.4862449Z }, 2025-08-14T21:23:04.4862590Z { 2025-08-14T21:23:04.4862808Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4863077Z "size": 793, 2025-08-14T21:23:04.4863359Z "digest": "sha256:76a69b57b6837bef07dbc1b481cf28a62dfd7c7063219d9f6e0d0d63067653c7" 2025-08-14T21:23:04.4863693Z }, 2025-08-14T21:23:04.4863825Z { 2025-08-14T21:23:04.4864043Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4864320Z "size": 106, 2025-08-14T21:23:04.4864599Z "digest": "sha256:5c785dcb4cdbf1f2ceffe4d1d8e85d73225a56d0236e7ed6e36a95c836996052" 2025-08-14T21:23:04.4864894Z }, 2025-08-14T21:23:04.4865120Z { 2025-08-14T21:23:04.4865334Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4865602Z "size": 704, 2025-08-14T21:23:04.4865871Z "digest": "sha256:836ab08052e8eb2bae68e69ae086fd23a5f04a8491c320718ab47f84f03aebb1" 2025-08-14T21:23:04.4866165Z }, 2025-08-14T21:23:04.4866295Z { 2025-08-14T21:23:04.4866512Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4866781Z "size": 1217, 2025-08-14T21:23:04.4867070Z "digest": "sha256:53b11c77468cbefca210560f7d8be8e58f9eeb415e096ab0c3fb0277f0b41caf" 2025-08-14T21:23:04.4867375Z }, 2025-08-14T21:23:04.4867519Z { 2025-08-14T21:23:04.4867739Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4868016Z "size": 485, 2025-08-14T21:23:04.4868293Z "digest": 
"sha256:e97311a6a967664cbe10c5027a1ec60c514caa9a1160167d8363088fd1f9fe09" 2025-08-14T21:23:04.4868594Z }, 2025-08-14T21:23:04.4868737Z { 2025-08-14T21:23:04.4868955Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4869582Z "size": 110343699, 2025-08-14T21:23:04.4869861Z "digest": "sha256:2c414689d31dc46a22fe02d4f43699f528cc1c02fb505824768383fa0bbf1c74" 2025-08-14T21:23:04.4870161Z }, 2025-08-14T21:23:04.4870303Z { 2025-08-14T21:23:04.4870515Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4870791Z "size": 4817, 2025-08-14T21:23:04.4871074Z "digest": "sha256:6d89b5f065d59e4abcaa9b5ff3bf0afded2394d493d2df0f7babf7154f7548e0" 2025-08-14T21:23:04.4871492Z }, 2025-08-14T21:23:04.4871627Z { 2025-08-14T21:23:04.4871846Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4872122Z "size": 1709, 2025-08-14T21:23:04.4872409Z "digest": "sha256:5a5cc76ada432cccf7d18e0eb79379afb95deaaa7afec482406267924d291ae4" 2025-08-14T21:23:04.4872725Z }, 2025-08-14T21:23:04.4872867Z { 2025-08-14T21:23:04.4873079Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4873353Z "size": 724, 2025-08-14T21:23:04.4873630Z "digest": "sha256:fc6b37d40530f2c5339430321eab67ae1e2e87e997587c7bc8c41504464208f9" 2025-08-14T21:23:04.4873924Z }, 2025-08-14T21:23:04.4874129Z { 2025-08-14T21:23:04.4874365Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4874697Z "size": 542, 2025-08-14T21:23:04.4874961Z "digest": "sha256:2e16579078600b91216fd14aca1e0ce0f9d1801b230689dd309980e8d2783935" 2025-08-14T21:23:04.4875256Z }, 2025-08-14T21:23:04.4875393Z { 2025-08-14T21:23:04.4875612Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4875884Z "size": 3397512507, 2025-08-14T21:23:04.4876174Z "digest": "sha256:7b92d7a4b8c766d7b7873aa33088e171fb44a8e968645e4b31dfe6de2968aead" 2025-08-14T21:23:04.4876466Z }, 2025-08-14T21:23:04.4876601Z { 2025-08-14T21:23:04.4876820Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4877095Z "size": 32, 2025-08-14T21:23:04.4877368Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4877679Z }, 2025-08-14T21:23:04.4877820Z { 2025-08-14T21:23:04.4878037Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4878315Z "size": 380, 2025-08-14T21:23:04.4878594Z "digest": "sha256:d6226eb61f823984003d5ac28f4d66fec9b27baf5d54a9513286483f5912cd88" 2025-08-14T21:23:04.4878897Z }, 2025-08-14T21:23:04.4879034Z { 2025-08-14T21:23:04.4879257Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4879519Z "size": 234681, 2025-08-14T21:23:04.4879801Z "digest": "sha256:83c70f4266a6ee5f8f44a88d4cb951382f6c960323b8250046bddc080e62268b" 2025-08-14T21:23:04.4880102Z }, 2025-08-14T21:23:04.4880234Z { 2025-08-14T21:23:04.4880455Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4880722Z "size": 231, 2025-08-14T21:23:04.4880996Z "digest": "sha256:60c725d21861c24c417efe3a5474414ba04f0f49c78c6d6451478ab9e45469ec" 2025-08-14T21:23:04.4881292Z }, 2025-08-14T21:23:04.4881429Z { 2025-08-14T21:23:04.4881651Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4881918Z "size": 4464546, 2025-08-14T21:23:04.4882212Z "digest": "sha256:a504e76e66a49926b4ea837b7a7ff3c842a27b2caaa4d80cf5057a1e55293666" 
2025-08-14T21:23:04.4882527Z }, 2025-08-14T21:23:04.4882662Z { 2025-08-14T21:23:04.4882892Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4883168Z "size": 1864, 2025-08-14T21:23:04.4883459Z "digest": "sha256:fc1c200a4f77face2af0146f9b03ad04f31fe06fec216473ffd2ebd538cde056" 2025-08-14T21:23:04.4883771Z }, 2025-08-14T21:23:04.4883913Z { 2025-08-14T21:23:04.4884129Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4884404Z "size": 475, 2025-08-14T21:23:04.4884681Z "digest": "sha256:43273c22704f81f162741d2039015f745273eee1d1fdec47be35c9b2a90dcc5b" 2025-08-14T21:23:04.4885040Z }, 2025-08-14T21:23:04.4885175Z { 2025-08-14T21:23:04.4885395Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4885667Z "size": 178, 2025-08-14T21:23:04.4885940Z "digest": "sha256:89df389d042adbd7621a94d36b6e3db60ff6c559efb95c6fcc11b8afd42f0599" 2025-08-14T21:23:04.4886250Z }, 2025-08-14T21:23:04.4886396Z { 2025-08-14T21:23:04.4886611Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4886933Z "size": 586, 2025-08-14T21:23:04.4887207Z "digest": "sha256:684349f50d9456597026ee5c1bd890c51d1e498614f367adf03329c5227add79" 2025-08-14T21:23:04.4887535Z }, 2025-08-14T21:23:04.4887680Z { 2025-08-14T21:23:04.4887896Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4888165Z "size": 218, 2025-08-14T21:23:04.4888442Z "digest": "sha256:21d0eae87fb3ac753b3f0e91ae638360d23922d4cd119410a5a1b97bbe0ca435" 2025-08-14T21:23:04.4889082Z }, 2025-08-14T21:23:04.4889234Z { 2025-08-14T21:23:04.4889460Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4889733Z "size": 802, 2025-08-14T21:23:04.4890008Z "digest": "sha256:c9c2b424b8e08d943dc259a3796d66eede3a1e93a6460df5db132c0036d3d6af" 2025-08-14T21:23:04.4890317Z }, 2025-08-14T21:23:04.4890458Z { 2025-08-14T21:23:04.4890672Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4890946Z "size": 32, 2025-08-14T21:23:04.4891231Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4891529Z }, 2025-08-14T21:23:04.4891672Z { 2025-08-14T21:23:04.4891898Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4892165Z "size": 104, 2025-08-14T21:23:04.4892435Z "digest": "sha256:98dda28f339592e3ca6d589d551e69b8314f2b7fc2a1544eacc1b3c2d3378521" 2025-08-14T21:23:04.4892738Z }, 2025-08-14T21:23:04.4892878Z { 2025-08-14T21:23:04.4893096Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4893364Z "size": 1496, 2025-08-14T21:23:04.4893644Z "digest": "sha256:acf5babd87f23aa905883eb434073e9a00ff41679134f2f4827dd86949f5a9d9" 2025-08-14T21:23:04.4893943Z }, 2025-08-14T21:23:04.4894085Z { 2025-08-14T21:23:04.4894306Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4894571Z "size": 453555614, 2025-08-14T21:23:04.4894864Z "digest": "sha256:7c5050d8408d3c4f9f5e8f2cb215245473bfc2f1510fe5ee01c2a6c505068b5a" 2025-08-14T21:23:04.4895167Z }, 2025-08-14T21:23:04.4895298Z { 2025-08-14T21:23:04.4895566Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4895845Z "size": 163, 2025-08-14T21:23:04.4896125Z "digest": "sha256:7ddd14e2b548b9ae6e216a081bb20116434aacbbe571c99b40e60fb2fde22a2a" 2025-08-14T21:23:04.4896422Z }, 2025-08-14T21:23:04.4896564Z { 2025-08-14T21:23:04.4896789Z "mediaType": 
"application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4897058Z "size": 347, 2025-08-14T21:23:04.4897337Z "digest": "sha256:4ba8e7a736c8199931fd7ff9931a5f17b7b931d0383a3e158f1b12b191a1d250" 2025-08-14T21:23:04.4897639Z }, 2025-08-14T21:23:04.4897774Z { 2025-08-14T21:23:04.4898001Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4898267Z "size": 32, 2025-08-14T21:23:04.4898541Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4898850Z }, 2025-08-14T21:23:04.4898990Z { 2025-08-14T21:23:04.4899206Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4899475Z "size": 106, 2025-08-14T21:23:04.4899752Z "digest": "sha256:907c320fee2f90da0cf5028c90a0ef49a137518baf79b483dcf7f22d5a0a497d" 2025-08-14T21:23:04.4900055Z }, 2025-08-14T21:23:04.4900187Z { 2025-08-14T21:23:04.4900406Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4900739Z "size": 425, 2025-08-14T21:23:04.4901010Z "digest": "sha256:18c4ed1ec491095788e352ae018afd84de0f251fbcfb8f74d5d893e1e9ab196d" 2025-08-14T21:23:04.4901312Z }, 2025-08-14T21:23:04.4901454Z { 2025-08-14T21:23:04.4901669Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4901940Z "size": 19308711, 2025-08-14T21:23:04.4902233Z "digest": "sha256:d7618c2df6cdb4bbf3d9870ba2d089094ac46c429b573d9adb94411fac54cfca" 2025-08-14T21:23:04.4902534Z }, 2025-08-14T21:23:04.4903015Z { 2025-08-14T21:23:04.4903249Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4903523Z "size": 108, 2025-08-14T21:23:04.4903801Z "digest": "sha256:b7bdd9a6f789ba483a46c92e5d373638850f33e88b1baa4bbe67e1c6a09cb7d0" 2025-08-14T21:23:04.4904116Z }, 2025-08-14T21:23:04.4904257Z { 2025-08-14T21:23:04.4904468Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4904733Z "size": 691, 2025-08-14T21:23:04.4905013Z "digest": "sha256:6738ba83282e002d92bff3d2b4951e3c1a67f5ec2c1bad2fd780c2f5d444748f" 2025-08-14T21:23:04.4905304Z }, 2025-08-14T21:23:04.4905445Z { 2025-08-14T21:23:04.4905662Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4905919Z "size": 724, 2025-08-14T21:23:04.4906188Z "digest": "sha256:fc6b37d40530f2c5339430321eab67ae1e2e87e997587c7bc8c41504464208f9" 2025-08-14T21:23:04.4906480Z }, 2025-08-14T21:23:04.4906610Z { 2025-08-14T21:23:04.4906827Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4907090Z "size": 116, 2025-08-14T21:23:04.4907367Z "digest": "sha256:dfb0f24886393e1d394f1f433dc9346026679dafd7a60c3a93de17d94078c1ca" 2025-08-14T21:23:04.4907665Z }, 2025-08-14T21:23:04.4907805Z { 2025-08-14T21:23:04.4908026Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4908288Z "size": 136, 2025-08-14T21:23:04.4908566Z "digest": "sha256:dc833b0762f2e144670a660f6b7ce62cec71a5fdd24df4e67b5c6173d5834451" 2025-08-14T21:23:04.4908880Z }, 2025-08-14T21:23:04.4909016Z { 2025-08-14T21:23:04.4909242Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4909519Z "size": 139, 2025-08-14T21:23:04.4909782Z "digest": "sha256:8827df8ca2da347e0032d1bff3b0312437f711c5d0b5f2164f8a60c3368a9827" 2025-08-14T21:23:04.4910077Z }, 2025-08-14T21:23:04.4910214Z { 2025-08-14T21:23:04.4910427Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4910698Z "size": 17672683360, 
2025-08-14T21:23:04.4910989Z "digest": "sha256:fac8f3bd0f85eaffb43df539683dc3d861c370e583623253559fd7a1f5b00229" 2025-08-14T21:23:04.4911285Z }, 2025-08-14T21:23:04.4911415Z { 2025-08-14T21:23:04.4911629Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4911893Z "size": 214, 2025-08-14T21:23:04.4912155Z "digest": "sha256:d7cf7f140df32761610e1d58686db7f7c66a85affa4bb4b9d3c245e232443a8f" 2025-08-14T21:23:04.4912455Z }, 2025-08-14T21:23:04.4912589Z { 2025-08-14T21:23:04.4912798Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4913078Z "size": 272992162, 2025-08-14T21:23:04.4913369Z "digest": "sha256:733eedc8da8d8e7bd5a85a58d3d7818f14ed9a4fdf2dbd587038bb7725fbb9f7" 2025-08-14T21:23:04.4913667Z }, 2025-08-14T21:23:04.4913809Z { 2025-08-14T21:23:04.4914028Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4914310Z "size": 6435582332, 2025-08-14T21:23:04.4914592Z "digest": "sha256:5b092eb06909a2ea8906849acac588a10864da349670d65c0bfea342187edba2" 2025-08-14T21:23:04.4914883Z }, 2025-08-14T21:23:04.4915017Z { 2025-08-14T21:23:04.4915227Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4915496Z "size": 129, 2025-08-14T21:23:04.4915756Z "digest": "sha256:bc596103109216e154006085503386753b0b114b5900bf44758cdff324df5504" 2025-08-14T21:23:04.4916102Z }, 2025-08-14T21:23:04.4916242Z { 2025-08-14T21:23:04.4916461Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4916734Z "size": 776, 2025-08-14T21:23:04.4917021Z "digest": "sha256:0531cc34c12ab9127f1858c4cf365bb3a02bc31e8d6df5eabba2e1b6ef026ccf" 2025-08-14T21:23:04.4917320Z }, 2025-08-14T21:23:04.4917452Z { 2025-08-14T21:23:04.4917667Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4918009Z "size": 724, 2025-08-14T21:23:04.4918278Z "digest": "sha256:fc6b37d40530f2c5339430321eab67ae1e2e87e997587c7bc8c41504464208f9" 2025-08-14T21:23:04.4918562Z }, 2025-08-14T21:23:04.4918702Z { 2025-08-14T21:23:04.4918920Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4919194Z "size": 141, 2025-08-14T21:23:04.4919463Z "digest": "sha256:38c303d3b62eb463762816db04062a480014a6f3c9754386f3e83ba331ab4d1d" 2025-08-14T21:23:04.4919760Z }, 2025-08-14T21:23:04.4919895Z { 2025-08-14T21:23:04.4920115Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4920381Z "size": 32, 2025-08-14T21:23:04.4920650Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4920945Z }, 2025-08-14T21:23:04.4921082Z { 2025-08-14T21:23:04.4921294Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4921570Z "size": 160, 2025-08-14T21:23:04.4921847Z "digest": "sha256:e06d15594a2a76995baebbce7032946ff9f94e281246fbc3f8ab19d8bcc38b81" 2025-08-14T21:23:04.4922143Z }, 2025-08-14T21:23:04.4922271Z { 2025-08-14T21:23:04.4922486Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4922748Z "size": 1010, 2025-08-14T21:23:04.4923020Z "digest": "sha256:0e55deb5cb38fd36b600183f7d86eaca0dabc04d2ff4d49ec2266ee3329edc4a" 2025-08-14T21:23:04.4923319Z }, 2025-08-14T21:23:04.4923455Z { 2025-08-14T21:23:04.4923666Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4923930Z "size": 724, 2025-08-14T21:23:04.4924199Z "digest": 
"sha256:fc6b37d40530f2c5339430321eab67ae1e2e87e997587c7bc8c41504464208f9" 2025-08-14T21:23:04.4924484Z }, 2025-08-14T21:23:04.4924620Z { 2025-08-14T21:23:04.4924839Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4925101Z "size": 134, 2025-08-14T21:23:04.4925372Z "digest": "sha256:4a53d66dce071bb7416414aa1adbc3e4a59003300c0d42038612fabdeb5a1b01" 2025-08-14T21:23:04.4925667Z }, 2025-08-14T21:23:04.4925810Z { 2025-08-14T21:23:04.4926025Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4926298Z "size": 32, 2025-08-14T21:23:04.4926577Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4926876Z }, 2025-08-14T21:23:04.4927017Z { 2025-08-14T21:23:04.4927242Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4927509Z "size": 159, 2025-08-14T21:23:04.4927788Z "digest": "sha256:1519daa051b8b80e04125f2f2215dc412dcdbb9502711925e97aeccbda069eaf" 2025-08-14T21:23:04.4928087Z }, 2025-08-14T21:23:04.4928219Z { 2025-08-14T21:23:04.4928442Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4928832Z "size": 1371, 2025-08-14T21:23:04.4929133Z "digest": "sha256:381ed91d2119f078fbba19102a65befc4cb242f8cf47a11fb6f76ea424690692" 2025-08-14T21:23:04.4929443Z }, 2025-08-14T21:23:04.4929587Z { 2025-08-14T21:23:04.4929813Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4930080Z "size": 32, 2025-08-14T21:23:04.4930363Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4930673Z }, 2025-08-14T21:23:04.4930808Z { 2025-08-14T21:23:04.4931032Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4931375Z "size": 137, 2025-08-14T21:23:04.4931651Z "digest": "sha256:c6b0a01a96dd479640297d4b012031ffc1bd9fc0daf61d86058f9b675c0a0705" 2025-08-14T21:23:04.4931957Z }, 2025-08-14T21:23:04.4932102Z { 2025-08-14T21:23:04.4932318Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4932602Z "size": 380, 2025-08-14T21:23:04.4932886Z "digest": "sha256:62df6413daeefebde04dcc401134734952e4ea37fc85ff23c89cb9b4fbd45155" 2025-08-14T21:23:04.4933203Z }, 2025-08-14T21:23:04.4933341Z { 2025-08-14T21:23:04.4933611Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4933889Z "size": 32, 2025-08-14T21:23:04.4934188Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4934503Z }, 2025-08-14T21:23:04.4934652Z { 2025-08-14T21:23:04.4934872Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4935160Z "size": 104, 2025-08-14T21:23:04.4935458Z "digest": "sha256:7a18bc2a6881b76a6f591c98dafb47e44d903f7a905f7eba0fc3aedb5c90fff7" 2025-08-14T21:23:04.4935767Z }, 2025-08-14T21:23:04.4935917Z { 2025-08-14T21:23:04.4936147Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4936434Z "size": 407, 2025-08-14T21:23:04.4936710Z "digest": "sha256:93359cd58a8cece344fd4291b27647e57761c9399bb54bb0c18149c12af5f66a" 2025-08-14T21:23:04.4937020Z }, 2025-08-14T21:23:04.4937171Z { 2025-08-14T21:23:04.4937400Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4937685Z "size": 32, 2025-08-14T21:23:04.4937982Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4938277Z }, 
2025-08-14T21:23:04.4938421Z { 2025-08-14T21:23:04.4938645Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4938917Z "size": 109, 2025-08-14T21:23:04.4939202Z "digest": "sha256:c35ba0a1f353d6894c914a4bfbea9a2c9b8ac1b526af64d34cbe9a12bd83c78e" 2025-08-14T21:23:04.4939513Z }, 2025-08-14T21:23:04.4939647Z { 2025-08-14T21:23:04.4939869Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4940146Z "size": 1896, 2025-08-14T21:23:04.4940424Z "digest": "sha256:dcf1e01c98d6a6f72674d79a4e8e4047b54796576cd06ad682c225a92820a8f5" 2025-08-14T21:23:04.4940721Z }, 2025-08-14T21:23:04.4940864Z { 2025-08-14T21:23:04.4941095Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4941384Z "size": 242635753, 2025-08-14T21:23:04.4941673Z "digest": "sha256:bad0564f61fdf377e3ae31f6fec0ec28b6922da0b9db28408b55b8e97ff1ea51" 2025-08-14T21:23:04.4941983Z }, 2025-08-14T21:23:04.4942118Z { 2025-08-14T21:23:04.4942339Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4942621Z "size": 106, 2025-08-14T21:23:04.4942892Z "digest": "sha256:539ded9057364aade7abe23ab908d2caf53966a186734aa58ae84a56bee659eb" 2025-08-14T21:23:04.4943202Z }, 2025-08-14T21:23:04.4943345Z { 2025-08-14T21:23:04.4943560Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4943825Z "size": 163, 2025-08-14T21:23:04.4944098Z "digest": "sha256:28d482062637d32514edfc447913e98745d7c13d2f277531e64ffcf090ae6d92" 2025-08-14T21:23:04.4944394Z }, 2025-08-14T21:23:04.4944537Z { 2025-08-14T21:23:04.4944758Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4945031Z "size": 7943, 2025-08-14T21:23:04.4945305Z "digest": "sha256:3245316ff51b50b27da4ef7279733c92f76cc652b3fce3877c0e3d510430e8b3" 2025-08-14T21:23:04.4945602Z }, 2025-08-14T21:23:04.4945744Z { 2025-08-14T21:23:04.4946015Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4946281Z "size": 8073, 2025-08-14T21:23:04.4946552Z "digest": "sha256:b53167d1a6df0e4b67d637d073150dff1fb87a823864c0c98d77c15e56babc24" 2025-08-14T21:23:04.4946840Z }, 2025-08-14T21:23:04.4947035Z { 2025-08-14T21:23:04.4947250Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4947518Z "size": 303, 2025-08-14T21:23:04.4947777Z "digest": "sha256:7f5277f691672469f431fd90a8c2bb702c6c68333f6be2cff868f00e416c5a1a" 2025-08-14T21:23:04.4948071Z }, 2025-08-14T21:23:04.4948210Z { 2025-08-14T21:23:04.4948417Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4948677Z "size": 32, 2025-08-14T21:23:04.4948995Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4949292Z }, 2025-08-14T21:23:04.4949436Z { 2025-08-14T21:23:04.4949659Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4949920Z "size": 108, 2025-08-14T21:23:04.4950198Z "digest": "sha256:23dff10cdaa5b1e9c7250f0c58a6279f104b35408281e951bfe9983f97e3d9ed" 2025-08-14T21:23:04.4950505Z }, 2025-08-14T21:23:04.4950644Z { 2025-08-14T21:23:04.4950881Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4951157Z "size": 54145699, 2025-08-14T21:23:04.4951444Z "digest": "sha256:9fb73296da6ac15f37f36663bd10afc98abb8a01fb40bff4848de7247d28e018" 2025-08-14T21:23:04.4951740Z }, 2025-08-14T21:23:04.4951884Z { 2025-08-14T21:23:04.4952104Z "mediaType": 
"application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-08-14T21:23:04.4952366Z "size": 32, 2025-08-14T21:23:04.4952645Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-08-14T21:23:04.4952950Z } 2025-08-14T21:23:04.4953089Z ] 2025-08-14T21:23:04.4953238Z } 2025-08-14T21:23:04.4953425Z + exit 0 2025-08-14T21:23:04.4974949Z ##[group]Run set -eux 2025-08-14T21:23:04.4975185Z set -eux 2025-08-14T21:23:04.4975762Z aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token | jq --raw-output '.SecretString' | jq -r .docker_hub_readonly_token | docker login --username pytorchbot --password-stdin 2025-08-14T21:23:04.4983171Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:04.4983429Z env: 2025-08-14T21:23:04.4983601Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:04.4983791Z ##[endgroup] 2025-08-14T21:23:04.5018165Z + jq --raw-output .SecretString 2025-08-14T21:23:04.5019039Z + aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token 2025-08-14T21:23:04.5019394Z + jq -r .docker_hub_readonly_token 2025-08-14T21:23:04.5019699Z + docker login --username pytorchbot --password-stdin 2025-08-14T21:23:04.9854702Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-08-14T21:23:04.9855075Z Login Succeeded 2025-08-14T21:23:04.9855327Z Configure a credential helper to remove this warning. See 2025-08-14T21:23:04.9855725Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-08-14T21:23:04.9855966Z 2025-08-14T21:23:04.9926863Z ##[group]Run tag=${ECR_DOCKER_IMAGE##*:} 2025-08-14T21:23:04.9927169Z tag=${ECR_DOCKER_IMAGE##*:} 2025-08-14T21:23:04.9927450Z echo "docker pull ghcr.io/pytorch/ci-image:${tag/:/-}" 2025-08-14T21:23:04.9932493Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:04.9932765Z env: 2025-08-14T21:23:04.9932930Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:04.9933514Z ECR_DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:04.9934094Z ##[endgroup] 2025-08-14T21:23:04.9958346Z docker pull ghcr.io/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:04.9993247Z ##[group]Run pytorch/test-infra/.github/actions/pull-docker-image@main 2025-08-14T21:23:04.9993595Z with: 2025-08-14T21:23:04.9994209Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:04.9995074Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:04.9995389Z env: 2025-08-14T21:23:04.9995582Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:04.9995806Z ##[endgroup] 2025-08-14T21:23:05.0007993Z ##[group]Run set -x 2025-08-14T21:23:05.0008241Z set -x 2025-08-14T21:23:05.0008439Z set +e 2025-08-14T21:23:05.0008633Z  2025-08-14T21:23:05.0008970Z login() { 2025-08-14T21:23:05.0009371Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-08-14T21:23:05.0009775Z } 2025-08-14T21:23:05.0009951Z  2025-08-14T21:23:05.0010207Z retry () { 2025-08-14T21:23:05.0010438Z  $* || (sleep 1 && $*) || (sleep 2 && $*) 2025-08-14T21:23:05.0010697Z } 2025-08-14T21:23:05.0010875Z  2025-08-14T21:23:05.0011083Z retry login "${DOCKER_REGISTRY}" 2025-08-14T21:23:05.0011327Z  
2025-08-14T21:23:05.0011699Z IMAGE_SIZE=$(docker manifest inspect "${DOCKER_IMAGE}" | jq '[.layers[].size, .config.size] | add / 1024 / 1024') 2025-08-14T21:23:05.0012183Z echo "Compressed size of image in MB: ${IMAGE_SIZE}" 2025-08-14T21:23:05.0012458Z  2025-08-14T21:23:05.0012630Z set -e 2025-08-14T21:23:05.0012906Z # ignore output since only exit code is used for conditional 2025-08-14T21:23:05.0013274Z # only pull docker image if it's not available locally 2025-08-14T21:23:05.0013669Z if ! docker inspect --type=image "${DOCKER_IMAGE}" >/dev/null 2>/dev/null; then 2025-08-14T21:23:05.0014050Z  retry docker pull "${DOCKER_IMAGE}" 2025-08-14T21:23:05.0014303Z fi 2025-08-14T21:23:05.0019342Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:23:05.0019695Z env: 2025-08-14T21:23:05.0019893Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:23:05.0020554Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:05.0021292Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:05.0021603Z ##[endgroup] 2025-08-14T21:23:05.0044764Z + set +e 2025-08-14T21:23:05.0045331Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:05.0045650Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:05.0047877Z + aws ecr get-login-password --region us-east-1 2025-08-14T21:23:05.0049202Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-08-14T21:23:05.4477011Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-08-14T21:23:05.4477448Z Configure a credential helper to remove this warning. See 2025-08-14T21:23:05.4477884Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-08-14T21:23:05.4478144Z 2025-08-14T21:23:05.4478229Z Login Succeeded 2025-08-14T21:23:05.4497863Z ++ jq '[.layers[].size, .config.size] | add / 1024 / 1024' 2025-08-14T21:23:05.4498583Z ++ docker manifest inspect 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:05.6740161Z + IMAGE_SIZE=27663.483686447144 2025-08-14T21:23:05.6740503Z + echo 'Compressed size of image in MB: 27663.483686447144' 2025-08-14T21:23:05.6740761Z + set -e 2025-08-14T21:23:05.6741714Z + docker inspect --type=image 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:05.6742356Z Compressed size of image in MB: 27663.483686447144 2025-08-14T21:23:05.6878013Z + retry docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:05.6879334Z + docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:23:05.9560936Z pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe: Pulling from pytorch/ci-image 2025-08-14T21:23:05.9564928Z 660ffc76f83b: Pulling fs layer 2025-08-14T21:23:05.9565815Z c7b4a852a455: Pulling fs layer 2025-08-14T21:23:05.9566286Z e5a28988c893: Pulling fs layer 2025-08-14T21:23:05.9566714Z 76a69b57b683: Pulling fs layer 
2025-08-14T21:23:05.9567480Z 5c785dcb4cdb: Pulling fs layer 2025-08-14T21:23:05.9567744Z 836ab08052e8: Pulling fs layer 2025-08-14T21:23:05.9568103Z 53b11c77468c: Pulling fs layer 2025-08-14T21:23:05.9568306Z e97311a6a967: Pulling fs layer 2025-08-14T21:23:05.9568512Z 2c414689d31d: Pulling fs layer 2025-08-14T21:23:05.9568968Z 6d89b5f065d5: Pulling fs layer 2025-08-14T21:23:05.9569181Z 5a5cc76ada43: Pulling fs layer 2025-08-14T21:23:05.9569406Z fc6b37d40530: Pulling fs layer 2025-08-14T21:23:05.9569611Z 2e1657907860: Pulling fs layer 2025-08-14T21:23:05.9569815Z 7b92d7a4b8c7: Pulling fs layer 2025-08-14T21:23:05.9570012Z 4f4fb700ef54: Pulling fs layer 2025-08-14T21:23:05.9570216Z d6226eb61f82: Pulling fs layer 2025-08-14T21:23:05.9570411Z 83c70f4266a6: Pulling fs layer 2025-08-14T21:23:05.9570595Z 60c725d21861: Pulling fs layer 2025-08-14T21:23:05.9570837Z a504e76e66a4: Pulling fs layer 2025-08-14T21:23:05.9571028Z fc1c200a4f77: Pulling fs layer 2025-08-14T21:23:05.9571221Z 43273c22704f: Pulling fs layer 2025-08-14T21:23:05.9571406Z 89df389d042a: Pulling fs layer 2025-08-14T21:23:05.9571598Z 684349f50d94: Pulling fs layer 2025-08-14T21:23:05.9571796Z 21d0eae87fb3: Pulling fs layer 2025-08-14T21:23:05.9571985Z c9c2b424b8e0: Pulling fs layer 2025-08-14T21:23:05.9572180Z 98dda28f3395: Pulling fs layer 2025-08-14T21:23:05.9572376Z acf5babd87f2: Pulling fs layer 2025-08-14T21:23:05.9572565Z 7c5050d8408d: Pulling fs layer 2025-08-14T21:23:05.9572760Z 7ddd14e2b548: Pulling fs layer 2025-08-14T21:23:05.9572952Z 4ba8e7a736c8: Pulling fs layer 2025-08-14T21:23:05.9573152Z 907c320fee2f: Pulling fs layer 2025-08-14T21:23:05.9573334Z 18c4ed1ec491: Pulling fs layer 2025-08-14T21:23:05.9573528Z d7618c2df6cd: Pulling fs layer 2025-08-14T21:23:05.9573725Z b7bdd9a6f789: Pulling fs layer 2025-08-14T21:23:05.9573913Z 6738ba83282e: Pulling fs layer 2025-08-14T21:23:05.9574107Z dfb0f2488639: Pulling fs layer 2025-08-14T21:23:05.9574302Z dc833b0762f2: Pulling fs layer 2025-08-14T21:23:05.9574485Z 8827df8ca2da: Pulling fs layer 2025-08-14T21:23:05.9574678Z fac8f3bd0f85: Pulling fs layer 2025-08-14T21:23:05.9574878Z d7cf7f140df3: Pulling fs layer 2025-08-14T21:23:05.9575061Z 733eedc8da8d: Pulling fs layer 2025-08-14T21:23:05.9575262Z 5b092eb06909: Pulling fs layer 2025-08-14T21:23:05.9575438Z bc5961031092: Pulling fs layer 2025-08-14T21:23:05.9575606Z 0531cc34c12a: Pulling fs layer 2025-08-14T21:23:05.9575789Z 38c303d3b62e: Pulling fs layer 2025-08-14T21:23:05.9575979Z e06d15594a2a: Pulling fs layer 2025-08-14T21:23:05.9576167Z 0e55deb5cb38: Pulling fs layer 2025-08-14T21:23:05.9576353Z 4a53d66dce07: Pulling fs layer 2025-08-14T21:23:05.9576544Z 1519daa051b8: Pulling fs layer 2025-08-14T21:23:05.9576729Z 381ed91d2119: Pulling fs layer 2025-08-14T21:23:05.9576907Z c6b0a01a96dd: Pulling fs layer 2025-08-14T21:23:05.9577096Z 62df6413daee: Pulling fs layer 2025-08-14T21:23:05.9577284Z 7a18bc2a6881: Pulling fs layer 2025-08-14T21:23:05.9577463Z 93359cd58a8c: Pulling fs layer 2025-08-14T21:23:05.9577652Z c35ba0a1f353: Pulling fs layer 2025-08-14T21:23:05.9577851Z dcf1e01c98d6: Pulling fs layer 2025-08-14T21:23:05.9578024Z bad0564f61fd: Pulling fs layer 2025-08-14T21:23:05.9578452Z 539ded905736: Pulling fs layer 2025-08-14T21:23:05.9578642Z 28d482062637: Pulling fs layer 2025-08-14T21:23:05.9578811Z 3245316ff51b: Pulling fs layer 2025-08-14T21:23:05.9578993Z b53167d1a6df: Pulling fs layer 2025-08-14T21:23:05.9579183Z 7f5277f69167: Pulling fs layer 2025-08-14T21:23:05.9579467Z 23dff10cdaa5: Pulling fs layer 
2025-08-14T21:23:05.9579662Z 9fb73296da6a: Pulling fs layer 2025-08-14T21:23:05.9579849Z b7bdd9a6f789: Waiting 2025-08-14T21:23:05.9580023Z 6738ba83282e: Waiting 2025-08-14T21:23:05.9580180Z dfb0f2488639: Waiting 2025-08-14T21:23:05.9580341Z 76a69b57b683: Waiting 2025-08-14T21:23:05.9580530Z dc833b0762f2: Waiting 2025-08-14T21:23:05.9580695Z 5c785dcb4cdb: Waiting 2025-08-14T21:23:05.9580859Z 8827df8ca2da: Waiting 2025-08-14T21:23:05.9581014Z 836ab08052e8: Waiting 2025-08-14T21:23:05.9581179Z e97311a6a967: Waiting 2025-08-14T21:23:05.9581333Z 53b11c77468c: Waiting 2025-08-14T21:23:05.9581479Z fac8f3bd0f85: Waiting 2025-08-14T21:23:05.9581634Z 2c414689d31d: Waiting 2025-08-14T21:23:05.9581789Z 6d89b5f065d5: Waiting 2025-08-14T21:23:05.9581937Z 5a5cc76ada43: Waiting 2025-08-14T21:23:05.9582092Z 4f4fb700ef54: Waiting 2025-08-14T21:23:05.9582248Z d7cf7f140df3: Waiting 2025-08-14T21:23:05.9582404Z 83c70f4266a6: Waiting 2025-08-14T21:23:05.9582550Z 733eedc8da8d: Waiting 2025-08-14T21:23:05.9582710Z fc6b37d40530: Waiting 2025-08-14T21:23:05.9582862Z 60c725d21861: Waiting 2025-08-14T21:23:05.9583008Z 43273c22704f: Waiting 2025-08-14T21:23:05.9583167Z fc1c200a4f77: Waiting 2025-08-14T21:23:05.9583322Z 2e1657907860: Waiting 2025-08-14T21:23:05.9583472Z 7b92d7a4b8c7: Waiting 2025-08-14T21:23:05.9583628Z 89df389d042a: Waiting 2025-08-14T21:23:05.9583784Z 21d0eae87fb3: Waiting 2025-08-14T21:23:05.9583933Z 98dda28f3395: Waiting 2025-08-14T21:23:05.9584089Z 4ba8e7a736c8: Waiting 2025-08-14T21:23:05.9584240Z 907c320fee2f: Waiting 2025-08-14T21:23:05.9606826Z 684349f50d94: Waiting 2025-08-14T21:23:05.9607111Z 18c4ed1ec491: Waiting 2025-08-14T21:23:05.9607320Z d7618c2df6cd: Waiting 2025-08-14T21:23:05.9607538Z 4a53d66dce07: Waiting 2025-08-14T21:23:05.9607719Z 1519daa051b8: Waiting 2025-08-14T21:23:05.9607886Z 0531cc34c12a: Waiting 2025-08-14T21:23:05.9608064Z 38c303d3b62e: Waiting 2025-08-14T21:23:05.9608238Z 7c5050d8408d: Waiting 2025-08-14T21:23:05.9608413Z c6b0a01a96dd: Waiting 2025-08-14T21:23:05.9608591Z 381ed91d2119: Waiting 2025-08-14T21:23:05.9608910Z 62df6413daee: Waiting 2025-08-14T21:23:05.9609092Z e06d15594a2a: Waiting 2025-08-14T21:23:05.9609263Z bad0564f61fd: Waiting 2025-08-14T21:23:05.9609440Z 7a18bc2a6881: Waiting 2025-08-14T21:23:05.9609623Z 3245316ff51b: Waiting 2025-08-14T21:23:05.9609865Z 93359cd58a8c: Waiting 2025-08-14T21:23:05.9610043Z b53167d1a6df: Waiting 2025-08-14T21:23:05.9610214Z 28d482062637: Waiting 2025-08-14T21:23:05.9610379Z 9fb73296da6a: Waiting 2025-08-14T21:23:05.9610558Z 539ded905736: Waiting 2025-08-14T21:23:05.9610736Z 23dff10cdaa5: Waiting 2025-08-14T21:23:05.9610905Z c35ba0a1f353: Waiting 2025-08-14T21:23:05.9611077Z 7f5277f69167: Waiting 2025-08-14T21:23:05.9611257Z 0e55deb5cb38: Waiting 2025-08-14T21:23:05.9611428Z 5b092eb06909: Waiting 2025-08-14T21:23:05.9611603Z dcf1e01c98d6: Waiting 2025-08-14T21:23:05.9611778Z acf5babd87f2: Waiting 2025-08-14T21:23:05.9611943Z bc5961031092: Waiting 2025-08-14T21:23:05.9612111Z a504e76e66a4: Waiting 2025-08-14T21:23:05.9612301Z 7ddd14e2b548: Waiting 2025-08-14T21:23:05.9612470Z c9c2b424b8e0: Waiting 2025-08-14T21:23:06.0272297Z c7b4a852a455: Verifying Checksum 2025-08-14T21:23:06.0273206Z c7b4a852a455: Download complete 2025-08-14T21:23:06.1247480Z 76a69b57b683: Verifying Checksum 2025-08-14T21:23:06.1247874Z 76a69b57b683: Download complete 2025-08-14T21:23:06.2124917Z 5c785dcb4cdb: Verifying Checksum 2025-08-14T21:23:06.2125485Z 5c785dcb4cdb: Download complete 2025-08-14T21:23:06.2988419Z 836ab08052e8: Download complete 
2025-08-14T21:23:06.3245049Z 660ffc76f83b: Verifying Checksum 2025-08-14T21:23:06.3245363Z 660ffc76f83b: Download complete 2025-08-14T21:23:06.3813680Z 53b11c77468c: Verifying Checksum 2025-08-14T21:23:06.3814343Z 53b11c77468c: Download complete 2025-08-14T21:23:06.4119112Z e97311a6a967: Verifying Checksum 2025-08-14T21:23:06.4119425Z e97311a6a967: Download complete 2025-08-14T21:23:06.5445688Z 6d89b5f065d5: Verifying Checksum 2025-08-14T21:23:06.5446027Z 6d89b5f065d5: Download complete 2025-08-14T21:23:06.6318090Z 5a5cc76ada43: Verifying Checksum 2025-08-14T21:23:06.6318558Z 5a5cc76ada43: Download complete 2025-08-14T21:23:06.6988065Z fc6b37d40530: Verifying Checksum 2025-08-14T21:23:06.6988374Z fc6b37d40530: Download complete 2025-08-14T21:23:06.7622528Z 2e1657907860: Verifying Checksum 2025-08-14T21:23:06.7628425Z 2e1657907860: Download complete 2025-08-14T21:23:07.5500947Z 2c414689d31d: Verifying Checksum 2025-08-14T21:23:07.5501302Z 2c414689d31d: Download complete 2025-08-14T21:23:07.5585238Z 4f4fb700ef54: Download complete 2025-08-14T21:23:07.5908154Z 660ffc76f83b: Pull complete 2025-08-14T21:23:07.6066060Z c7b4a852a455: Pull complete 2025-08-14T21:23:07.6831307Z d6226eb61f82: Verifying Checksum 2025-08-14T21:23:07.6832052Z d6226eb61f82: Download complete 2025-08-14T21:23:07.7821932Z 83c70f4266a6: Verifying Checksum 2025-08-14T21:23:07.7822239Z 83c70f4266a6: Download complete 2025-08-14T21:23:07.8551575Z 60c725d21861: Verifying Checksum 2025-08-14T21:23:07.8557429Z 60c725d21861: Download complete 2025-08-14T21:23:07.9486994Z a504e76e66a4: Verifying Checksum 2025-08-14T21:23:07.9487306Z a504e76e66a4: Download complete 2025-08-14T21:23:08.0265667Z fc1c200a4f77: Download complete 2025-08-14T21:23:08.1302759Z 43273c22704f: Verifying Checksum 2025-08-14T21:23:08.1303084Z 43273c22704f: Download complete 2025-08-14T21:23:08.2138846Z 89df389d042a: Verifying Checksum 2025-08-14T21:23:08.2139649Z 89df389d042a: Download complete 2025-08-14T21:23:08.3595517Z 684349f50d94: Download complete 2025-08-14T21:23:08.4383653Z 21d0eae87fb3: Verifying Checksum 2025-08-14T21:23:08.4387444Z 21d0eae87fb3: Download complete 2025-08-14T21:23:08.5589242Z c9c2b424b8e0: Verifying Checksum 2025-08-14T21:23:08.5589604Z c9c2b424b8e0: Download complete 2025-08-14T21:23:08.6462317Z 98dda28f3395: Download complete 2025-08-14T21:23:08.7403504Z acf5babd87f2: Verifying Checksum 2025-08-14T21:23:08.7403805Z acf5babd87f2: Download complete 2025-08-14T21:23:09.1604437Z e5a28988c893: Verifying Checksum 2025-08-14T21:23:09.1604740Z e5a28988c893: Download complete 2025-08-14T21:23:09.2563226Z 7ddd14e2b548: Verifying Checksum 2025-08-14T21:23:09.2563555Z 7ddd14e2b548: Download complete 2025-08-14T21:23:09.3257532Z 4ba8e7a736c8: Verifying Checksum 2025-08-14T21:23:09.3260799Z 4ba8e7a736c8: Download complete 2025-08-14T21:23:09.4137458Z 907c320fee2f: Download complete 2025-08-14T21:23:09.5131011Z 18c4ed1ec491: Download complete 2025-08-14T21:23:09.7674104Z d7618c2df6cd: Verifying Checksum 2025-08-14T21:23:09.7674416Z d7618c2df6cd: Download complete 2025-08-14T21:23:09.8552704Z b7bdd9a6f789: Verifying Checksum 2025-08-14T21:23:09.8553031Z b7bdd9a6f789: Download complete 2025-08-14T21:23:09.9395532Z 6738ba83282e: Verifying Checksum 2025-08-14T21:23:09.9395926Z 6738ba83282e: Download complete 2025-08-14T21:23:10.0047662Z dfb0f2488639: Verifying Checksum 2025-08-14T21:23:10.0048153Z dfb0f2488639: Download complete 2025-08-14T21:23:10.0771279Z dc833b0762f2: Download complete 2025-08-14T21:23:10.1474763Z 8827df8ca2da: Verifying Checksum 
2025-08-14T21:23:10.1475109Z 8827df8ca2da: Download complete 2025-08-14T21:23:13.3297028Z 7c5050d8408d: Verifying Checksum 2025-08-14T21:23:13.3297335Z 7c5050d8408d: Download complete 2025-08-14T21:23:16.2263307Z 733eedc8da8d: Verifying Checksum 2025-08-14T21:23:16.2263616Z 733eedc8da8d: Download complete 2025-08-14T21:23:21.0455750Z e5a28988c893: Pull complete 2025-08-14T21:23:21.3009340Z 76a69b57b683: Pull complete 2025-08-14T21:23:21.5534779Z 5c785dcb4cdb: Pull complete 2025-08-14T21:23:21.8180494Z 836ab08052e8: Pull complete 2025-08-14T21:23:22.0756045Z 53b11c77468c: Pull complete 2025-08-14T21:23:22.3505673Z e97311a6a967: Pull complete 2025-08-14T21:23:25.9533393Z 2c414689d31d: Pull complete 2025-08-14T21:23:26.2694218Z 6d89b5f065d5: Pull complete 2025-08-14T21:23:26.5552372Z 5a5cc76ada43: Pull complete 2025-08-14T21:23:26.8438211Z fc6b37d40530: Pull complete 2025-08-14T21:23:27.1724409Z 2e1657907860: Pull complete 2025-08-14T21:23:40.8011806Z 7b92d7a4b8c7: Verifying Checksum 2025-08-14T21:23:40.8012743Z 7b92d7a4b8c7: Download complete 2025-08-14T21:23:40.8731809Z bc5961031092: Download complete 2025-08-14T21:23:40.9564124Z 0531cc34c12a: Verifying Checksum 2025-08-14T21:23:40.9566050Z 0531cc34c12a: Download complete 2025-08-14T21:23:41.0352663Z 38c303d3b62e: Verifying Checksum 2025-08-14T21:23:41.0352997Z 38c303d3b62e: Download complete 2025-08-14T21:23:41.1458416Z e06d15594a2a: Verifying Checksum 2025-08-14T21:23:41.1458983Z e06d15594a2a: Download complete 2025-08-14T21:23:41.2387973Z 0e55deb5cb38: Download complete 2025-08-14T21:23:41.3103212Z 4a53d66dce07: Verifying Checksum 2025-08-14T21:23:41.3103527Z 4a53d66dce07: Download complete 2025-08-14T21:23:41.3919481Z 1519daa051b8: Verifying Checksum 2025-08-14T21:23:41.3920019Z 1519daa051b8: Download complete 2025-08-14T21:23:41.4665754Z 381ed91d2119: Verifying Checksum 2025-08-14T21:23:41.4666048Z 381ed91d2119: Download complete 2025-08-14T21:23:41.5621177Z c6b0a01a96dd: Verifying Checksum 2025-08-14T21:23:41.5621497Z c6b0a01a96dd: Download complete 2025-08-14T21:23:41.6574654Z 62df6413daee: Verifying Checksum 2025-08-14T21:23:41.6574999Z 62df6413daee: Download complete 2025-08-14T21:23:41.7287209Z 7a18bc2a6881: Verifying Checksum 2025-08-14T21:23:41.7287527Z 7a18bc2a6881: Download complete 2025-08-14T21:23:41.8150325Z 93359cd58a8c: Verifying Checksum 2025-08-14T21:23:41.8150644Z 93359cd58a8c: Download complete 2025-08-14T21:23:41.8967817Z c35ba0a1f353: Verifying Checksum 2025-08-14T21:23:41.8968143Z c35ba0a1f353: Download complete 2025-08-14T21:23:41.9652036Z dcf1e01c98d6: Verifying Checksum 2025-08-14T21:23:41.9652345Z dcf1e01c98d6: Download complete 2025-08-14T21:23:44.4628741Z bad0564f61fd: Verifying Checksum 2025-08-14T21:23:44.4629049Z bad0564f61fd: Download complete 2025-08-14T21:23:44.5241500Z 539ded905736: Verifying Checksum 2025-08-14T21:23:44.5241819Z 539ded905736: Download complete 2025-08-14T21:23:44.6191894Z 28d482062637: Download complete 2025-08-14T21:23:44.7075958Z 3245316ff51b: Download complete 2025-08-14T21:23:44.7962573Z b53167d1a6df: Verifying Checksum 2025-08-14T21:23:44.7962934Z b53167d1a6df: Download complete 2025-08-14T21:23:44.8886807Z 7f5277f69167: Verifying Checksum 2025-08-14T21:23:44.8887137Z 7f5277f69167: Download complete 2025-08-14T21:23:44.9748647Z 23dff10cdaa5: Download complete 2025-08-14T21:23:45.5593003Z 9fb73296da6a: Verifying Checksum 2025-08-14T21:23:45.5594741Z 9fb73296da6a: Download complete 2025-08-14T21:24:20.6680078Z 5b092eb06909: Verifying Checksum 2025-08-14T21:24:20.6680389Z 
5b092eb06909: Download complete 2025-08-14T21:24:55.6855014Z 7b92d7a4b8c7: Pull complete 2025-08-14T21:24:56.1809704Z 4f4fb700ef54: Pull complete 2025-08-14T21:24:56.7150292Z d6226eb61f82: Pull complete 2025-08-14T21:24:57.3096859Z 83c70f4266a6: Pull complete 2025-08-14T21:24:57.6756729Z 60c725d21861: Pull complete 2025-08-14T21:24:58.0685070Z a504e76e66a4: Pull complete 2025-08-14T21:24:58.3677483Z fc1c200a4f77: Pull complete 2025-08-14T21:24:58.5802455Z 43273c22704f: Pull complete 2025-08-14T21:24:58.7766229Z 89df389d042a: Pull complete 2025-08-14T21:24:59.0853690Z 684349f50d94: Pull complete 2025-08-14T21:24:59.3645656Z 21d0eae87fb3: Pull complete 2025-08-14T21:24:59.6551388Z c9c2b424b8e0: Pull complete 2025-08-14T21:25:00.4768649Z 98dda28f3395: Pull complete 2025-08-14T21:25:00.8593451Z acf5babd87f2: Pull complete 2025-08-14T21:25:11.3760695Z 7c5050d8408d: Pull complete 2025-08-14T21:25:11.5827907Z 7ddd14e2b548: Pull complete 2025-08-14T21:25:12.0266046Z 4ba8e7a736c8: Pull complete 2025-08-14T21:25:12.7806533Z 907c320fee2f: Pull complete 2025-08-14T21:25:13.2650144Z 18c4ed1ec491: Pull complete 2025-08-14T21:25:14.0257838Z d7618c2df6cd: Pull complete 2025-08-14T21:25:14.3813974Z b7bdd9a6f789: Pull complete 2025-08-14T21:25:14.7366542Z 6738ba83282e: Pull complete 2025-08-14T21:25:15.5144596Z dfb0f2488639: Pull complete 2025-08-14T21:25:15.9341000Z dc833b0762f2: Pull complete 2025-08-14T21:25:16.3270552Z 8827df8ca2da: Pull complete 2025-08-14T21:26:06.9252465Z fac8f3bd0f85: Verifying Checksum 2025-08-14T21:26:06.9258475Z fac8f3bd0f85: Download complete 2025-08-14T21:29:51.9508347Z fac8f3bd0f85: Pull complete 2025-08-14T21:29:51.9757500Z d7cf7f140df3: Pull complete 2025-08-14T21:29:53.9195396Z 733eedc8da8d: Pull complete 2025-08-14T21:32:11.4475429Z 5b092eb06909: Pull complete 2025-08-14T21:32:11.4744030Z bc5961031092: Pull complete 2025-08-14T21:32:11.4988270Z 0531cc34c12a: Pull complete 2025-08-14T21:32:11.5490196Z 38c303d3b62e: Pull complete 2025-08-14T21:32:11.6069557Z e06d15594a2a: Pull complete 2025-08-14T21:32:11.6315938Z 0e55deb5cb38: Pull complete 2025-08-14T21:32:11.6823067Z 4a53d66dce07: Pull complete 2025-08-14T21:32:11.7304763Z 1519daa051b8: Pull complete 2025-08-14T21:32:11.7543341Z 381ed91d2119: Pull complete 2025-08-14T21:32:11.8015471Z c6b0a01a96dd: Pull complete 2025-08-14T21:32:11.8262107Z 62df6413daee: Pull complete 2025-08-14T21:32:11.8719861Z 7a18bc2a6881: Pull complete 2025-08-14T21:32:11.8965821Z 93359cd58a8c: Pull complete 2025-08-14T21:32:11.9535790Z c35ba0a1f353: Pull complete 2025-08-14T21:32:11.9964945Z dcf1e01c98d6: Pull complete 2025-08-14T21:32:21.1127182Z bad0564f61fd: Pull complete 2025-08-14T21:32:21.4058813Z 539ded905736: Pull complete 2025-08-14T21:32:21.7700902Z 28d482062637: Pull complete 2025-08-14T21:32:22.1529804Z 3245316ff51b: Pull complete 2025-08-14T21:32:22.4604284Z b53167d1a6df: Pull complete 2025-08-14T21:32:22.7702471Z 7f5277f69167: Pull complete 2025-08-14T21:32:23.5638673Z 23dff10cdaa5: Pull complete 2025-08-14T21:32:26.2473247Z 9fb73296da6a: Pull complete 2025-08-14T21:32:26.9354178Z Digest: sha256:4236794baba289041d240d08fd393bbd57497c3012e5e0ccd9fd98f61ebf35c6 2025-08-14T21:32:27.0026509Z Status: Downloaded newer image for 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:32:27.0310942Z 
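The pull step above follows a small shell pattern: log in to ECR with a short-lived password, report the compressed image size from the registry manifest, and pull only if the image is missing locally. Below is a minimal sketch of that same pattern, restated with placeholder registry/image values (the real ones are in the step env above) and with "$@" instead of $* so quoted arguments survive the retry wrapper; it assumes the AWS CLI, jq, and docker are on PATH.

#!/usr/bin/env bash
# Sketch only: placeholder registry and tag, not the exact CI values.
set -euo pipefail

DOCKER_REGISTRY="123456789012.dkr.ecr.us-east-1.amazonaws.com"   # placeholder account id
DOCKER_IMAGE="${DOCKER_REGISTRY}/pytorch/ci-image:example-tag"   # placeholder tag

login() {
  # ECR passwords are short-lived; fetch one and feed it to docker on stdin.
  aws ecr get-login-password --region us-east-1 \
    | docker login -u AWS --password-stdin "$1"
}

retry() {
  # Up to three attempts with a short back-off between them.
  "$@" || (sleep 1 && "$@") || (sleep 2 && "$@")
}

retry login "${DOCKER_REGISTRY}"

# Sum the compressed layer sizes plus the config blob to report the download size in MB,
# matching the jq expression the step above uses.
IMAGE_SIZE=$(docker manifest inspect "${DOCKER_IMAGE}" \
  | jq '[.layers[].size, .config.size] | add / 1024 / 1024')
echo "Compressed size of image in MB: ${IMAGE_SIZE}"

# Pull only when the image is not already in the local cache.
if ! docker inspect --type=image "${DOCKER_IMAGE}" >/dev/null 2>&1; then
  retry docker pull "${DOCKER_IMAGE}"
fi

In this run the manifest sum came out to roughly 27.6 GB compressed, which is why the layer download above dominates the first ~9 minutes of the job.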
308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:32:27.0389056Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-08-14T21:32:27.0389696Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-08-14T21:32:27.0399312Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:27.0399579Z env: 2025-08-14T21:32:27.0399759Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:27.0399960Z ##[endgroup] 2025-08-14T21:32:27.0482418Z Prepare all required actions 2025-08-14T21:32:27.0529504Z ##[group]Run ./.github/actions/get-workflow-job-id 2025-08-14T21:32:27.0529792Z with: 2025-08-14T21:32:27.0530401Z github-token: *** 2025-08-14T21:32:27.0530575Z env: 2025-08-14T21:32:27.0530820Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:27.0531014Z ##[endgroup] 2025-08-14T21:32:27.0681441Z ##[group]Run set -eux 2025-08-14T21:32:27.0681677Z set -eux 2025-08-14T21:32:27.0681990Z python3 .github/scripts/get_workflow_job_id.py "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-08-14T21:32:27.0687174Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:27.0687437Z env: 2025-08-14T21:32:27.0687608Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:27.0688089Z GITHUB_TOKEN: *** 2025-08-14T21:32:27.0688262Z ##[endgroup] 2025-08-14T21:32:27.0713112Z + python3 .github/scripts/get_workflow_job_id.py 16976338999 i-0115c72a6ef255e70 2025-08-14T21:32:30.6222977Z Setting output job-id=48128261038 2025-08-14T21:32:30.6223880Z Setting output job-name=linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:30.6339789Z ##[group]Run python3 -m pip install psutil==5.9.8 dataclasses_json==0.6.7 nvidia-ml-py==11.525.84 2025-08-14T21:32:30.6340282Z python3 -m pip install psutil==5.9.8 dataclasses_json==0.6.7 nvidia-ml-py==11.525.84 2025-08-14T21:32:30.6340871Z python3 -m tools.stats.monitor --log-interval "$MONITOR_LOG_INTERVAL" --data-collect-interval "$MONITOR_DATA_COLLECT_INTERVAL" > usage_log.txt 2>&1 & 2025-08-14T21:32:30.6341590Z echo "monitor-script-pid=${!}" >> "${GITHUB_OUTPUT}" 2025-08-14T21:32:30.6347129Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:30.6347388Z env: 2025-08-14T21:32:30.6347558Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:30.6347738Z JOB_ID: 48128261038 2025-08-14T21:32:30.6348115Z JOB_NAME: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:30.6348522Z WORKFLOW_NAME: inductor-periodic 2025-08-14T21:32:30.6348788Z WORKFLOW_RUN_ID: 16976338999 2025-08-14T21:32:30.6348990Z MONITOR_LOG_INTERVAL: 5 2025-08-14T21:32:30.6349183Z MONITOR_DATA_COLLECT_INTERVAL: 1 2025-08-14T21:32:30.6349386Z ##[endgroup] 2025-08-14T21:32:31.3701422Z Defaulting to user installation because normal site-packages is not writeable 2025-08-14T21:32:31.6590193Z Collecting psutil==5.9.8 2025-08-14T21:32:31.6764401Z Downloading psutil-5.9.8-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (288 kB) 2025-08-14T21:32:31.7674382Z Collecting dataclasses_json==0.6.7 2025-08-14T21:32:31.7716424Z Downloading dataclasses_json-0.6.7-py3-none-any.whl (28 kB) 2025-08-14T21:32:31.8078867Z Collecting nvidia-ml-py==11.525.84 
2025-08-14T21:32:31.8134848Z Downloading nvidia_ml_py-11.525.84-py3-none-any.whl (34 kB) 2025-08-14T21:32:31.9233615Z Collecting marshmallow<4.0.0,>=3.18.0 2025-08-14T21:32:31.9269684Z Downloading marshmallow-3.26.1-py3-none-any.whl (50 kB) 2025-08-14T21:32:31.9887798Z Collecting typing-inspect<1,>=0.4.0 2025-08-14T21:32:31.9922261Z Downloading typing_inspect-0.9.0-py3-none-any.whl (8.8 kB) 2025-08-14T21:32:32.1176558Z Collecting packaging>=17.0 2025-08-14T21:32:32.1210333Z Downloading packaging-25.0-py3-none-any.whl (66 kB) 2025-08-14T21:32:32.2465350Z Collecting typing-extensions>=3.7.4 2025-08-14T21:32:32.2501192Z Downloading typing_extensions-4.14.1-py3-none-any.whl (43 kB) 2025-08-14T21:32:32.3844845Z Collecting mypy-extensions>=0.3.0 2025-08-14T21:32:32.3881508Z Downloading mypy_extensions-1.1.0-py3-none-any.whl (5.0 kB) 2025-08-14T21:32:32.6505638Z Installing collected packages: typing-extensions, packaging, mypy-extensions, typing-inspect, marshmallow, psutil, nvidia-ml-py, dataclasses-json 2025-08-14T21:32:33.0364260Z Successfully installed dataclasses-json-0.6.7 marshmallow-3.26.1 mypy-extensions-1.1.0 nvidia-ml-py-11.525.84 packaging-25.0 psutil-5.9.8 typing-extensions-4.14.1 typing-inspect-0.9.0 2025-08-14T21:32:33.2776776Z Prepare all required actions 2025-08-14T21:32:33.2777159Z Getting action download info 2025-08-14T21:32:33.4494348Z Download action repository 'seemethere/download-artifact-s3@v4' (SHA:1da556a7aa0a088e3153970611f6c432d58e80e6) 2025-08-14T21:32:34.0147358Z Download action repository 'actions/download-artifact@v4' (SHA:d3f86a106a0bac45b974a628896c90dbdf5c8093) 2025-08-14T21:32:36.4505075Z ##[group]Run ./.github/actions/download-build-artifacts 2025-08-14T21:32:36.4505365Z with: 2025-08-14T21:32:36.4505562Z name: linux-jammy-py3.9-gcc11-build 2025-08-14T21:32:36.4505789Z s3-bucket: gha-artifacts 2025-08-14T21:32:36.4505986Z env: 2025-08-14T21:32:36.4506151Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:36.4506332Z ##[endgroup] 2025-08-14T21:32:36.4551008Z ##[group]Run seemethere/download-artifact-s3@v4 2025-08-14T21:32:36.4551275Z with: 2025-08-14T21:32:36.4551479Z name: linux-jammy-py3.9-gcc11-build 2025-08-14T21:32:36.4551729Z s3-bucket: gha-artifacts 2025-08-14T21:32:36.4552002Z region: us-east-1 2025-08-14T21:32:36.4552182Z env: 2025-08-14T21:32:36.4552358Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:36.4552566Z ##[endgroup] 2025-08-14T21:32:37.3111547Z (node:48007) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-08-14T21:32:37.3113881Z 2025-08-14T21:32:37.3114136Z Please migrate your code to use AWS SDK for JavaScript (v3). 
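The monitoring setup a few steps back boils down to: flag whether the runner is already containerized, install the monitor's dependencies, start the monitor in the background, and publish its PID through GITHUB_OUTPUT so a later step can presumably stop it and upload usage_log.txt. A minimal sketch of that sequence follows; it assumes it runs inside a GitHub Actions step (GITHUB_OUTPUT set by the runner) from the repository root where tools/stats/monitor lives, and the intervals are the example values shown above.

#!/usr/bin/env bash
# Sketch only: restates the container check and monitor launch shown in the log.
set -euo pipefail

# Flag whether this runner is already containerized (ARC-style marker files).
if [ -f /.inarc ] || [ -f /.incontainer ]; then
  echo "IN_CONTAINER_RUNNER=true"  >> "${GITHUB_OUTPUT}"
else
  echo "IN_CONTAINER_RUNNER=false" >> "${GITHUB_OUTPUT}"
fi

# Install the monitor's dependencies (user site-packages, since the system one is read-only here).
python3 -m pip install psutil==5.9.8 dataclasses_json==0.6.7 nvidia-ml-py==11.525.84

# Run the usage monitor in the background, redirect its output to a log file,
# and expose the background PID to later steps via GITHUB_OUTPUT.
python3 -m tools.stats.monitor \
  --log-interval 5 \
  --data-collect-interval 1 > usage_log.txt 2>&1 &
echo "monitor-script-pid=${!}" >> "${GITHUB_OUTPUT}"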
2025-08-14T21:32:37.3114530Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-08-14T21:32:37.3115301Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-08-14T21:32:38.5360460Z Found 1 objects with prefix pytorch/pytorch/16976338999/linux-jammy-py3.9-gcc11-build/ 2025-08-14T21:32:38.5360974Z Starting download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-08-14T21:32:44.0468566Z Finished download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-08-14T21:32:44.0474677Z Artifact download has finished successfully 2025-08-14T21:32:44.0657159Z ##[group]Run unzip -o artifacts.zip 2025-08-14T21:32:44.0657427Z unzip -o artifacts.zip 2025-08-14T21:32:44.0662361Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:44.0662620Z env: 2025-08-14T21:32:44.0662789Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:44.0662982Z ##[endgroup] 2025-08-14T21:32:44.0727252Z Archive: artifacts.zip 2025-08-14T21:32:44.0727520Z creating: dist/ 2025-08-14T21:32:45.2521080Z inflating: dist/torch-2.9.0a0+git1fc683c-cp39-cp39-linux_x86_64.whl 2025-08-14T21:32:45.2522118Z creating: dist/vision/ 2025-08-14T21:32:45.2602986Z inflating: dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-08-14T21:32:45.2603408Z creating: dist/audio/ 2025-08-14T21:32:45.2715761Z inflating: dist/audio/torchaudio-2.8.0a0+bdb88e1-cp39-cp39-linux_x86_64.whl 2025-08-14T21:32:45.2721064Z creating: dist/ao/ 2025-08-14T21:32:45.2753644Z inflating: dist/ao/torchao-0.7.0+git51c87b6e-py3-none-any.whl 2025-08-14T21:32:45.2878113Z inflating: dist/.ninja_log 2025-08-14T21:32:45.2878457Z creating: build/custom_test_artifacts/ 2025-08-14T21:32:45.2878755Z creating: build/custom_test_artifacts/custom-op-build/ 2025-08-14T21:32:45.2879108Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/ 2025-08-14T21:32:45.2879555Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/pkgRedirects/ 2025-08-14T21:32:45.2880433Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeConfigureLog.yaml 2025-08-14T21:32:45.2880925Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/ 2025-08-14T21:32:45.2881408Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-08-14T21:32:45.2881915Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-08-14T21:32:45.2882775Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-08-14T21:32:45.2884272Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-08-14T21:32:45.2885233Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-08-14T21:32:45.2885865Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-08-14T21:32:45.2886351Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-08-14T21:32:45.2886806Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-08-14T21:32:45.2889385Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-08-14T21:32:45.2890180Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-08-14T21:32:45.2891173Z inflating: 
build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-08-14T21:32:45.2892189Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-08-14T21:32:45.2894025Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-08-14T21:32:45.2894525Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeScratch/ 2025-08-14T21:32:45.2894960Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/cmake.check_cache 2025-08-14T21:32:45.2895604Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/ 2025-08-14T21:32:45.2896083Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.ts 2025-08-14T21:32:45.2896612Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.make 2025-08-14T21:32:45.2897125Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/depend.make 2025-08-14T21:32:45.2897606Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/link.txt 2025-08-14T21:32:45.2898109Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/cmake_clean.cmake 2025-08-14T21:32:45.2898602Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/build.make 2025-08-14T21:32:45.2899086Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/DependInfo.cmake 2025-08-14T21:32:45.2899581Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/flags.make 2025-08-14T21:32:45.2900049Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/progress.make 2025-08-14T21:32:45.2919122Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o.d 2025-08-14T21:32:45.3104891Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o 2025-08-14T21:32:45.3105472Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/ 2025-08-14T21:32:45.3106004Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.ts 2025-08-14T21:32:45.3106578Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.make 2025-08-14T21:32:45.3107118Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/depend.make 2025-08-14T21:32:45.3107635Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/link.txt 2025-08-14T21:32:45.3108141Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/cmake_clean.cmake 2025-08-14T21:32:45.3108740Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/build.make 2025-08-14T21:32:45.3109732Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/DependInfo.cmake 2025-08-14T21:32:45.3110281Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/flags.make 2025-08-14T21:32:45.3110792Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/progress.make 2025-08-14T21:32:45.3128576Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o.d 2025-08-14T21:32:45.3207462Z inflating: 
build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o 2025-08-14T21:32:45.3208111Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-08-14T21:32:45.3208642Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/TargetDirectories.txt 2025-08-14T21:32:45.3209182Z extracting: build/custom_test_artifacts/custom-op-build/CMakeFiles/progress.marks 2025-08-14T21:32:45.3209649Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile2 2025-08-14T21:32:45.3210077Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile.cmake 2025-08-14T21:32:45.3210523Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/InstallScripts.json 2025-08-14T21:32:45.3210954Z inflating: build/custom_test_artifacts/custom-op-build/CMakeCache.txt 2025-08-14T21:32:45.3211855Z inflating: build/custom_test_artifacts/custom-op-build/Makefile 2025-08-14T21:32:45.3212593Z inflating: build/custom_test_artifacts/custom-op-build/cmake_install.cmake 2025-08-14T21:32:45.3375467Z inflating: build/custom_test_artifacts/custom-op-build/libcustom_ops.so 2025-08-14T21:32:45.3431917Z inflating: build/custom_test_artifacts/custom-op-build/test_custom_ops 2025-08-14T21:32:45.3432516Z creating: build/custom_test_artifacts/jit-hook-build/ 2025-08-14T21:32:45.3432982Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/ 2025-08-14T21:32:45.3433558Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/pkgRedirects/ 2025-08-14T21:32:45.3434188Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeConfigureLog.yaml 2025-08-14T21:32:45.3434618Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/ 2025-08-14T21:32:45.3435055Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-08-14T21:32:45.3435512Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-08-14T21:32:45.3435975Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-08-14T21:32:45.3436458Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-08-14T21:32:45.3436951Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-08-14T21:32:45.3437422Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-08-14T21:32:45.3437885Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-08-14T21:32:45.3438327Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-08-14T21:32:45.3439020Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-08-14T21:32:45.3440135Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-08-14T21:32:45.3440725Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-08-14T21:32:45.3441404Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-08-14T21:32:45.3442512Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-08-14T21:32:45.3443147Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeScratch/ 2025-08-14T21:32:45.3444040Z inflating: 
build/custom_test_artifacts/jit-hook-build/CMakeFiles/cmake.check_cache 2025-08-14T21:32:45.3444499Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/ 2025-08-14T21:32:45.3445002Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.ts 2025-08-14T21:32:45.3445538Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.make 2025-08-14T21:32:45.3446076Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/depend.make 2025-08-14T21:32:45.3446564Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/link.txt 2025-08-14T21:32:45.3447089Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/cmake_clean.cmake 2025-08-14T21:32:45.3447585Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/build.make 2025-08-14T21:32:45.3448109Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/DependInfo.cmake 2025-08-14T21:32:45.3448661Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/flags.make 2025-08-14T21:32:45.3449430Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/progress.make 2025-08-14T21:32:45.3468134Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o.d 2025-08-14T21:32:45.3529793Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o 2025-08-14T21:32:45.3530389Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-08-14T21:32:45.3530890Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/TargetDirectories.txt 2025-08-14T21:32:45.3531336Z extracting: build/custom_test_artifacts/jit-hook-build/CMakeFiles/progress.marks 2025-08-14T21:32:45.3535736Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile2 2025-08-14T21:32:45.3540141Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile.cmake 2025-08-14T21:32:45.3540663Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/InstallScripts.json 2025-08-14T21:32:45.3541103Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeCache.txt 2025-08-14T21:32:45.3541470Z inflating: build/custom_test_artifacts/jit-hook-build/Makefile 2025-08-14T21:32:45.3541878Z inflating: build/custom_test_artifacts/jit-hook-build/cmake_install.cmake 2025-08-14T21:32:45.3577840Z inflating: build/custom_test_artifacts/jit-hook-build/test_jit_hooks 2025-08-14T21:32:45.3581884Z creating: build/custom_test_artifacts/custom-backend-build/ 2025-08-14T21:32:45.3582318Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/ 2025-08-14T21:32:45.3582776Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/pkgRedirects/ 2025-08-14T21:32:45.3583284Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeConfigureLog.yaml 2025-08-14T21:32:45.3583799Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/ 2025-08-14T21:32:45.3584271Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-08-14T21:32:45.3584769Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-08-14T21:32:45.3585267Z creating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-08-14T21:32:45.3585817Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-08-14T21:32:45.3586362Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-08-14T21:32:45.3587205Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-08-14T21:32:45.3587721Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-08-14T21:32:45.3588209Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-08-14T21:32:45.3588781Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-08-14T21:32:45.3589354Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-08-14T21:32:45.3589877Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-08-14T21:32:45.3590443Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-08-14T21:32:45.3591036Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-08-14T21:32:45.3591579Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeScratch/ 2025-08-14T21:32:45.3592053Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/cmake.check_cache 2025-08-14T21:32:45.3592525Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/ 2025-08-14T21:32:45.3593039Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.ts 2025-08-14T21:32:45.3593618Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.make 2025-08-14T21:32:45.3594319Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/depend.make 2025-08-14T21:32:45.3594845Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/link.txt 2025-08-14T21:32:45.3595388Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/cmake_clean.cmake 2025-08-14T21:32:45.3595944Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/build.make 2025-08-14T21:32:45.3596502Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/DependInfo.cmake 2025-08-14T21:32:45.3597057Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/flags.make 2025-08-14T21:32:45.3597590Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/progress.make 2025-08-14T21:32:45.3598351Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o.d 2025-08-14T21:32:45.3716294Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o 2025-08-14T21:32:45.3716946Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/ 2025-08-14T21:32:45.3717542Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.ts 2025-08-14T21:32:45.3718167Z inflating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.make 2025-08-14T21:32:45.3718763Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/depend.make 2025-08-14T21:32:45.3719319Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/link.txt 2025-08-14T21:32:45.3719890Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/cmake_clean.cmake 2025-08-14T21:32:45.3720453Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/build.make 2025-08-14T21:32:45.3721019Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/DependInfo.cmake 2025-08-14T21:32:45.3721931Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/flags.make 2025-08-14T21:32:45.3722495Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/progress.make 2025-08-14T21:32:45.3735695Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o.d 2025-08-14T21:32:45.3788097Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o 2025-08-14T21:32:45.3788820Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-08-14T21:32:45.3789380Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/TargetDirectories.txt 2025-08-14T21:32:45.3789863Z extracting: build/custom_test_artifacts/custom-backend-build/CMakeFiles/progress.marks 2025-08-14T21:32:45.3790321Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile2 2025-08-14T21:32:45.3790798Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile.cmake 2025-08-14T21:32:45.3791262Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/InstallScripts.json 2025-08-14T21:32:45.3791697Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeCache.txt 2025-08-14T21:32:45.3792688Z inflating: build/custom_test_artifacts/custom-backend-build/Makefile 2025-08-14T21:32:45.3793147Z inflating: build/custom_test_artifacts/custom-backend-build/cmake_install.cmake 2025-08-14T21:32:45.3893762Z inflating: build/custom_test_artifacts/custom-backend-build/libcustom_backend.so 2025-08-14T21:32:45.3932797Z inflating: build/custom_test_artifacts/custom-backend-build/test_custom_backend 2025-08-14T21:32:45.3938373Z creating: build/lib/ 2025-08-14T21:32:45.4017427Z inflating: build/lib/libprotobuf-lite.a 2025-08-14T21:32:45.4457711Z inflating: build/lib/libprotobuf.a 2025-08-14T21:32:45.4950752Z inflating: build/lib/libprotoc.a 2025-08-14T21:32:45.4959306Z inflating: build/lib/libpthreadpool.a 2025-08-14T21:32:45.4967649Z inflating: build/lib/libcpuinfo.a 2025-08-14T21:32:45.4978734Z inflating: build/lib/libcpuinfo_internals.a 2025-08-14T21:32:45.4980770Z inflating: build/lib/libclog.a 2025-08-14T21:32:45.4994697Z inflating: build/lib/libpytorch_qnnpack.a 2025-08-14T21:32:45.4996821Z inflating: build/lib/libnnpack_reference_layers.a 2025-08-14T21:32:45.5186285Z inflating: build/lib/libmicrokernels-prod.a 2025-08-14T21:32:45.5203815Z inflating: build/lib/libnnpack.a 2025-08-14T21:32:45.6083560Z inflating: build/lib/libmicrokernels-all.a 2025-08-14T21:32:45.6154215Z inflating: build/lib/libgtest.a 
2025-08-14T21:32:45.6173291Z inflating: build/lib/libgmock.a 2025-08-14T21:32:45.6173636Z inflating: build/lib/libgmock_main.a 2025-08-14T21:32:45.6173880Z inflating: build/lib/libgtest_main.a 2025-08-14T21:32:45.6263609Z inflating: build/lib/libXNNPACK.a 2025-08-14T21:32:45.6339267Z inflating: build/lib/libbenchmark.a 2025-08-14T21:32:45.6345447Z inflating: build/lib/libbenchmark_main.a 2025-08-14T21:32:45.6349828Z inflating: build/lib/libjitprofiling.a 2025-08-14T21:32:45.6409991Z inflating: build/lib/libasmjit.a 2025-08-14T21:32:45.6415807Z inflating: build/lib/libittnotify.a 2025-08-14T21:32:45.7553771Z inflating: build/lib/libfbgemm.a 2025-08-14T21:32:45.7584281Z inflating: build/lib/libtensorpipe_uv.a 2025-08-14T21:32:45.8123037Z inflating: build/lib/libtensorpipe.a 2025-08-14T21:32:45.8243844Z inflating: build/lib/libgloo.a 2025-08-14T21:32:45.8288843Z inflating: build/lib/libonnx_proto.a 2025-08-14T21:32:45.8989125Z inflating: build/lib/libonnx.a 2025-08-14T21:32:46.9017164Z inflating: build/lib/libdnnl.a 2025-08-14T21:32:46.9036620Z inflating: build/lib/libfmt.a 2025-08-14T21:32:46.9300125Z inflating: build/lib/libkineto.a 2025-08-14T21:32:46.9410691Z inflating: build/lib/libc10.so 2025-08-14T21:32:46.9411439Z inflating: build/lib/libtorch_global_deps.so 2025-08-14T21:32:49.9300646Z inflating: build/lib/libtorch_cpu.so 2025-08-14T21:32:49.9300976Z inflating: build/lib/libtorch.so 2025-08-14T21:32:49.9373249Z inflating: build/lib/libtorchbind_test.so 2025-08-14T21:32:49.9391546Z inflating: build/lib/libjitbackend_test.so 2025-08-14T21:32:49.9418483Z inflating: build/lib/libbackend_with_compiler.so 2025-08-14T21:32:49.9442453Z inflating: build/lib/libaoti_custom_ops.so 2025-08-14T21:32:49.9446274Z inflating: build/lib/libshm.so 2025-08-14T21:32:50.1440946Z inflating: build/lib/libtorch_python.so 2025-08-14T21:32:50.1472576Z inflating: build/lib/libnnapi_backend.so 2025-08-14T21:32:50.1472874Z creating: build/bin/ 2025-08-14T21:32:50.1473089Z creating: build/bin/CMakeFiles/ 2025-08-14T21:32:50.1473331Z inflating: build/bin/cmake_install.cmake 2025-08-14T21:32:50.1473571Z inflating: build/bin/CTestTestfile.cmake 2025-08-14T21:32:50.1939401Z inflating: build/bin/protoc-3.13.0.0 2025-08-14T21:32:50.2402205Z inflating: build/bin/protoc 2025-08-14T21:32:50.2465971Z inflating: build/bin/c10_AllocatorConfig_test 2025-08-14T21:32:50.2517251Z inflating: build/bin/c10_CompileTimeFunctionPointer_test 2025-08-14T21:32:50.2573415Z inflating: build/bin/c10_DeviceGuard_test 2025-08-14T21:32:50.2630539Z inflating: build/bin/c10_Device_test 2025-08-14T21:32:50.2684940Z inflating: build/bin/c10_StreamGuard_test 2025-08-14T21:32:50.2751262Z inflating: build/bin/c10_DispatchKeySet_test 2025-08-14T21:32:50.2807736Z inflating: build/bin/c10_SymInt_test 2025-08-14T21:32:50.2868427Z inflating: build/bin/c10_Scalar_test 2025-08-14T21:32:50.2928943Z inflating: build/bin/c10_InlineDeviceGuard_test 2025-08-14T21:32:50.2992272Z inflating: build/bin/c10_InlineStreamGuard_test 2025-08-14T21:32:50.3053302Z inflating: build/bin/c10_SizesAndStrides_test 2025-08-14T21:32:50.3112149Z inflating: build/bin/c10_Bitset_test 2025-08-14T21:32:50.3190179Z inflating: build/bin/c10_cow_test 2025-08-14T21:32:50.3245525Z inflating: build/bin/c10_ArrayRef_test 2025-08-14T21:32:50.3300854Z inflating: build/bin/c10_ConstexprCrc_test 2025-08-14T21:32:50.3358646Z inflating: build/bin/c10_DeadlockDetection_test 2025-08-14T21:32:50.3420755Z inflating: build/bin/c10_Enumerate_test 2025-08-14T21:32:50.3477682Z inflating: 
build/bin/c10_Half_test 2025-08-14T21:32:50.3540568Z inflating: build/bin/c10_IntrusiveList_test 2025-08-14T21:32:50.3599408Z inflating: build/bin/c10_LeftRight_test 2025-08-14T21:32:50.3663604Z inflating: build/bin/c10_Metaprogramming_test 2025-08-14T21:32:50.3724195Z inflating: build/bin/c10_NetworkFlow_test 2025-08-14T21:32:50.3780960Z inflating: build/bin/c10_Synchronized_test 2025-08-14T21:32:50.3834589Z inflating: build/bin/c10_Semaphore_test 2025-08-14T21:32:50.3891789Z inflating: build/bin/c10_TypeIndex_test 2025-08-14T21:32:50.3956497Z inflating: build/bin/c10_ThreadLocal_test 2025-08-14T21:32:50.4010881Z inflating: build/bin/c10_TypeList_test 2025-08-14T21:32:50.4068315Z inflating: build/bin/c10_TypeTraits_test 2025-08-14T21:32:50.4123342Z inflating: build/bin/c10_accumulate_test 2025-08-14T21:32:50.4189745Z inflating: build/bin/c10_bfloat16_test 2025-08-14T21:32:50.4247944Z inflating: build/bin/c10_complex_test 2025-08-14T21:32:50.4310948Z inflating: build/bin/c10_complex_math_test 2025-08-14T21:32:50.4367392Z inflating: build/bin/c10_bit_cast_test 2025-08-14T21:32:50.4425172Z inflating: build/bin/c10_error_test 2025-08-14T21:32:50.4481461Z inflating: build/bin/c10_exception_test 2025-08-14T21:32:50.4539176Z inflating: build/bin/c10_flags_test 2025-08-14T21:32:50.4599287Z inflating: build/bin/c10_irange_test 2025-08-14T21:32:50.4652957Z inflating: build/bin/c10_generic_math_test 2025-08-14T21:32:50.4827824Z inflating: build/bin/c10_intrusive_ptr_test 2025-08-14T21:32:50.4884346Z inflating: build/bin/c10_lazy_test 2025-08-14T21:32:50.4947708Z inflating: build/bin/c10_logging_test 2025-08-14T21:32:50.5016349Z inflating: build/bin/c10_ordered_preserving_dict_test 2025-08-14T21:32:50.5098527Z inflating: build/bin/c10_optional_test 2025-08-14T21:32:50.5162844Z inflating: build/bin/c10_registry_test 2025-08-14T21:32:50.5321890Z inflating: build/bin/c10_small_vector_test 2025-08-14T21:32:50.5384412Z inflating: build/bin/c10_string_util_test 2025-08-14T21:32:50.5443294Z inflating: build/bin/c10_ssize_test 2025-08-14T21:32:50.5498845Z inflating: build/bin/c10_string_view_test 2025-08-14T21:32:50.5552847Z inflating: build/bin/c10_tempfile_test 2025-08-14T21:32:50.5618253Z inflating: build/bin/c10_typeid_test 2025-08-14T21:32:50.5664300Z inflating: build/bin/c10_intrusive_ptr_benchmark 2025-08-14T21:32:50.6264027Z inflating: build/bin/vec_test_all_types_DEFAULT 2025-08-14T21:32:50.6877594Z inflating: build/bin/vec_test_all_types_AVX512 2025-08-14T21:32:50.7503794Z inflating: build/bin/vec_test_all_types_AVX2 2025-08-14T21:32:50.7561437Z inflating: build/bin/static_runtime_bench 2025-08-14T21:32:50.7823400Z inflating: build/bin/static_runtime_test 2025-08-14T21:32:50.7904501Z inflating: build/bin/Dict_test 2025-08-14T21:32:50.7963125Z inflating: build/bin/Dimname_test 2025-08-14T21:32:50.8034692Z inflating: build/bin/MaybeOwned_test 2025-08-14T21:32:50.8098667Z inflating: build/bin/NamedTensor_test 2025-08-14T21:32:50.8164120Z inflating: build/bin/apply_utils_test 2025-08-14T21:32:50.8229893Z inflating: build/bin/atest 2025-08-14T21:32:50.8300805Z inflating: build/bin/basic 2025-08-14T21:32:50.8364481Z inflating: build/bin/broadcast_test 2025-08-14T21:32:50.8423365Z inflating: build/bin/cpu_allocator_test 2025-08-14T21:32:50.8485341Z inflating: build/bin/cpu_generator_test 2025-08-14T21:32:50.8544671Z inflating: build/bin/cpu_profiling_allocator_test 2025-08-14T21:32:50.8641612Z inflating: build/bin/cpu_rng_test 2025-08-14T21:32:50.8698819Z inflating: build/bin/dlconvertor_test 
2025-08-14T21:32:50.8762913Z inflating: build/bin/extension_backend_test 2025-08-14T21:32:50.8825870Z inflating: build/bin/half_test 2025-08-14T21:32:50.8931044Z inflating: build/bin/ivalue_test 2025-08-14T21:32:50.8987828Z inflating: build/bin/lazy_tensor_test 2025-08-14T21:32:50.9048318Z inflating: build/bin/math_kernel_test 2025-08-14T21:32:50.9108650Z inflating: build/bin/memory_format_test 2025-08-14T21:32:50.9167552Z inflating: build/bin/memory_overlapping_test 2025-08-14T21:32:50.9226275Z inflating: build/bin/mobile_memory_cleanup 2025-08-14T21:32:50.9288491Z inflating: build/bin/native_test 2025-08-14T21:32:50.9345640Z inflating: build/bin/operator_name_test 2025-08-14T21:32:50.9401259Z inflating: build/bin/operators_test 2025-08-14T21:32:50.9461972Z inflating: build/bin/packedtensoraccessor_test 2025-08-14T21:32:50.9536023Z inflating: build/bin/pow_test 2025-08-14T21:32:50.9595792Z inflating: build/bin/quantized_test 2025-08-14T21:32:50.9651837Z inflating: build/bin/reduce_ops_test 2025-08-14T21:32:50.9709085Z inflating: build/bin/reportMemoryUsage_test 2025-08-14T21:32:50.9775973Z inflating: build/bin/scalar_tensor_test 2025-08-14T21:32:50.9841816Z inflating: build/bin/scalar_test 2025-08-14T21:32:50.9897873Z inflating: build/bin/StorageUtils_test 2025-08-14T21:32:50.9954938Z inflating: build/bin/stride_properties_test 2025-08-14T21:32:51.0040556Z inflating: build/bin/tensor_iterator_test 2025-08-14T21:32:51.0106509Z inflating: build/bin/test_parallel 2025-08-14T21:32:51.0159756Z inflating: build/bin/thread_init_test 2025-08-14T21:32:51.0224001Z inflating: build/bin/type_ptr_test 2025-08-14T21:32:51.0290762Z inflating: build/bin/type_test 2025-08-14T21:32:51.0349431Z inflating: build/bin/undefined_tensor_test 2025-08-14T21:32:51.0405059Z inflating: build/bin/verify_api_visibility 2025-08-14T21:32:51.0482670Z inflating: build/bin/legacy_vmap_test 2025-08-14T21:32:51.0541977Z inflating: build/bin/weakref_test 2025-08-14T21:32:51.0597624Z inflating: build/bin/wrapdim_test 2025-08-14T21:32:51.0657317Z inflating: build/bin/xla_tensor_test 2025-08-14T21:32:51.0720482Z inflating: build/bin/IListRef_test 2025-08-14T21:32:51.0834361Z inflating: build/bin/List_test 2025-08-14T21:32:51.0906134Z inflating: build/bin/KernelFunction_test 2025-08-14T21:32:51.1034988Z inflating: build/bin/kernel_function_legacy_test 2025-08-14T21:32:51.1137863Z inflating: build/bin/kernel_function_test 2025-08-14T21:32:51.1272345Z inflating: build/bin/kernel_lambda_legacy_test 2025-08-14T21:32:51.1382106Z inflating: build/bin/kernel_lambda_test 2025-08-14T21:32:51.1449420Z inflating: build/bin/kernel_stackbased_test 2025-08-14T21:32:51.1552299Z inflating: build/bin/make_boxed_from_unboxed_functor_test 2025-08-14T21:32:51.1610338Z inflating: build/bin/CppSignature_test 2025-08-14T21:32:51.1672311Z inflating: build/bin/backend_fallback_test 2025-08-14T21:32:51.1727629Z inflating: build/bin/op_allowlist_test 2025-08-14T21:32:51.2048491Z inflating: build/bin/op_registration_test 2025-08-14T21:32:51.2123219Z inflating: build/bin/inline_container_test 2025-08-14T21:32:51.3254560Z inflating: build/bin/test_jit 2025-08-14T21:32:51.3587969Z inflating: build/bin/test_nativert 2025-08-14T21:32:51.3644623Z inflating: build/bin/BackoffTest 2025-08-14T21:32:51.3702047Z inflating: build/bin/FileStoreTest 2025-08-14T21:32:51.3765598Z inflating: build/bin/TCPStoreTest 2025-08-14T21:32:51.3828506Z inflating: build/bin/HashStoreTest 2025-08-14T21:32:51.3902210Z inflating: build/bin/ProcessGroupGlooTest 2025-08-14T21:32:51.3907337Z 
inflating: build/bin/example_allreduce 2025-08-14T21:32:51.3963611Z inflating: build/bin/test_dist_autograd 2025-08-14T21:32:51.4038338Z inflating: build/bin/test_cpp_rpc 2025-08-14T21:32:51.5201557Z inflating: build/bin/test_api 2025-08-14T21:32:51.5202209Z inflating: build/bin/parallel_benchmark 2025-08-14T21:32:51.5554002Z inflating: build/bin/test_lazy 2025-08-14T21:32:51.5557128Z inflating: build/bin/torch_shm_manager 2025-08-14T21:32:51.5557422Z creating: .additional_ci_files/ 2025-08-14T21:32:51.5635777Z inflating: .additional_ci_files/test-times.json 2025-08-14T21:32:51.5941001Z inflating: .additional_ci_files/test-class-times.json 2025-08-14T21:32:51.5996076Z ##[group]Run rm artifacts.zip 2025-08-14T21:32:51.5996347Z rm artifacts.zip 2025-08-14T21:32:51.6001346Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:51.6001606Z env: 2025-08-14T21:32:51.6001776Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:51.6001973Z ##[endgroup] 2025-08-14T21:32:51.6314649Z ##[group]Run df -H 2025-08-14T21:32:51.6314861Z df -H 2025-08-14T21:32:51.6319949Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:51.6320207Z env: 2025-08-14T21:32:51.6320401Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:51.6320596Z ##[endgroup] 2025-08-14T21:32:51.6361608Z Filesystem Size Used Avail Use% Mounted on 2025-08-14T21:32:51.6364055Z devtmpfs 4.2M 0 4.2M 0% /dev 2025-08-14T21:32:51.6364407Z tmpfs 67G 0 67G 0% /dev/shm 2025-08-14T21:32:51.6364649Z tmpfs 27G 791k 27G 1% /run 2025-08-14T21:32:51.6364882Z /dev/nvme0n1p1 215G 69G 147G 32% / 2025-08-14T21:32:51.6365145Z tmpfs 67G 13k 67G 1% /tmp 2025-08-14T21:32:51.6365418Z /dev/nvme0n1p128 11M 1.4M 9.2M 13% /boot/efi 2025-08-14T21:32:51.6394586Z Prepare all required actions 2025-08-14T21:32:51.6395569Z Getting action download info 2025-08-14T21:32:51.8075672Z ##[group]Run ./.github/actions/download-td-artifacts 2025-08-14T21:32:51.8075995Z with: 2025-08-14T21:32:51.8076189Z env: 2025-08-14T21:32:51.8076394Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:51.8076625Z ##[endgroup] 2025-08-14T21:32:51.8190500Z ##[group]Run seemethere/download-artifact-s3@v4 2025-08-14T21:32:51.8190762Z with: 2025-08-14T21:32:51.8190921Z name: td_results 2025-08-14T21:32:51.8191109Z s3-bucket: gha-artifacts 2025-08-14T21:32:51.8191307Z region: us-east-1 2025-08-14T21:32:51.8191470Z env: 2025-08-14T21:32:51.8191638Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:51.8191823Z ##[endgroup] 2025-08-14T21:32:52.1931023Z (node:48028) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-08-14T21:32:52.1931624Z 2025-08-14T21:32:52.1931945Z Please migrate your code to use AWS SDK for JavaScript (v3). 
2025-08-14T21:32:52.1932342Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-08-14T21:32:52.1932843Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-08-14T21:32:52.2738470Z Found 0 objects with prefix pytorch/pytorch/16976338999/td_results/ 2025-08-14T21:32:52.2745052Z Artifact download has finished successfully 2025-08-14T21:32:52.5228902Z ##[group]Run mkdir -p .additional_ci_files 2025-08-14T21:32:52.5229190Z mkdir -p .additional_ci_files 2025-08-14T21:32:52.5229486Z mv td_results.json .additional_ci_files/td_results.json || true 2025-08-14T21:32:52.5234667Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:52.5234922Z env: 2025-08-14T21:32:52.5235091Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:52.5235272Z ##[endgroup] 2025-08-14T21:32:52.5287702Z mv: cannot stat 'td_results.json': No such file or directory 2025-08-14T21:32:52.5355607Z ##[group]Run .github/scripts/parse_ref.py 2025-08-14T21:32:52.5355892Z .github/scripts/parse_ref.py 2025-08-14T21:32:52.5360836Z shell: /usr/bin/bash -e {0} 2025-08-14T21:32:52.5361033Z env: 2025-08-14T21:32:52.5361211Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:52.5361395Z ##[endgroup] 2025-08-14T21:32:52.5987494Z Setting output branch=main 2025-08-14T21:32:52.6093307Z Prepare all required actions 2025-08-14T21:32:52.6093683Z Getting action download info 2025-08-14T21:32:52.7463560Z ##[group]Run ./.github/actions/filter-test-configs 2025-08-14T21:32:52.7463822Z with: 2025-08-14T21:32:52.7464292Z github-token: *** 2025-08-14T21:32:52.7468607Z test-matrix: {"include": [{"config": "cpu_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 
2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}]} 2025-08-14T21:32:52.7473416Z job-name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:52.7473894Z env: 2025-08-14T21:32:52.7474086Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:52.7474310Z ##[endgroup] 2025-08-14T21:32:52.7532316Z ##[group]Run nick-fields/retry@v3.0.0 2025-08-14T21:32:52.7532555Z with: 2025-08-14T21:32:52.7532746Z shell: bash 2025-08-14T21:32:52.7532927Z timeout_minutes: 10 2025-08-14T21:32:52.7533106Z max_attempts: 5 2025-08-14T21:32:52.7533293Z retry_wait_seconds: 30 2025-08-14T21:32:52.7533829Z command: set -eux # PyYAML 6.0 doesn't work with MacOS x86 anymore # This must run on Python-3.7 (AmazonLinux2) so can't use request=3.32.2 python3 -m pip install requests==2.27.1 pyyaml==6.0.2 2025-08-14T21:32:52.7534356Z polling_interval_seconds: 1 2025-08-14T21:32:52.7534560Z warning_on_retry: true 2025-08-14T21:32:52.7534755Z continue_on_error: false 2025-08-14T21:32:52.7535077Z env: 2025-08-14T21:32:52.7535243Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:52.7535633Z GITHUB_TOKEN: *** 2025-08-14T21:32:52.7535824Z ##[endgroup] 2025-08-14T21:32:52.9119964Z + python3 -m pip install requests==2.27.1 pyyaml==6.0.2 2025-08-14T21:32:53.1077560Z Defaulting to user installation because normal site-packages is not writeable 2025-08-14T21:32:53.2150075Z Collecting requests==2.27.1 2025-08-14T21:32:53.2300016Z Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB) 2025-08-14T21:32:53.4296221Z Collecting pyyaml==6.0.2 2025-08-14T21:32:53.4335843Z Downloading PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (737 kB) 2025-08-14T21:32:53.4927479Z Collecting certifi>=2017.4.17 2025-08-14T21:32:53.4957501Z Downloading certifi-2025.8.3-py3-none-any.whl (161 kB) 2025-08-14T21:32:53.7584881Z Collecting charset-normalizer~=2.0.0 2025-08-14T21:32:53.7618228Z Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB) 2025-08-14T21:32:53.7672608Z Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (1.25.10) 2025-08-14T21:32:53.7680418Z Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (2.10) 2025-08-14T21:32:53.8341803Z Installing collected packages: charset-normalizer, certifi, requests, pyyaml 2025-08-14T21:32:53.9527705Z Successfully installed certifi-2025.8.3 charset-normalizer-2.0.12 pyyaml-6.0.2 requests-2.27.1 2025-08-14T21:32:54.8209262Z Command completed after 1 attempt(s). 
2025-08-14T21:32:54.8279316Z ##[group]Run set -x 2025-08-14T21:32:54.8279536Z set -x 2025-08-14T21:32:54.8279706Z  2025-08-14T21:32:54.8279972Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-08-14T21:32:54.8280294Z # in runner workspace 2025-08-14T21:32:54.8280570Z python3 "${GITHUB_ACTION_PATH}/../../scripts/parse_ref.py" 2025-08-14T21:32:54.8285681Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:54.8285967Z env: 2025-08-14T21:32:54.8286149Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:54.8286344Z ##[endgroup] 2025-08-14T21:32:54.8316299Z + python3 /home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/filter-test-configs/../../scripts/parse_ref.py 2025-08-14T21:32:54.8466897Z Setting output branch=main 2025-08-14T21:32:54.8516491Z ##[group]Run echo "Workflow: ${GITHUB_WORKFLOW}" 2025-08-14T21:32:54.8516808Z echo "Workflow: ${GITHUB_WORKFLOW}" 2025-08-14T21:32:54.8517049Z echo "Job name: ${JOB_NAME}" 2025-08-14T21:32:54.8517261Z  2025-08-14T21:32:54.8517520Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-08-14T21:32:54.8517826Z # in runner workspace 2025-08-14T21:32:54.8518114Z python3 "${GITHUB_ACTION_PATH}/../../scripts/filter_test_configs.py" \ 2025-08-14T21:32:54.8518436Z  --workflow "${GITHUB_WORKFLOW}" \ 2025-08-14T21:32:54.8518676Z  --job-name "${JOB_NAME}" \ 2025-08-14T21:32:54.8523240Z  --test-matrix "{"include": [{"config": "cpu_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": 
"linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}]}" \ 2025-08-14T21:32:54.8528124Z  --selected-test-configs "" \ 2025-08-14T21:32:54.8528408Z  --pr-number "${PR_NUMBER}" \ 2025-08-14T21:32:54.8528660Z  --tag "${TAG}" \ 2025-08-14T21:32:54.8529229Z  --event-name "${EVENT_NAME}" \ 2025-08-14T21:32:54.8529505Z  --schedule "${SCHEDULE}" \ 2025-08-14T21:32:54.8529757Z  --branch "${HEAD_BRANCH}" 2025-08-14T21:32:54.8535133Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:54.8535429Z env: 2025-08-14T21:32:54.8535615Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:54.8536231Z GITHUB_TOKEN: *** 2025-08-14T21:32:54.8536691Z JOB_NAME: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:54.8537147Z PR_NUMBER: 2025-08-14T21:32:54.8537330Z TAG: 2025-08-14T21:32:54.8537516Z EVENT_NAME: schedule 2025-08-14T21:32:54.8537741Z SCHEDULE: 45 0,4,8,12,16,20 * * 1-5 2025-08-14T21:32:54.8537985Z HEAD_BRANCH: main 2025-08-14T21:32:54.8538187Z ##[endgroup] 2025-08-14T21:32:54.8561377Z Workflow: inductor-periodic 2025-08-14T21:32:54.8561895Z Job name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:55.0185067Z Setting output keep-going=True 2025-08-14T21:32:55.0185421Z Setting output ci-verbose-test-logs=False 2025-08-14T21:32:55.0185671Z Setting output ci-test-showlocals=False 2025-08-14T21:32:55.0185911Z Setting output ci-no-test-timeout=False 2025-08-14T21:32:55.0186241Z Setting output ci-no-td=False 2025-08-14T21:32:55.0186509Z Setting output ci-td-distributed=False 2025-08-14T21:32:55.0186751Z Setting output is-unstable=False 2025-08-14T21:32:55.0186977Z Setting output reenabled-issues= 2025-08-14T21:32:55.0191972Z Setting output test-matrix={"include": [{"config": "cpu_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": 
"linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}]} 2025-08-14T21:32:55.0196817Z Setting output is-test-matrix-empty=False 2025-08-14T21:32:55.0315673Z ##[group]Run echo "Filtered matrix:" 2025-08-14T21:32:55.0315958Z echo "Filtered matrix:" 2025-08-14T21:32:55.0320318Z echo "{"include": [{"config": "cpu_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_timm", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.8xlarge.amx"}, {"config": "dynamic_cpu_aot_inductor_amp_freezing_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.8xlarge.amx"}]}" 2025-08-14T21:32:55.0325257Z  2025-08-14T21:32:55.0325444Z echo 2025-08-14T21:32:55.0325717Z echo "Is the 
current job unstable? False" 2025-08-14T21:32:55.0325996Z  2025-08-14T21:32:55.0326172Z echo 2025-08-14T21:32:55.0326396Z echo "Is keep-going label set? True" 2025-08-14T21:32:55.0326774Z  2025-08-14T21:32:55.0326951Z echo 2025-08-14T21:32:55.0327158Z echo "Reenabled issues? " 2025-08-14T21:32:55.0333501Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:55.0333796Z env: 2025-08-14T21:32:55.0333991Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:55.0334210Z ##[endgroup] 2025-08-14T21:32:55.0360821Z Filtered matrix: 2025-08-14T21:32:55.0365851Z {include: [{config: cpu_inductor_freezing_huggingface, shard: 1, num_shards: 1, runner: linux.8xlarge.amx}, {config: cpu_inductor_freezing_timm, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_freezing_timm, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_amp_freezing_huggingface, shard: 1, num_shards: 1, runner: linux.8xlarge.amx}, {config: cpu_inductor_amp_freezing_timm, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_amp_freezing_timm, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_amp_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_inductor_amp_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_freezing_huggingface, shard: 1, num_shards: 1, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_freezing_timm, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_freezing_timm, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_amp_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: cpu_aot_inductor_amp_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: dynamic_cpu_aot_inductor_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: dynamic_cpu_aot_inductor_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}, {config: dynamic_cpu_aot_inductor_amp_freezing_torchbench, shard: 1, num_shards: 2, runner: linux.8xlarge.amx}, {config: dynamic_cpu_aot_inductor_amp_freezing_torchbench, shard: 2, num_shards: 2, runner: linux.8xlarge.amx}]} 2025-08-14T21:32:55.0370724Z 2025-08-14T21:32:55.0370836Z Is the current job unstable? False 2025-08-14T21:32:55.0371030Z 2025-08-14T21:32:55.0371129Z Is keep-going label set? True 2025-08-14T21:32:55.0371292Z 2025-08-14T21:32:55.0371383Z Reenabled issues? 
2025-08-14T21:32:55.0421408Z ##[group]Run echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-08-14T21:32:55.0421778Z echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-08-14T21:32:55.0426919Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:55.0427188Z env: 2025-08-14T21:32:55.0427367Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:55.0427555Z JOB_TIMEOUT: 240 2025-08-14T21:32:55.0427733Z ##[endgroup] 2025-08-14T21:32:55.0474679Z ##[group]Run env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:32:55.0475050Z env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:32:55.0475355Z env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-08-14T21:32:55.0480028Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T21:32:55.0480408Z env: 2025-08-14T21:32:55.0480578Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:55.0480767Z ##[endgroup] 2025-08-14T21:32:55.0572249Z ##[group]Run set -x 2025-08-14T21:32:55.0572535Z set -x 2025-08-14T21:32:55.0572699Z  2025-08-14T21:32:55.0572896Z if [[ $TEST_CONFIG == 'multigpu' ]]; then 2025-08-14T21:32:55.0573180Z  TEST_COMMAND=.ci/pytorch/multigpu-test.sh 2025-08-14T21:32:55.0573450Z elif [[ $BUILD_ENVIRONMENT == *onnx* ]]; then 2025-08-14T21:32:55.0573714Z  TEST_COMMAND=.ci/onnx/test.sh 2025-08-14T21:32:55.0573922Z else 2025-08-14T21:32:55.0574114Z  TEST_COMMAND=.ci/pytorch/test.sh 2025-08-14T21:32:55.0574323Z fi 2025-08-14T21:32:55.0574478Z  2025-08-14T21:32:55.0574671Z # Leaving 1GB for the runner and other things 2025-08-14T21:32:55.0575052Z TOTAL_AVAILABLE_MEMORY_IN_GB=$(awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo) 2025-08-14T21:32:55.0575659Z # https://docs.docker.com/engine/containers/resource_constraints/#--memory-swap-details, the 3GB swap 2025-08-14T21:32:55.0576129Z # comes from https://github.com/pytorch/test-infra/pull/6058 2025-08-14T21:32:55.0576467Z TOTAL_MEMORY_WITH_SWAP=$(("${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}" + 3)) 2025-08-14T21:32:55.0576735Z  2025-08-14T21:32:55.0576929Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-08-14T21:32:55.0577155Z  SHM_OPTS= 2025-08-14T21:32:55.0577338Z  JENKINS_USER= 2025-08-14T21:32:55.0577581Z  # ensure that docker container cleanly exits in 12 hours 2025-08-14T21:32:55.0577887Z  # if for some reason cleanup action doesn't stop container 2025-08-14T21:32:55.0578152Z  # when job is cancelled 2025-08-14T21:32:55.0578375Z  DOCKER_SHELL_CMD="sleep 12h" 2025-08-14T21:32:55.0578581Z else 2025-08-14T21:32:55.0578763Z  SHM_OPTS="--shm-size=${SHM_SIZE}" 2025-08-14T21:32:55.0579000Z  JENKINS_USER="--user jenkins" 2025-08-14T21:32:55.0579224Z  DOCKER_SHELL_CMD= 2025-08-14T21:32:55.0579409Z fi 2025-08-14T21:32:55.0579565Z  2025-08-14T21:32:55.0579808Z # detached container should get cleaned up by teardown_ec2_linux 2025-08-14T21:32:55.0580153Z # TODO: Stop building test binaries as part of the build phase 2025-08-14T21:32:55.0580546Z # Used for GPU_FLAG, SHM_OPTS, JENKINS_USER and DOCKER_SHELL_CMD since that doesn't play nice 2025-08-14T21:32:55.0580892Z # shellcheck disable=SC2086,SC2090 2025-08-14T21:32:55.0581129Z container_name=$(docker run \ 2025-08-14T21:32:55.0581355Z  ${GPU_FLAG:-} \ 2025-08-14T21:32:55.0581572Z  ${SCCACHE_SERVER_PORT_DOCKER_FLAG:-} \ 2025-08-14T21:32:55.0581809Z  -e BUILD_ENVIRONMENT \ 2025-08-14T21:32:55.0582011Z  -e PR_NUMBER \ 2025-08-14T21:32:55.0582208Z  -e GITHUB_ACTIONS \ 2025-08-14T21:32:55.0582416Z  -e GITHUB_REPOSITORY \ 2025-08-14T21:32:55.0582635Z  -e 
GITHUB_WORKFLOW \ 2025-08-14T21:32:55.0582836Z  -e GITHUB_JOB \ 2025-08-14T21:32:55.0583030Z  -e GITHUB_RUN_ID \ 2025-08-14T21:32:55.0583230Z  -e GITHUB_RUN_NUMBER \ 2025-08-14T21:32:55.0583431Z  -e GITHUB_RUN_ATTEMPT \ 2025-08-14T21:32:55.0583642Z  -e JOB_ID \ 2025-08-14T21:32:55.0583830Z  -e JOB_NAME \ 2025-08-14T21:32:55.0584019Z  -e BASE_SHA \ 2025-08-14T21:32:55.0584199Z  -e BRANCH \ 2025-08-14T21:32:55.0584374Z  -e SHA1 \ 2025-08-14T21:32:55.0584551Z  -e AWS_DEFAULT_REGION \ 2025-08-14T21:32:55.0584754Z  -e IN_WHEEL_TEST \ 2025-08-14T21:32:55.0584944Z  -e SHARD_NUMBER \ 2025-08-14T21:32:55.0585135Z  -e TEST_CONFIG \ 2025-08-14T21:32:55.0585320Z  -e NUM_TEST_SHARDS \ 2025-08-14T21:32:55.0585525Z  -e REENABLED_ISSUES \ 2025-08-14T21:32:55.0585866Z  -e CONTINUE_THROUGH_ERROR \ 2025-08-14T21:32:55.0586149Z  -e VERBOSE_TEST_LOGS \ 2025-08-14T21:32:55.0586360Z  -e TEST_SHOWLOCALS \ 2025-08-14T21:32:55.0586561Z  -e NO_TEST_TIMEOUT \ 2025-08-14T21:32:55.0586746Z  -e NO_TD \ 2025-08-14T21:32:55.0586935Z  -e TD_DISTRIBUTED \ 2025-08-14T21:32:55.0587142Z  -e PR_LABELS \ 2025-08-14T21:32:55.0587361Z  -e MAX_JOBS="$(nproc --ignore=2)" \ 2025-08-14T21:32:55.0587596Z  -e SCCACHE_BUCKET \ 2025-08-14T21:32:55.0587799Z  -e SCCACHE_REGION \ 2025-08-14T21:32:55.0587998Z  -e XLA_CUDA \ 2025-08-14T21:32:55.0588205Z  -e XLA_CLANG_CACHE_S3_BUCKET_NAME \ 2025-08-14T21:32:55.0588463Z  -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK \ 2025-08-14T21:32:55.0588722Z  -e PYTORCH_TEST_RERUN_DISABLED_TESTS \ 2025-08-14T21:32:55.0588971Z  -e SKIP_SCCACHE_INITIALIZATION=1 \ 2025-08-14T21:32:55.0589216Z  -e HUGGING_FACE_HUB_TOKEN \ 2025-08-14T21:32:55.0589454Z  -e SCRIBE_GRAPHQL_ACCESS_TOKEN \ 2025-08-14T21:32:55.0589679Z  -e DASHBOARD_TAG \ 2025-08-14T21:32:55.0589883Z  -e ARTIFACTS_FILE_SUFFIX \ 2025-08-14T21:32:55.0590136Z  --memory="${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}g" \ 2025-08-14T21:32:55.0590428Z  --memory-swap="${TOTAL_MEMORY_WITH_SWAP}g" \ 2025-08-14T21:32:55.0590699Z  --env-file="/tmp/github_env_${GITHUB_RUN_ID}" \ 2025-08-14T21:32:55.0590981Z  --security-opt seccomp=unconfined \ 2025-08-14T21:32:55.0591220Z  --cap-add=SYS_PTRACE \ 2025-08-14T21:32:55.0591433Z  --ipc=host \ 2025-08-14T21:32:55.0591615Z  ${SHM_OPTS} \ 2025-08-14T21:32:55.0591802Z  --tty \ 2025-08-14T21:32:55.0591979Z  --detach \ 2025-08-14T21:32:55.0592168Z  --name="${container_name}" \ 2025-08-14T21:32:55.0592389Z  ${JENKINS_USER} \ 2025-08-14T21:32:55.0592641Z  -v "${GITHUB_WORKSPACE}:/var/lib/jenkins/workspace" \ 2025-08-14T21:32:55.0592909Z  -w /var/lib/jenkins/workspace \ 2025-08-14T21:32:55.0593134Z  "${DOCKER_IMAGE}" \ 2025-08-14T21:32:55.0593338Z  ${DOCKER_SHELL_CMD} 2025-08-14T21:32:55.0593520Z ) 2025-08-14T21:32:55.0593738Z # Propagate download.pytorch.org IP to container 2025-08-14T21:32:55.0594173Z grep download.pytorch.org /etc/hosts | docker exec -i "${container_name}" sudo bash -c "/bin/cat >> /etc/hosts" 2025-08-14T21:32:55.0594627Z echo "DOCKER_CONTAINER_ID=${container_name}" >> "${GITHUB_ENV}" 2025-08-14T21:32:55.0594893Z  2025-08-14T21:32:55.0595089Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-08-14T21:32:55.0595471Z  docker exec -t "${container_name}" sh -c "python3 -m pip install -r .ci/docker/requirements-ci.txt" 2025-08-14T21:32:55.0595805Z fi 2025-08-14T21:32:55.0595957Z  2025-08-14T21:32:55.0596288Z docker exec -t "${container_name}" sh -c "python3 -m pip install $(echo dist/*.whl)[opt-einsum] && ${TEST_COMMAND}" 2025-08-14T21:32:55.0601208Z shell: /usr/bin/bash -e {0} 2025-08-14T21:32:55.0601408Z env: 
2025-08-14T21:32:55.0601578Z GIT_DEFAULT_BRANCH: main 2025-08-14T21:32:55.0601822Z BUILD_ENVIRONMENT: linux-jammy-py3.9-gcc11-build 2025-08-14T21:32:55.0602064Z PR_NUMBER: 2025-08-14T21:32:55.0602251Z GITHUB_REPOSITORY: pytorch/pytorch 2025-08-14T21:32:55.0602488Z GITHUB_WORKFLOW: inductor-periodic 2025-08-14T21:32:55.0602932Z GITHUB_JOB: test 2025-08-14T21:32:55.0603117Z GITHUB_RUN_ID: 16976338999 2025-08-14T21:32:55.0603320Z GITHUB_RUN_NUMBER: 66307 2025-08-14T21:32:55.0603515Z GITHUB_RUN_ATTEMPT: 1 2025-08-14T21:32:55.0603694Z JOB_ID: 48128261038 2025-08-14T21:32:55.0604089Z JOB_NAME: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:32:55.0604490Z BRANCH: main 2025-08-14T21:32:55.0604781Z SHA1: 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:32:55.0605149Z BASE_SHA: 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:32:55.0605427Z TEST_CONFIG: cpu_inductor_amp_freezing_huggingface 2025-08-14T21:32:55.0605667Z SHARD_NUMBER: 1 2025-08-14T21:32:55.0605835Z NUM_TEST_SHARDS: 1 2025-08-14T21:32:55.0606009Z REENABLED_ISSUES: 2025-08-14T21:32:55.0606200Z CONTINUE_THROUGH_ERROR: True 2025-08-14T21:32:55.0606397Z VERBOSE_TEST_LOGS: False 2025-08-14T21:32:55.0606593Z TEST_SHOWLOCALS: False 2025-08-14T21:32:55.0606788Z NO_TEST_TIMEOUT: False 2025-08-14T21:32:55.0606963Z NO_TD: False 2025-08-14T21:32:55.0607130Z TD_DISTRIBUTED: False 2025-08-14T21:32:55.0607356Z SCCACHE_BUCKET: ossci-compiler-cache-circleci-v2 2025-08-14T21:32:55.0607606Z SCCACHE_REGION: us-east-1 2025-08-14T21:32:55.0607798Z SHM_SIZE: 1g 2025-08-14T21:32:55.0608356Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:32:55.0609021Z XLA_CUDA: 2025-08-14T21:32:55.0609281Z XLA_CLANG_CACHE_S3_BUCKET_NAME: ossci-compiler-clang-cache-circleci-xla 2025-08-14T21:32:55.0609592Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK: 0 2025-08-14T21:32:55.0609826Z PYTORCH_TEST_RERUN_DISABLED_TESTS: 0 2025-08-14T21:32:55.0610033Z DASHBOARD_TAG: 2025-08-14T21:32:55.0610407Z HUGGING_FACE_HUB_TOKEN: *** 2025-08-14T21:32:55.0610708Z SCRIBE_GRAPHQL_ACCESS_TOKEN: *** 2025-08-14T21:32:55.0611092Z ARTIFACTS_FILE_SUFFIX: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T21:32:55.0611446Z ##[endgroup] 2025-08-14T21:32:55.0633633Z + [[ cpu_inductor_amp_freezing_huggingface == \m\u\l\t\i\g\p\u ]] 2025-08-14T21:32:55.0633999Z + [[ linux-jammy-py3.9-gcc11-build == *onnx* ]] 2025-08-14T21:32:55.0634285Z + TEST_COMMAND=.ci/pytorch/test.sh 2025-08-14T21:32:55.0637057Z ++ awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo 2025-08-14T21:32:55.0657598Z + TOTAL_AVAILABLE_MEMORY_IN_GB='122.780 ' 2025-08-14T21:32:55.0658004Z + TOTAL_MEMORY_WITH_SWAP=125 2025-08-14T21:32:55.0658279Z + [[ linux-jammy-py3.9-gcc11-build == *\s\3\9\0\x* ]] 2025-08-14T21:32:55.0658537Z + SHM_OPTS=--shm-size=1g 2025-08-14T21:32:55.0658748Z + JENKINS_USER='--user jenkins' 2025-08-14T21:32:55.0658957Z + DOCKER_SHELL_CMD= 2025-08-14T21:32:55.0666781Z +++ nproc --ignore=2 2025-08-14T21:32:55.0698580Z ++ docker run -e BUILD_ENVIRONMENT -e PR_NUMBER -e GITHUB_ACTIONS -e GITHUB_REPOSITORY -e GITHUB_WORKFLOW -e GITHUB_JOB -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_RUN_ATTEMPT -e JOB_ID -e JOB_NAME -e BASE_SHA -e BRANCH -e SHA1 -e AWS_DEFAULT_REGION -e IN_WHEEL_TEST -e SHARD_NUMBER -e TEST_CONFIG -e NUM_TEST_SHARDS -e REENABLED_ISSUES -e 
CONTINUE_THROUGH_ERROR -e VERBOSE_TEST_LOGS -e TEST_SHOWLOCALS -e NO_TEST_TIMEOUT -e NO_TD -e TD_DISTRIBUTED -e PR_LABELS -e MAX_JOBS=30 -e SCCACHE_BUCKET -e SCCACHE_REGION -e XLA_CUDA -e XLA_CLANG_CACHE_S3_BUCKET_NAME -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK -e PYTORCH_TEST_RERUN_DISABLED_TESTS -e SKIP_SCCACHE_INITIALIZATION=1 -e HUGGING_FACE_HUB_TOKEN -e SCRIBE_GRAPHQL_ACCESS_TOKEN -e DASHBOARD_TAG -e ARTIFACTS_FILE_SUFFIX --memory=122g --memory-swap=125g --env-file=/tmp/github_env_16976338999 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --ipc=host --shm-size=1g --tty --detach --name= --user jenkins -v /home/ec2-user/actions-runner/_work/pytorch/pytorch:/var/lib/jenkins/workspace -w /var/lib/jenkins/workspace 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T21:33:05.3442400Z + container_name=ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T21:33:05.3445793Z + grep download.pytorch.org /etc/hosts 2025-08-14T21:33:05.3446997Z + docker exec -i ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 sudo bash -c '/bin/cat >> /etc/hosts' 2025-08-14T21:33:05.4583066Z + echo DOCKER_CONTAINER_ID=ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T21:33:05.4583943Z + [[ linux-jammy-py3.9-gcc11-build == *\s\3\9\0\x* ]] 2025-08-14T21:33:05.4586726Z ++ echo dist/torch-2.9.0a0+git1fc683c-cp39-cp39-linux_x86_64.whl 2025-08-14T21:33:05.4588931Z + docker exec -t ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 sh -c 'python3 -m pip install dist/torch-2.9.0a0+git1fc683c-cp39-cp39-linux_x86_64.whl[opt-einsum] && .ci/pytorch/test.sh' 2025-08-14T21:33:05.8265558Z Processing ./dist/torch-2.9.0a0+git1fc683c-cp39-cp39-linux_x86_64.whl (from torch==2.9.0a0+git1fc683c) 2025-08-14T21:33:06.0561457Z Requirement already satisfied: filelock in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (3.18.0) 2025-08-14T21:33:06.0562411Z Requirement already satisfied: typing-extensions>=4.10.0 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (4.14.1) 2025-08-14T21:33:06.0563334Z Requirement already satisfied: sympy>=1.13.3 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (1.13.3) 2025-08-14T21:33:06.0564821Z Requirement already satisfied: networkx>=2.5.1 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (2.8.8) 2025-08-14T21:33:06.0569211Z Requirement already satisfied: jinja2 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (3.1.6) 2025-08-14T21:33:06.0574359Z Requirement already satisfied: fsspec>=0.8.5 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (2025.3.0) 2025-08-14T21:33:06.0587695Z Requirement already satisfied: opt-einsum>=3.3 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (3.3.0) 2025-08-14T21:33:06.0906421Z Requirement already satisfied: numpy>=1.7 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from opt-einsum>=3.3->torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (1.22.4) 2025-08-14T21:33:06.0923227Z Requirement already satisfied: mpmath<1.4,>=1.1.0 in 
/opt/conda/envs/py_3.9/lib/python3.9/site-packages (from sympy>=1.13.3->torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (1.3.0) 2025-08-14T21:33:06.0989586Z Requirement already satisfied: MarkupSafe>=2.0 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from jinja2->torch==2.9.0a0+git1fc683c->torch==2.9.0a0+git1fc683c) (3.0.2) 2025-08-14T21:33:06.9108528Z Installing collected packages: torch 2025-08-14T21:33:14.6795896Z ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 2025-08-14T21:33:14.6796631Z dall-e 0.1 requires torchvision, which is not installed. 2025-08-14T21:33:14.6796963Z effdet 0.4.1 requires torchvision, which is not installed. 2025-08-14T21:33:14.6797430Z pytorch-labs-segment-anything-fast 0.2 requires torchao, which is not installed. 2025-08-14T21:33:14.6798006Z pytorch-labs-segment-anything-fast 0.2 requires torchvision>=0.17.0.dev20231026, which is not installed. 2025-08-14T21:33:14.6798613Z timm 1.0.14 requires torchvision, which is not installed. 2025-08-14T21:33:14.6799027Z Successfully installed torch-2.9.0a0+git1fc683c 2025-08-14T21:33:14.7862470Z + export TERM=vt100 2025-08-14T21:33:14.7862705Z + TERM=vt100 2025-08-14T21:33:14.7862891Z ++ dirname .ci/pytorch/test.sh 2025-08-14T21:33:14.7869809Z + source .ci/pytorch/common.sh 2025-08-14T21:33:14.7871415Z +++ dirname .ci/pytorch/common.sh 2025-08-14T21:33:14.7877083Z ++ source .ci/pytorch/common_utils.sh 2025-08-14T21:33:14.7877362Z +++ declare -f -t trap_add 2025-08-14T21:33:14.7881489Z ++ set -ex -o pipefail 2025-08-14T21:33:14.7881804Z ++ [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-08-14T21:33:14.7882072Z ++ BUILD_TEST_LIBTORCH=0 2025-08-14T21:33:14.7884423Z ++ dirname .ci/pytorch/test.sh 2025-08-14T21:33:14.7897060Z + source .ci/pytorch/common-build.sh 2025-08-14T21:33:14.7897546Z ++ [[ linux-jammy-py3.9-gcc11-build != *win-* ]] 2025-08-14T21:33:14.7903998Z ++++ dirname .ci/pytorch/common-build.sh 2025-08-14T21:33:14.7922143Z +++ cd .ci/pytorch 2025-08-14T21:33:14.7922375Z +++ pwd -P 2025-08-14T21:33:14.7922898Z ++ script_dir=/var/lib/jenkins/workspace/.ci/pytorch 2025-08-14T21:33:14.7923252Z ++ [[ linux-jammy-py3.9-gcc11-build == *-pch* ]] 2025-08-14T21:33:14.7923486Z ++ which sccache 2025-08-14T21:33:14.7950512Z ++ [[ -z ossci-compiler-cache-circleci-v2 ]] 2025-08-14T21:33:14.7950951Z ++ sccache --stop-server 2025-08-14T21:33:14.7967433Z ++ true 2025-08-14T21:33:14.7967669Z ++ rm -f /var/lib/jenkins/sccache_error.log 2025-08-14T21:33:14.7975151Z ++ trap_add sccache_epilogue EXIT 2025-08-14T21:33:14.7975406Z ++ trap_add_cmd=sccache_epilogue 2025-08-14T21:33:14.7975604Z ++ shift 2025-08-14T21:33:14.7975784Z ++ for trap_add_name in "$@" 2025-08-14T21:33:14.7979296Z ++++ trap -p EXIT 2025-08-14T21:33:14.7981881Z +++ eval 'extract_trap_cmd ' 2025-08-14T21:33:14.7982291Z ++++ extract_trap_cmd 2025-08-14T21:33:14.7982633Z ++++ printf '%s\n' '' 2025-08-14T21:33:14.7982854Z +++ printf '%s\n' sccache_epilogue 2025-08-14T21:33:14.7985083Z ++ trap -- ' 2025-08-14T21:33:14.7985764Z sccache_epilogue' EXIT 2025-08-14T21:33:14.7986173Z ++ [[ -n 1 ]] 2025-08-14T21:33:14.7986975Z ++ echo 'Skipping sccache server initialization, setting environment variables' 2025-08-14T21:33:14.7987454Z Skipping sccache server initialization, setting environment variables 2025-08-14T21:33:14.7987775Z ++ export SCCACHE_IDLE_TIMEOUT=0 2025-08-14T21:33:14.7988002Z ++ SCCACHE_IDLE_TIMEOUT=0 
2025-08-14T21:33:14.7988268Z ++ export SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-08-14T21:33:14.7988591Z ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-08-14T21:33:14.7988929Z ++ export RUST_LOG=sccache::server=error 2025-08-14T21:33:14.7989164Z ++ RUST_LOG=sccache::server=error 2025-08-14T21:33:14.7989409Z ++ sccache --zero-stats 2025-08-14T21:33:14.9500005Z Statistics zeroed. 2025-08-14T21:33:14.9503421Z ++ which ccache 2025-08-14T21:33:14.9530679Z + [[ linux-jammy-py3.9-gcc11-build != *rocm* ]] 2025-08-14T21:33:14.9531016Z + [[ linux-jammy-py3.9-gcc11-build != *s390x* ]] 2025-08-14T21:33:14.9531283Z + [[ -d /var/lib/jenkins/workspace ]] 2025-08-14T21:33:14.9535522Z ++ stat -c %u /var/lib/jenkins/workspace 2025-08-14T21:33:14.9545331Z + WORKSPACE_ORIGINAL_OWNER_ID=1000 2025-08-14T21:33:14.9545597Z + trap_add cleanup_workspace EXIT 2025-08-14T21:33:14.9545832Z + trap_add_cmd=cleanup_workspace 2025-08-14T21:33:14.9546039Z + shift 2025-08-14T21:33:14.9546208Z + for trap_add_name in "$@" 2025-08-14T21:33:14.9555062Z +++ trap -p EXIT 2025-08-14T21:33:14.9559064Z ++ eval 'extract_trap_cmd trap -- '\'' 2025-08-14T21:33:14.9559352Z sccache_epilogue'\'' EXIT' 2025-08-14T21:33:14.9559573Z +++ extract_trap_cmd trap -- ' 2025-08-14T21:33:14.9559786Z sccache_epilogue' EXIT 2025-08-14T21:33:14.9559996Z +++ printf '%s\n' ' 2025-08-14T21:33:14.9560178Z sccache_epilogue' 2025-08-14T21:33:14.9560390Z ++ printf '%s\n' cleanup_workspace 2025-08-14T21:33:14.9560944Z + trap -- ' 2025-08-14T21:33:14.9561118Z sccache_epilogue 2025-08-14T21:33:14.9561307Z cleanup_workspace' EXIT 2025-08-14T21:33:14.9561538Z + sudo chown -R jenkins /var/lib/jenkins/workspace 2025-08-14T21:33:15.3940416Z + git config --global --add safe.directory /var/lib/jenkins/workspace 2025-08-14T21:33:15.3960150Z + echo 'Environment variables:' 2025-08-14T21:33:15.3960706Z Environment variables: 2025-08-14T21:33:15.3960913Z + env 2025-08-14T21:33:15.3965731Z GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-08-14T21:33:15.3966093Z CONTINUE_THROUGH_ERROR=True 2025-08-14T21:33:15.3966782Z BUILD_ENVIRONMENT=linux-jammy-py3.9-gcc11-build 2025-08-14T21:33:15.3967032Z HOSTNAME=ca0b9dd31303 2025-08-14T21:33:15.3967445Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.3968218Z GITHUB_ACTION=__run_2 2025-08-14T21:33:15.3968537Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-08-14T21:33:15.3969002Z GITHUB_RUN_NUMBER=66307 2025-08-14T21:33:15.3969231Z TEST_CONFIG=cpu_inductor_amp_freezing_huggingface 2025-08-14T21:33:15.3969487Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-08-14T21:33:15.3969721Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-08-14T21:33:15.3969950Z SCCACHE_IDLE_TIMEOUT=0 2025-08-14T21:33:15.3970390Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-08-14T21:33:15.3970617Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-08-14T21:33:15.3970834Z GITHUB_REF_TYPE=branch 2025-08-14T21:33:15.3971028Z TORCH_CUDA_ARCH_LIST=Maxwell 2025-08-14T21:33:15.3971259Z BASE_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.3971521Z XLA_CUDA= 2025-08-14T21:33:15.3971693Z NCCL_LIB_DIR=/usr/local/cuda/lib64/ 2025-08-14T21:33:15.3972011Z HUGGING_FACE_HUB_TOKEN=*** 2025-08-14T21:33:15.3972497Z *** 2025-08-14T21:33:15.3972673Z GITHUB_REPOSITORY_ID=65600975 2025-08-14T21:33:15.3972874Z GITHUB_ACTIONS=true 2025-08-14T21:33:15.3973099Z SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-08-14T21:33:15.3973367Z 
SHA1=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.3973613Z GITHUB_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.3974002Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor-periodic.yml@refs/heads/main 2025-08-14T21:33:15.3974356Z UCC_HOME=/usr 2025-08-14T21:33:15.3974518Z VERBOSE_TEST_LOGS=False 2025-08-14T21:33:15.3974708Z GITHUB_REF=refs/heads/main 2025-08-14T21:33:15.3974896Z SHARD_NUMBER=1 2025-08-14T21:33:15.3975064Z GITHUB_REF_PROTECTED=true 2025-08-14T21:33:15.3975255Z HOME=/var/lib/jenkins 2025-08-14T21:33:15.3975459Z GITHUB_API_URL=https://api.github.com 2025-08-14T21:33:15.3975692Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-08-14T21:33:15.3975889Z UCX_COMMIT= 2025-08-14T21:33:15.3976049Z USE_SYSTEM_NCCL=1 2025-08-14T21:33:15.3976216Z NUM_TEST_SHARDS=1 2025-08-14T21:33:15.3976372Z UCX_HOME=/usr 2025-08-14T21:33:15.3976762Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.3977374Z JOB_NAME=linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:33:15.3977950Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.3978462Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-08-14T21:33:15.3978783Z GITHUB_EVENT_NAME=schedule 2025-08-14T21:33:15.3978973Z DASHBOARD_TAG= 2025-08-14T21:33:15.3979139Z GITHUB_RUN_ID=16976338999 2025-08-14T21:33:15.3979330Z INSTALLED_OPENBLAS= 2025-08-14T21:33:15.3979732Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.3980163Z GITHUB_ACTOR=pytorchmergebot 2025-08-14T21:33:15.3980363Z PR_NUMBER= 2025-08-14T21:33:15.3980525Z DESIRED_CUDA= 2025-08-14T21:33:15.3980688Z GITHUB_RUN_ATTEMPT=1 2025-08-14T21:33:15.3980879Z ANACONDA_PYTHON_VERSION=3.9 2025-08-14T21:33:15.3981116Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-08-14T21:33:15.3981348Z TERM=vt100 2025-08-14T21:33:15.3981510Z INSTALLED_VISION=yes 2025-08-14T21:33:15.3981685Z BRANCH=main 2025-08-14T21:33:15.3981845Z SCCACHE_REGION=us-east-1 2025-08-14T21:33:15.3982046Z OPENSSL_ROOT_DIR=/opt/openssl 2025-08-14T21:33:15.3982248Z CUDA_PATH=/usr/local/cuda 2025-08-14T21:33:15.3982592Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-08-14T21:33:15.3982971Z GITHUB_SERVER_URL=https://github.com 2025-08-14T21:33:15.3983187Z UCC_COMMIT= 2025-08-14T21:33:15.3983369Z REENABLED_ISSUES= 2025-08-14T21:33:15.3983530Z DOCS=yes 2025-08-14T21:33:15.3983682Z SHLVL=1 2025-08-14T21:33:15.3983832Z MAX_JOBS=30 2025-08-14T21:33:15.3983983Z GITHUB_ACTOR_ID=97764156 2025-08-14T21:33:15.3984305Z GITHUB_WORKFLOW_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.3984628Z GITHUB_REF_NAME=main 2025-08-14T21:33:15.3984895Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-08-14T21:33:15.3985193Z GITHUB_JOB=test 2025-08-14T21:33:15.3985368Z NO_TEST_TIMEOUT=False 2025-08-14T21:33:15.3985542Z TD_DISTRIBUTED=False 2025-08-14T21:33:15.3985866Z GITHUB_REPOSITORY=pytorch/pytorch 2025-08-14T21:33:15.3986085Z GITHUB_RETENTION_DAYS=90 2025-08-14T21:33:15.3986268Z OPENSSL_DIR=/opt/openssl 2025-08-14T21:33:15.3986462Z GITHUB_ACTION_REPOSITORY= 2025-08-14T21:33:15.3986961Z 
PATH=/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-08-14T21:33:15.3987458Z GITHUB_BASE_REF= 2025-08-14T21:33:15.3987627Z INSTALLED_ACL= 2025-08-14T21:33:15.3987956Z ARTIFACTS_FILE_SUFFIX=test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T21:33:15.3988317Z CI=true 2025-08-14T21:33:15.3988479Z GITHUB_REPOSITORY_OWNER=pytorch 2025-08-14T21:33:15.3988743Z RUST_LOG=sccache::server=error 2025-08-14T21:33:15.3988941Z JOB_ID=48128261038 2025-08-14T21:33:15.3989099Z GITHUB_HEAD_REF= 2025-08-14T21:33:15.3989270Z GITHUB_ACTION_REF= 2025-08-14T21:33:15.3989482Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-08-14T21:33:15.3989718Z TEST_SHOWLOCALS=False 2025-08-14T21:33:15.3989913Z GITHUB_WORKFLOW=inductor-periodic 2025-08-14T21:33:15.3990129Z DEBIAN_FRONTEND=noninteractive 2025-08-14T21:33:15.3990532Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.3990926Z NO_TD=False 2025-08-14T21:33:15.3991099Z SKIP_SCCACHE_INITIALIZATION=1 2025-08-14T21:33:15.3991318Z NCCL_INCLUDE_DIR=/usr/local/cuda/include/ 2025-08-14T21:33:15.3991527Z _=/usr/bin/env 2025-08-14T21:33:15.3991770Z ++ python -c 'import site; print(site.getsitepackages()[0])' 2025-08-14T21:33:15.4249997Z + TORCH_INSTALL_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch 2025-08-14T21:33:15.4250560Z + TORCH_BIN_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/bin 2025-08-14T21:33:15.4250973Z + TORCH_LIB_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/lib 2025-08-14T21:33:15.4251353Z + TORCH_TEST_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/test 2025-08-14T21:33:15.4251649Z + BUILD_DIR=build 2025-08-14T21:33:15.4251836Z + BUILD_RENAMED_DIR=build_renamed 2025-08-14T21:33:15.4252050Z + BUILD_BIN_DIR=build/bin 2025-08-14T21:33:15.4252235Z + SHARD_NUMBER=1 2025-08-14T21:33:15.4252397Z + NUM_TEST_SHARDS=1 2025-08-14T21:33:15.4252591Z + export TORCH_SERIALIZATION_DEBUG=1 2025-08-14T21:33:15.4252813Z + TORCH_SERIALIZATION_DEBUG=1 2025-08-14T21:33:15.4253008Z + export VALGRIND=ON 2025-08-14T21:33:15.4253179Z + VALGRIND=ON 2025-08-14T21:33:15.4253384Z + [[ linux-jammy-py3.9-gcc11-build == *clang9* ]] 2025-08-14T21:33:15.4253652Z + [[ linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-08-14T21:33:15.4253914Z + [[ linux-jammy-py3.9-gcc11-build == *s390x* ]] 2025-08-14T21:33:15.4254133Z + [[ 0 == \1 ]] 2025-08-14T21:33:15.4254294Z + [[ True == \1 ]] 2025-08-14T21:33:15.4254474Z + [[ linux-jammy-py3.9-gcc11-build != *bazel* ]] 2025-08-14T21:33:15.4258699Z ++ realpath build/custom_test_artifacts 2025-08-14T21:33:15.4266083Z + CUSTOM_TEST_ARTIFACT_BUILD_DIR=/var/lib/jenkins/workspace/build/custom_test_artifacts 2025-08-14T21:33:15.4266435Z + [[ -n '' ]] 2025-08-14T21:33:15.4266621Z + echo 'Environment variables' 2025-08-14T21:33:15.4266828Z Environment variables 2025-08-14T21:33:15.4267004Z + env 2025-08-14T21:33:15.4287201Z GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-08-14T21:33:15.4287636Z CONTINUE_THROUGH_ERROR=True 2025-08-14T21:33:15.4287895Z BUILD_ENVIRONMENT=linux-jammy-py3.9-gcc11-build 2025-08-14T21:33:15.4288156Z HOSTNAME=ca0b9dd31303 2025-08-14T21:33:15.4288566Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.4289519Z 
GITHUB_ACTION=__run_2 2025-08-14T21:33:15.4289846Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-08-14T21:33:15.4290070Z GITHUB_RUN_NUMBER=66307 2025-08-14T21:33:15.4290296Z TEST_CONFIG=cpu_inductor_amp_freezing_huggingface 2025-08-14T21:33:15.4290553Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-08-14T21:33:15.4290799Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-08-14T21:33:15.4291019Z SCCACHE_IDLE_TIMEOUT=0 2025-08-14T21:33:15.4291457Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-08-14T21:33:15.4291688Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-08-14T21:33:15.4291907Z GITHUB_REF_TYPE=branch 2025-08-14T21:33:15.4292101Z TORCH_CUDA_ARCH_LIST=Maxwell 2025-08-14T21:33:15.4292332Z BASE_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.4292557Z XLA_CUDA= 2025-08-14T21:33:15.4292733Z NCCL_LIB_DIR=/usr/local/cuda/lib64/ 2025-08-14T21:33:15.4293016Z HUGGING_FACE_HUB_TOKEN=*** 2025-08-14T21:33:15.4293268Z *** 2025-08-14T21:33:15.4293428Z GITHUB_REPOSITORY_ID=65600975 2025-08-14T21:33:15.4293636Z GITHUB_ACTIONS=true 2025-08-14T21:33:15.4293859Z SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-08-14T21:33:15.4294116Z SHA1=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.4294367Z GITHUB_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.4294753Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor-periodic.yml@refs/heads/main 2025-08-14T21:33:15.4295105Z UCC_HOME=/usr 2025-08-14T21:33:15.4295273Z TORCH_SERIALIZATION_DEBUG=1 2025-08-14T21:33:15.4295470Z VERBOSE_TEST_LOGS=False 2025-08-14T21:33:15.4295661Z GITHUB_REF=refs/heads/main 2025-08-14T21:33:15.4295844Z SHARD_NUMBER=1 2025-08-14T21:33:15.4296021Z GITHUB_REF_PROTECTED=true 2025-08-14T21:33:15.4296216Z HOME=/var/lib/jenkins 2025-08-14T21:33:15.4296419Z GITHUB_API_URL=https://api.github.com 2025-08-14T21:33:15.4296659Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-08-14T21:33:15.4296873Z UCX_COMMIT= 2025-08-14T21:33:15.4297027Z USE_SYSTEM_NCCL=1 2025-08-14T21:33:15.4297204Z NUM_TEST_SHARDS=1 2025-08-14T21:33:15.4297369Z UCX_HOME=/usr 2025-08-14T21:33:15.4297745Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.4298365Z JOB_NAME=linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T21:33:15.4298960Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.4299480Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-08-14T21:33:15.4299796Z GITHUB_EVENT_NAME=schedule 2025-08-14T21:33:15.4299987Z DASHBOARD_TAG= 2025-08-14T21:33:15.4300160Z GITHUB_RUN_ID=16976338999 2025-08-14T21:33:15.4300342Z INSTALLED_OPENBLAS= 2025-08-14T21:33:15.4300756Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.4301199Z GITHUB_ACTOR=pytorchmergebot 2025-08-14T21:33:15.4301397Z PR_NUMBER= 2025-08-14T21:33:15.4301554Z DESIRED_CUDA= 2025-08-14T21:33:15.4301722Z GITHUB_RUN_ATTEMPT=1 2025-08-14T21:33:15.4301899Z VALGRIND=ON 2025-08-14T21:33:15.4302062Z ANACONDA_PYTHON_VERSION=3.9 2025-08-14T21:33:15.4302299Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-08-14T21:33:15.4302538Z TERM=vt100 2025-08-14T21:33:15.4302901Z INSTALLED_VISION=yes 2025-08-14T21:33:15.4303083Z BRANCH=main 2025-08-14T21:33:15.4303253Z SCCACHE_REGION=us-east-1 
2025-08-14T21:33:15.4303449Z OPENSSL_ROOT_DIR=/opt/openssl 2025-08-14T21:33:15.4303655Z CUDA_PATH=/usr/local/cuda 2025-08-14T21:33:15.4304007Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-08-14T21:33:15.4304378Z GITHUB_SERVER_URL=https://github.com 2025-08-14T21:33:15.4304595Z UCC_COMMIT= 2025-08-14T21:33:15.4304757Z REENABLED_ISSUES= 2025-08-14T21:33:15.4304916Z DOCS=yes 2025-08-14T21:33:15.4305069Z SHLVL=1 2025-08-14T21:33:15.4305328Z MAX_JOBS=30 2025-08-14T21:33:15.4305489Z GITHUB_ACTOR_ID=97764156 2025-08-14T21:33:15.4305795Z GITHUB_WORKFLOW_SHA=1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T21:33:15.4306056Z GITHUB_REF_NAME=main 2025-08-14T21:33:15.4306328Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-08-14T21:33:15.4306622Z GITHUB_JOB=test 2025-08-14T21:33:15.4306794Z NO_TEST_TIMEOUT=False 2025-08-14T21:33:15.4306976Z TD_DISTRIBUTED=False 2025-08-14T21:33:15.4307161Z GITHUB_REPOSITORY=pytorch/pytorch 2025-08-14T21:33:15.4307377Z GITHUB_RETENTION_DAYS=90 2025-08-14T21:33:15.4307568Z OPENSSL_DIR=/opt/openssl 2025-08-14T21:33:15.4307752Z GITHUB_ACTION_REPOSITORY= 2025-08-14T21:33:15.4308250Z PATH=/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-08-14T21:33:15.4308740Z GITHUB_BASE_REF= 2025-08-14T21:33:15.4308927Z INSTALLED_ACL= 2025-08-14T21:33:15.4309257Z ARTIFACTS_FILE_SUFFIX=test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T21:33:15.4309611Z CI=true 2025-08-14T21:33:15.4309781Z GITHUB_REPOSITORY_OWNER=pytorch 2025-08-14T21:33:15.4310033Z RUST_LOG=sccache::server=error 2025-08-14T21:33:15.4310229Z JOB_ID=48128261038 2025-08-14T21:33:15.4310395Z GITHUB_HEAD_REF= 2025-08-14T21:33:15.4310559Z GITHUB_ACTION_REF= 2025-08-14T21:33:15.4310773Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-08-14T21:33:15.4311017Z TEST_SHOWLOCALS=False 2025-08-14T21:33:15.4311204Z GITHUB_WORKFLOW=inductor-periodic 2025-08-14T21:33:15.4311423Z DEBIAN_FRONTEND=noninteractive 2025-08-14T21:33:15.4311830Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_c538f098-72ee-4a23-ade5-2a38fab2cff5 2025-08-14T21:33:15.4312222Z NO_TD=False 2025-08-14T21:33:15.4312397Z SKIP_SCCACHE_INITIALIZATION=1 2025-08-14T21:33:15.4312619Z NCCL_INCLUDE_DIR=/usr/local/cuda/include/ 2025-08-14T21:33:15.4312836Z _=/usr/bin/env 2025-08-14T21:33:15.4313000Z + echo 'Testing pytorch' 2025-08-14T21:33:15.4313191Z Testing pytorch 2025-08-14T21:33:15.4313395Z + export LANG=C.UTF-8 2025-08-14T21:33:15.4313567Z + LANG=C.UTF-8 2025-08-14T21:33:15.4313747Z + PR_NUMBER= 2025-08-14T21:33:15.4313969Z + [[ cpu_inductor_amp_freezing_huggingface == \d\e\f\a\u\l\t ]] 2025-08-14T21:33:15.4314288Z + [[ cpu_inductor_amp_freezing_huggingface == \d\i\s\t\r\i\b\u\t\e\d ]] 2025-08-14T21:33:15.4314601Z + [[ cpu_inductor_amp_freezing_huggingface == \s\l\o\w ]] 2025-08-14T21:33:15.4314895Z + [[ linux-jammy-py3.9-gcc11-build == *slow-gradcheck* ]] 2025-08-14T21:33:15.4315166Z + [[ linux-jammy-py3.9-gcc11-build == *cuda* ]] 2025-08-14T21:33:15.4315420Z + [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-08-14T21:33:15.4315674Z + [[ linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-08-14T21:33:15.4315946Z + [[ cpu_inductor_amp_freezing_huggingface == *crossref* ]] 2025-08-14T21:33:15.4316207Z + [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-08-14T21:33:15.4316454Z + [[ 
linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-08-14T21:33:15.4316718Z + [[ linux-jammy-py3.9-gcc11-build != *-bazel-* ]] 2025-08-14T21:33:15.4316960Z + pip_install ninja==1.10.2 2025-08-14T21:33:15.4317222Z + pip_install_pkg='python3 -m pip install --progress-bar off' 2025-08-14T21:33:15.4317538Z + python3 -m pip install --progress-bar off ninja==1.10.2 2025-08-14T21:33:15.8209352Z Collecting ninja==1.10.2 2025-08-14T21:33:15.8330976Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl.metadata (5.0 kB) 2025-08-14T21:33:15.8451419Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl (108 kB) 2025-08-14T21:33:16.6572152Z Installing collected packages: ninja 2025-08-14T21:33:16.6576014Z Attempting uninstall: ninja 2025-08-14T21:33:16.6584746Z Found existing installation: ninja 1.11.1.3 2025-08-14T21:33:16.6598644Z Uninstalling ninja-1.11.1.3: 2025-08-14T21:33:16.6688436Z Successfully uninstalled ninja-1.11.1.3 2025-08-14T21:33:16.9163894Z Successfully installed ninja-1.10.2 2025-08-14T21:33:17.0228554Z + export PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-08-14T21:33:17.0229594Z + PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-08-14T21:33:17.0230223Z + [[ linux-jammy-py3.9-gcc11-build == *aarch64* ]] 2025-08-14T21:33:17.0230513Z + [[ linux-jammy-py3.9-gcc11-build == *asan* ]] 2025-08-14T21:33:17.0230796Z + [[ linux-jammy-py3.9-gcc11-build == *-debug* ]] 2025-08-14T21:33:17.0231073Z + [[ linux-jammy-py3.9-gcc11-build != *-bazel-* ]] 2025-08-14T21:33:17.0231442Z + echo 'We are not in debug mode: linux-jammy-py3.9-gcc11-build. Expect the assertion to pass' 2025-08-14T21:33:17.0231927Z We are not in debug mode: linux-jammy-py3.9-gcc11-build. 
Expect the assertion to pass 2025-08-14T21:33:17.0237393Z + cd test 2025-08-14T21:33:17.0237674Z + python -c 'import torch; torch._C._crash_if_debug_asserts_fail(424242)' 2025-08-14T21:33:18.3258435Z + [[ cpu_inductor_amp_freezing_huggingface == \n\o\g\p\u\_\N\O\_\A\V\X\2 ]] 2025-08-14T21:33:18.3258857Z + [[ cpu_inductor_amp_freezing_huggingface == \n\o\g\p\u\_\A\V\X\5\1\2 ]] 2025-08-14T21:33:18.3259237Z + [[ cpu_inductor_amp_freezing_huggingface == \l\e\g\a\c\y\_\n\v\i\d\i\a\_\d\r\i\v\e\r ]] 2025-08-14T21:33:18.3259595Z + DYNAMO_BENCHMARK_FLAGS=() 2025-08-14T21:33:18.3259905Z + [[ cpu_inductor_amp_freezing_huggingface == *pr_time_benchmarks* ]] 2025-08-14T21:33:18.3260315Z + [[ cpu_inductor_amp_freezing_huggingface == *dynamo_eager* ]] 2025-08-14T21:33:18.3260622Z + [[ cpu_inductor_amp_freezing_huggingface == *aot_eager* ]] 2025-08-14T21:33:18.3260915Z + [[ cpu_inductor_amp_freezing_huggingface == *aot_inductor* ]] 2025-08-14T21:33:18.3261245Z + [[ cpu_inductor_amp_freezing_huggingface == *max_autotune_inductor* ]] 2025-08-14T21:33:18.3261611Z + [[ cpu_inductor_amp_freezing_huggingface == *inductor* ]] 2025-08-14T21:33:18.3261951Z + [[ cpu_inductor_amp_freezing_huggingface != *perf* ]] 2025-08-14T21:33:18.3262238Z + DYNAMO_BENCHMARK_FLAGS+=(--inductor) 2025-08-14T21:33:18.3262506Z + [[ cpu_inductor_amp_freezing_huggingface == *dynamic* ]] 2025-08-14T21:33:18.3262787Z + [[ cpu_inductor_amp_freezing_huggingface == *cpu* ]] 2025-08-14T21:33:18.3263044Z + DYNAMO_BENCHMARK_FLAGS+=(--device cpu) 2025-08-14T21:33:18.3364437Z + [[ linux-jammy-py3.9-gcc11-build == *libtorch* ]] 2025-08-14T21:33:18.3364792Z + [[ linux-jammy-py3.9-gcc11-build == *-bazel-* ]] 2025-08-14T21:33:18.3368827Z + cd test 2025-08-14T21:33:18.3369544Z + python -c 'import torch; print(torch.__config__.show())' 2025-08-14T21:33:19.3642839Z PyTorch built with: 2025-08-14T21:33:19.3643118Z - GCC 11.4 2025-08-14T21:33:19.3643302Z - C++ Version: 201703 2025-08-14T21:33:19.3643699Z - Intel(R) oneAPI Math Kernel Library Version 2024.2-Product Build 20240605 for Intel(R) 64 architecture applications 2025-08-14T21:33:19.3644223Z - Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-08-14T21:33:19.3644532Z - OpenMP 201511 (a.k.a. 
OpenMP 4.5) 2025-08-14T21:33:19.3644781Z - LAPACK is enabled (usually provided by MKL) 2025-08-14T21:33:19.3645015Z - NNPACK is enabled 2025-08-14T21:33:19.3645219Z - CPU capability usage: AVX512 2025-08-14T21:33:19.3648557Z - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, COMMIT_SHA=1fc683cf17c8c673044538d10266c00f92987be2, CXX_COMPILER=/opt/cache/bin/c++, CXX_FLAGS= -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOCUPTI -DLIBKINETO_NOROCTRACER -DLIBKINETO_NOXPUPTI=ON -DUSE_FBGEMM -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -DC10_NODEPRECATED -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Werror=range-loop-construct -Werror=bool-operation -Wnarrowing -Wno-missing-field-initializers -Wno-unknown-pragmas -Wno-unused-parameter -Wno-strict-overflow -Wno-strict-aliasing -Wno-stringop-overflow -Wsuggest-override -Wno-psabi -Wno-error=old-style-cast -faligned-new -Werror -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, TORCH_VERSION=2.9.0, USE_CUDA=OFF, USE_CUDNN=OFF, USE_CUSPARSELT=OFF, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_GLOO=ON, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=OFF, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF, USE_ROCM_KERNEL_ASSERT=OFF, USE_XCCL=OFF, USE_XPU=OFF, 2025-08-14T21:33:19.3651943Z 2025-08-14T21:33:19.6210866Z + cd test 2025-08-14T21:33:19.6211246Z + python -c 'import torch; print(torch.__config__.parallel_info())' 2025-08-14T21:33:20.6810394Z ATen/Parallel: 2025-08-14T21:33:20.6820196Z at::get_num_threads() : 16 2025-08-14T21:33:20.6825538Z at::get_num_interop_threads() : 16 2025-08-14T21:33:20.6829762Z OpenMP 201511 (a.k.a. OpenMP 4.5) 2025-08-14T21:33:20.6831586Z omp_get_max_threads() : 16 2025-08-14T21:33:20.6832047Z Intel(R) oneAPI Math Kernel Library Version 2024.2-Product Build 20240605 for Intel(R) 64 architecture applications 2025-08-14T21:33:20.6832430Z mkl_get_max_threads() : 16 2025-08-14T21:33:20.6832717Z Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-08-14T21:33:20.6833158Z std::thread::hardware_concurrency() : 32 2025-08-14T21:33:20.6833390Z Environment variables: 2025-08-14T21:33:20.6833591Z OMP_NUM_THREADS : [not set] 2025-08-14T21:33:20.6833795Z MKL_NUM_THREADS : [not set] 2025-08-14T21:33:20.6833991Z ATen parallel backend: OpenMP 2025-08-14T21:33:20.6834130Z 2025-08-14T21:33:20.9422793Z + [[ cpu_inductor_amp_freezing_huggingface == *numpy_2* ]] 2025-08-14T21:33:20.9423159Z + [[ linux-jammy-py3.9-gcc11-build == *aarch64* ]] 2025-08-14T21:33:20.9423463Z + [[ cpu_inductor_amp_freezing_huggingface == *backward* ]] 2025-08-14T21:33:20.9423762Z + [[ cpu_inductor_amp_freezing_huggingface == *xla* ]] 2025-08-14T21:33:20.9424061Z + [[ cpu_inductor_amp_freezing_huggingface == *executorch* ]] 2025-08-14T21:33:20.9424437Z + [[ cpu_inductor_amp_freezing_huggingface == \j\i\t\_\l\e\g\a\c\y ]] 2025-08-14T21:33:20.9424794Z + [[ linux-jammy-py3.9-gcc11-build == *libtorch* ]] 2025-08-14T21:33:20.9425083Z + [[ cpu_inductor_amp_freezing_huggingface == distributed ]] 2025-08-14T21:33:20.9425401Z + [[ cpu_inductor_amp_freezing_huggingface == *operator_benchmark* ]] 2025-08-14T21:33:20.9425765Z + [[ cpu_inductor_amp_freezing_huggingface == *inductor_distributed* ]] 2025-08-14T21:33:20.9426104Z + [[ cpu_inductor_amp_freezing_huggingface == *inductor-halide* ]] 2025-08-14T21:33:20.9426468Z + [[ cpu_inductor_amp_freezing_huggingface == *inductor-triton-cpu* 
]] 2025-08-14T21:33:20.9426837Z + [[ cpu_inductor_amp_freezing_huggingface == *inductor-micro-benchmark* ]] 2025-08-14T21:33:20.9427182Z + [[ cpu_inductor_amp_freezing_huggingface == *huggingface* ]] 2025-08-14T21:33:20.9427465Z + install_torchvision 2025-08-14T21:33:20.9427658Z + local orig_preload 2025-08-14T21:33:20.9427841Z + local commit 2025-08-14T21:33:20.9428027Z ++ get_pinned_commit vision 2025-08-14T21:33:20.9428250Z ++ cat .github/ci_commit_pins/vision.txt 2025-08-14T21:33:20.9449845Z + commit=966da7e46f65d6d49df3e31214470a4fe5cc8e66 2025-08-14T21:33:20.9450256Z + orig_preload= 2025-08-14T21:33:20.9450536Z + '[' -n '' ']' 2025-08-14T21:33:20.9450772Z + [[ linux-jammy-py3.9-gcc11-build == *cuda* ]] 2025-08-14T21:33:20.9451388Z + pip_build_and_install git+https://github.com/pytorch/vision.git@966da7e46f65d6d49df3e31214470a4fe5cc8e66 dist/vision 2025-08-14T21:33:20.9452105Z + local build_target=git+https://github.com/pytorch/vision.git@966da7e46f65d6d49df3e31214470a4fe5cc8e66 2025-08-14T21:33:20.9452495Z + local wheel_dir=dist/vision 2025-08-14T21:33:20.9453134Z + local found_whl=0 2025-08-14T21:33:20.9453380Z + for file in "${wheel_dir}"/*.whl 2025-08-14T21:33:20.9453743Z + [[ -f dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl ]] 2025-08-14T21:33:20.9454065Z + found_whl=1 2025-08-14T21:33:20.9454237Z + break 2025-08-14T21:33:20.9454735Z + '[' 1 == 0 ']' 2025-08-14T21:33:20.9455055Z + for file in "${wheel_dir}"/*.whl 2025-08-14T21:33:20.9455400Z + pip_install_whl dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-08-14T21:33:20.9455834Z + args=('dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl') 2025-08-14T21:33:20.9456146Z + local args 2025-08-14T21:33:20.9458101Z + [[ dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl == *\ * ]] 2025-08-14T21:33:20.9458516Z + for path in "${args[@]}" 2025-08-14T21:33:20.9458857Z + echo 'Installing dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl' 2025-08-14T21:33:20.9459370Z Installing dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-08-14T21:33:20.9459888Z + python3 -mpip install --no-index --no-deps dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-08-14T21:33:21.2627595Z Processing ./dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-08-14T21:33:21.2709877Z Installing collected packages: torchvision 2025-08-14T21:33:21.6719605Z Successfully installed torchvision-0.22.0a0+966da7e 2025-08-14T21:33:21.7180115Z + '[' -n '' ']' 2025-08-14T21:33:21.7180370Z + id=0 2025-08-14T21:33:21.7180556Z + test_dynamo_benchmark huggingface 0 2025-08-14T21:33:21.7180823Z ++ pwd 2025-08-14T21:33:21.7181208Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-08-14T21:33:21.7181645Z + local suite=huggingface 2025-08-14T21:33:21.7181964Z + shift 2025-08-14T21:33:21.7182207Z + local shard_id=0 2025-08-14T21:33:21.7182493Z + shift 2025-08-14T21:33:21.7182813Z + [[ cpu_inductor_amp_freezing_huggingface == *perf_compare* ]] 2025-08-14T21:33:21.7183267Z + [[ cpu_inductor_amp_freezing_huggingface == *perf* ]] 2025-08-14T21:33:21.7183554Z + [[ cpu_inductor_amp_freezing_huggingface == *cpu* ]] 2025-08-14T21:33:21.7183817Z + local dt=float32 2025-08-14T21:33:21.7184014Z + [[ cpu_inductor_amp_freezing_huggingface == *amp* ]] 2025-08-14T21:33:21.7184280Z + dt=amp 2025-08-14T21:33:21.7184506Z + [[ cpu_inductor_amp_freezing_huggingface == *freezing* ]] 2025-08-14T21:33:21.7184886Z + 
test_single_dynamo_benchmark inference huggingface 0 --inference --amp --freezing 2025-08-14T21:33:21.7188225Z ++ pwd 2025-08-14T21:33:21.7188615Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-08-14T21:33:21.7189092Z + mkdir -p /var/lib/jenkins/workspace/test/test-reports 2025-08-14T21:33:21.7205536Z + local name=inference 2025-08-14T21:33:21.7205773Z + shift 2025-08-14T21:33:21.7205955Z + local suite=huggingface 2025-08-14T21:33:21.7206151Z + shift 2025-08-14T21:33:21.7206302Z + local shard_id=0 2025-08-14T21:33:21.7206472Z + shift 2025-08-14T21:33:21.7206633Z + partition_flags=() 2025-08-14T21:33:21.7206819Z + local partition_flags 2025-08-14T21:33:21.7207036Z + [[ -n 1 ]] 2025-08-14T21:33:21.7207197Z + [[ -n 0 ]] 2025-08-14T21:33:21.7207480Z + partition_flags=(--total-partitions "$NUM_TEST_SHARDS" --partition-id "$shard_id") 2025-08-14T21:33:21.7207876Z + [[ cpu_inductor_amp_freezing_huggingface == *perf_compare* ]] 2025-08-14T21:33:21.7208239Z + [[ cpu_inductor_amp_freezing_huggingface == *perf* ]] 2025-08-14T21:33:21.7208516Z + [[ cpu_inductor_amp_freezing_huggingface == *_avx2* ]] 2025-08-14T21:33:21.7209035Z + [[ cpu_inductor_amp_freezing_huggingface == *_avx512* ]] 2025-08-14T21:33:21.7209887Z + python benchmarks/dynamo/huggingface.py --ci --accuracy --timing --explain --print-compilation-time --inductor --device cpu --inference --amp --freezing --total-partitions 1 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inference_huggingface.csv 2025-08-14T21:33:25.4690921Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:33:25.4691895Z from pkg_resources import resource_filename 2025-08-14T21:33:25.8885724Z 2025-08-14T21:33:25.8924133Z config.json: 0% 0.00/694 [00:00bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9779030Z 2025-08-14T21:35:30.9779126Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9779355Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9779629Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9780194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9780722Z layer_outputs = layer_module( 2025-08-14T21:35:30.9781100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9781510Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9781968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9782413Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9782867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9783316Z self_outputs = self.self( 2025-08-14T21:35:30.9783746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9784226Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9784864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9785517Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9785784Z 2025-08-14T21:35:30.9785905Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9786465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9787014Z layer_outputs = layer_module( 2025-08-14T21:35:30.9787401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9787796Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9788246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9788710Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9789165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9789601Z self_outputs = self.self( 2025-08-14T21:35:30.9790031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9790519Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9791056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9791680Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9791950Z 2025-08-14T21:35:30.9792064Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9792639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9793169Z layer_outputs = layer_module( 2025-08-14T21:35:30.9793536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9793932Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9794381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9794845Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9795287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9795733Z self_outputs = self.self( 2025-08-14T21:35:30.9796155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9796637Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9797164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9797796Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9798051Z 2025-08-14T21:35:30.9798145Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9798368Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9798595Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9798819Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9799077Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9799633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9800277Z layer_outputs = layer_module( 2025-08-14T21:35:30.9800647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9801034Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9801491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9801960Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9802414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9803588Z self_outputs = self.self( 2025-08-14T21:35:30.9804131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 536, in forward 2025-08-14T21:35:30.9804640Z diagonal_mask = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9805222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 834, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9805821Z self._mask_invalid_locations(diagonal_attention_scores, window_overlap) 2025-08-14T21:35:30.9806398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 762, in _mask_invalid_locations 2025-08-14T21:35:30.9806985Z input_tensor[:, :affected_seq_len, :, : affected_seq_len + 1] = torch.full_like( 2025-08-14T21:35:30.9807236Z 2025-08-14T21:35:30.9807339Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9807610Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9808184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9810130Z layer_outputs = layer_module( 2025-08-14T21:35:30.9810722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9811173Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9811805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9812282Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9812905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9813379Z self_outputs = self.self( 2025-08-14T21:35:30.9814005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:30.9814545Z attn_scores += diagonal_mask 2025-08-14T21:35:30.9814681Z 2025-08-14T21:35:30.9814798Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9815565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9816177Z layer_outputs = layer_module( 2025-08-14T21:35:30.9816564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9816968Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9817418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9818130Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9818599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9819044Z self_outputs = self.self( 2025-08-14T21:35:30.9819483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:30.9820406Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:30.9820560Z 2025-08-14T21:35:30.9820650Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9820886Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9821145Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9821700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9822355Z layer_outputs = layer_module( 2025-08-14T21:35:30.9822799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9823196Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9823650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9824107Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9824553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9824995Z self_outputs = self.self( 2025-08-14T21:35:30.9825440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9825961Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9826745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9827440Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:30.9827890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:30.9828266Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:30.9828431Z 2025-08-14T21:35:30.9828553Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9829102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9829634Z layer_outputs = layer_module( 2025-08-14T21:35:30.9830013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9830465Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9830897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9831353Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9831941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9832387Z self_outputs = self.self( 2025-08-14T21:35:30.9832819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9833326Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9833906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9834495Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:30.9835054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:30.9835560Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:30.9835930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:30.9836387Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:30.9836562Z 2025-08-14T21:35:30.9836674Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9837230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9837755Z layer_outputs = layer_module( 2025-08-14T21:35:30.9838113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9838495Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9838930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9839362Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9839799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9840237Z self_outputs = self.self( 2025-08-14T21:35:30.9840655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9841126Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9841691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9842300Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:30.9842522Z 2025-08-14T21:35:30.9842642Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9843190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9843715Z layer_outputs = layer_module( 2025-08-14T21:35:30.9844095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9844491Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9844937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9845390Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9845844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9846279Z self_outputs = self.self( 2025-08-14T21:35:30.9846711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9847210Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9847787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9848385Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:30.9848716Z 2025-08-14T21:35:30.9848869Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9849447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9849976Z layer_outputs = layer_module( 2025-08-14T21:35:30.9850349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9850752Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9851212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9851713Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9852191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9852635Z self_outputs = self.self( 2025-08-14T21:35:30.9853066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:30.9853631Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:30.9853894Z 2025-08-14T21:35:30.9853983Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9854214Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9854441Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9854656Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9854878Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9855100Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9855349Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9855910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9856431Z layer_outputs = layer_module( 2025-08-14T21:35:30.9856804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9857191Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9857639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9858088Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9858527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9858973Z self_outputs = self.self( 2025-08-14T21:35:30.9859410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:30.9859871Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:30.9860020Z 2025-08-14T21:35:30.9860103Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9860325Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9860576Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9861128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9861643Z layer_outputs = layer_module( 2025-08-14T21:35:30.9862018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9862411Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9862859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9863325Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9863761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9864191Z self_outputs = self.self( 2025-08-14T21:35:30.9864608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9865092Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9865634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9866265Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9866527Z 2025-08-14T21:35:30.9866664Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9866937Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9867190Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9867732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9868238Z layer_outputs = layer_module( 2025-08-14T21:35:30.9868603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9868981Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9869412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9869851Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9870285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9870722Z self_outputs = self.self( 2025-08-14T21:35:30.9871136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9871605Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9872125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9872730Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9872990Z 2025-08-14T21:35:30.9873098Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9873636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9874155Z layer_outputs = layer_module( 2025-08-14T21:35:30.9874525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9874925Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9875361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9875802Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9876229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9876658Z self_outputs = self.self( 2025-08-14T21:35:30.9877075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9877542Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9878065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9878673Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9878922Z 2025-08-14T21:35:30.9879039Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9879580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9880079Z layer_outputs = layer_module( 2025-08-14T21:35:30.9880443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9880819Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9881257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9881742Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9882209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9882646Z self_outputs = self.self( 2025-08-14T21:35:30.9883080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:30.9883545Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:30.9884061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:30.9884674Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:30.9884941Z 2025-08-14T21:35:30.9885025Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9885254Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9885518Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9886069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9886589Z layer_outputs = layer_module( 2025-08-14T21:35:30.9886965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9887350Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9887799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9888246Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9888793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9889246Z self_outputs = self.self( 2025-08-14T21:35:30.9889691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:30.9890141Z attn_scores += diagonal_mask 2025-08-14T21:35:30.9890274Z 2025-08-14T21:35:30.9890397Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9890962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9891493Z layer_outputs = layer_module( 2025-08-14T21:35:30.9891878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9892272Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9892724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9893183Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9893640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9894078Z self_outputs = self.self( 2025-08-14T21:35:30.9894508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:30.9894965Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:30.9895110Z 2025-08-14T21:35:30.9895202Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9895428Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9895680Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9896236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9896767Z layer_outputs = layer_module( 2025-08-14T21:35:30.9897222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9897623Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9898078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9898526Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9898976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9899434Z self_outputs = self.self( 2025-08-14T21:35:30.9899859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9900357Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9900936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9901579Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:30.9902036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:30.9902419Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:30.9902819Z 2025-08-14T21:35:30.9902938Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9903486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9903992Z layer_outputs = layer_module( 2025-08-14T21:35:30.9904365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9904752Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9905204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9905638Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9906080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9906527Z self_outputs = self.self( 2025-08-14T21:35:30.9906937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9907420Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9907980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9908562Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:30.9909090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:30.9909593Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:30.9909950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:30.9910315Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:30.9910473Z 2025-08-14T21:35:30.9910584Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9911125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9911645Z layer_outputs = layer_module( 2025-08-14T21:35:30.9912011Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9912387Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9912997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9913450Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9913896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9914354Z self_outputs = self.self( 2025-08-14T21:35:30.9914784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9915279Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9915839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9916445Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:30.9916675Z 2025-08-14T21:35:30.9916791Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9917350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9917871Z layer_outputs = layer_module( 2025-08-14T21:35:30.9918250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9918645Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9919096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9919543Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9919997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9920446Z self_outputs = self.self( 2025-08-14T21:35:30.9920874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:30.9921366Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:30.9921929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:30.9922539Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:30.9922761Z 2025-08-14T21:35:30.9922889Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:30.9923444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9923986Z layer_outputs = layer_module( 2025-08-14T21:35:30.9924369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9924780Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9925250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9925721Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9926189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9926650Z self_outputs = self.self( 2025-08-14T21:35:30.9927089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:30.9927672Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:30.9927932Z 2025-08-14T21:35:30.9928024Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9928331Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9928670Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9928914Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9929132Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9929364Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9929619Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:30.9930182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:30.9930721Z layer_outputs = layer_module( 2025-08-14T21:35:30.9931091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:30.9931476Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:30.9931914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:30.9932368Z self_attn_outputs = self.attention( 2025-08-14T21:35:30.9932827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:30.9933280Z self_outputs = self.self( 2025-08-14T21:35:30.9933715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:30.9934180Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:30.9934341Z 2025-08-14T21:35:30.9934428Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9934640Z cudagraph partition due to non gpu ops 2025-08-14T21:35:30.9934887Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:35:30.9935430Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9935969Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9936343Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9936741Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9937196Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9937640Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9938092Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9938555Z     self_outputs = self.self(
2025-08-14T21:35:30.9938992Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
2025-08-14T21:35:30.9939471Z     attn_scores = self._sliding_chunks_query_key_matmul(
2025-08-14T21:35:30.9940023Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
2025-08-14T21:35:30.9940674Z     diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply
2025-08-14T21:35:30.9941034Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9941256Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9941510Z cudagraph partition due to non gpu ops. Found from : [traceback identical to the one above, ending at modeling_longformer.py line 796 (torch.einsum)]
2025-08-14T21:35:30.9947705Z cudagraph partition due to non gpu ops. Found from : [traceback identical to the one above]
2025-08-14T21:35:30.9953667Z cudagraph partition due to non gpu ops. Found from : [traceback identical to the one above]
2025-08-14T21:35:30.9959674Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9959901Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9960160Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9960714Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9961239Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9961630Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9962038Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9962507Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9962980Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9963428Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9963875Z     self_outputs = self.self(
2025-08-14T21:35:30.9964303Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
2025-08-14T21:35:30.9964774Z     attn_scores += diagonal_mask
2025-08-14T21:35:30.9965031Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9965597Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9966131Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9966515Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9966930Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9967392Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9967852Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9968309Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9968854Z     self_outputs = self.self(
2025-08-14T21:35:30.9969290Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
2025-08-14T21:35:30.9969753Z     attn_probs = nn.functional.softmax(
2025-08-14T21:35:30.9969995Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9970226Z cudagraph partition due to non gpu ops
2025-08-14T21:35:30.9970478Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9971055Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9971579Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9971959Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9972350Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9972798Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9973258Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9973697Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9974152Z     self_outputs = self.self(
2025-08-14T21:35:30.9974646Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:30.9975170Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:30.9975728Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:30.9976367Z     padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
2025-08-14T21:35:30.9976823Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:30.9977200Z     return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:30.9977477Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9978031Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9978553Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9978929Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9979314Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9979768Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9980219Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9980660Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9981110Z     self_outputs = self.self(
2025-08-14T21:35:30.9981523Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:30.9982000Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:30.9982547Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:30.9983114Z     chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
2025-08-14T21:35:30.9983654Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
2025-08-14T21:35:30.9984158Z     chunked_hidden_states = nn.functional.pad(
2025-08-14T21:35:30.9984519Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:30.9984893Z     return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:30.9985173Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9985723Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9986238Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9986602Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9986982Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9987412Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9987849Z     self_attn_outputs = self.attention(
2025-08-14T21:35:30.9988296Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:30.9988739Z     self_outputs = self.self(
2025-08-14T21:35:30.9989149Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:30.9989639Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:30.9990306Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:30.9990899Z     context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-08-14T21:35:30.9991226Z cudagraph partition due to non gpu ops. Found from : [traceback identical to the one above]
2025-08-14T21:35:30.9997152Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:30.9997710Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:30.9998237Z     layer_outputs = layer_module(
2025-08-14T21:35:30.9998598Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:30.9998989Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:30.9999436Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:30.9999888Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0000340Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0000788Z     self_outputs = self.self(
2025-08-14T21:35:31.0001216Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-08-14T21:35:31.0001794Z     attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-08-14T21:35:31.0002146Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0002378Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0002739Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0002979Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0003204Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0003422Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0003681Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0004249Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0004783Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0005306Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0005703Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0006163Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0006611Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0007065Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0007517Z     self_outputs = self.self(
2025-08-14T21:35:31.0007950Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
2025-08-14T21:35:31.0008408Z     query_vectors = self.query(hidden_states)
2025-08-14T21:35:31.0008717Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0008955Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0009211Z cudagraph partition due to non gpu ops.
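The operations these tracebacks bottom out in are Longformer's sliding-chunks attention calls. Below is a minimal, self-contained sketch of the two calls flagged most often (modeling_longformer.py lines 796 and 863): the einsum equation and the pad arguments are copied from the traceback lines above, while every tensor shape is a hypothetical stand-in chosen only so the snippet runs on its own.

import torch
import torch.nn.functional as F

# Hypothetical sizes; only the einsum subscripts and the pad arguments below come from
# the log above, everything else is an illustrative assumption.
batch_x_heads, num_chunks, window, head_dim = 2, 8, 512, 64
window_overlap = window // 2
seq_len = num_chunks * window_overlap

query = torch.randn(batch_x_heads, num_chunks, window, head_dim)
key = torch.randn(batch_x_heads, num_chunks, window, head_dim)
value = torch.randn(batch_x_heads, seq_len, head_dim)

# modeling_longformer.py line 796: chunk-wise QK^T over the local attention window
diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key))

# modeling_longformer.py line 863: pad the value tensor before it is re-chunked
padded_value = F.pad(value, (0, 0, window_overlap, window_overlap), value=-1)

print(diagonal_chunked_attention_scores.shape)  # torch.Size([2, 8, 512, 512])
print(padded_value.shape)                       # torch.Size([2, 2560, 64])

The sketch is not part of the benchmark harness; it only makes it easier to see which tensor operations the repeated partition messages point at.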
Found from : 2025-08-14T21:35:31.0009764Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0010286Z layer_outputs = layer_module( 2025-08-14T21:35:31.0010665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0011050Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0011503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0011954Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0012401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0012843Z self_outputs = self.self( 2025-08-14T21:35:31.0013278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0013761Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0014288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0014920Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0015181Z 2025-08-14T21:35:31.0015265Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0015488Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0015735Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0016292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0016817Z layer_outputs = layer_module( 2025-08-14T21:35:31.0017194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0017587Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0018032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0018495Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0018923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0019379Z self_outputs = self.self( 2025-08-14T21:35:31.0019797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0020263Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0020884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0021493Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0021752Z 2025-08-14T21:35:31.0021862Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0022403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0022918Z layer_outputs = layer_module( 2025-08-14T21:35:31.0023299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0023702Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0024165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0024627Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0025101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0025556Z self_outputs = self.self( 2025-08-14T21:35:31.0025978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0026447Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0026993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0027650Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0027910Z 2025-08-14T21:35:31.0028021Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0028600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0029135Z layer_outputs = layer_module( 2025-08-14T21:35:31.0029516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0029892Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0030336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0030777Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0031217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0031641Z self_outputs = self.self( 2025-08-14T21:35:31.0032063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0032534Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0033051Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0033661Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0033920Z 2025-08-14T21:35:31.0034003Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0034228Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0034468Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0035026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0035550Z layer_outputs = layer_module( 2025-08-14T21:35:31.0035978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0036412Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0036866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0037301Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0037727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0038180Z self_outputs = self.self( 2025-08-14T21:35:31.0038608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:31.0039054Z attn_scores += diagonal_mask 2025-08-14T21:35:31.0039185Z 2025-08-14T21:35:31.0039297Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0039855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0040380Z layer_outputs = layer_module( 2025-08-14T21:35:31.0040756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0041149Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0041598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0042057Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0042496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0042952Z self_outputs = self.self( 2025-08-14T21:35:31.0043382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:31.0043838Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:31.0043985Z 2025-08-14T21:35:31.0044070Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0044302Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0044559Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0045107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0045623Z layer_outputs = layer_module( 2025-08-14T21:35:31.0045998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0046389Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0046849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0047302Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0047750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0048194Z self_outputs = self.self( 2025-08-14T21:35:31.0048711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0049224Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0049799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0050433Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:31.0050886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0051269Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0051500Z 2025-08-14T21:35:31.0051662Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0052233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0052736Z layer_outputs = layer_module( 2025-08-14T21:35:31.0053108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0053494Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0053931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0054368Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0054814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0055263Z self_outputs = self.self( 2025-08-14T21:35:31.0055690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0056182Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0056750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0057344Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:31.0057883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:31.0058372Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:31.0058735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0059094Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0059261Z 2025-08-14T21:35:31.0059373Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0059911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0060415Z layer_outputs = layer_module( 2025-08-14T21:35:31.0060788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0061179Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0061630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0062081Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0062525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0062980Z self_outputs = self.self( 2025-08-14T21:35:31.0063417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0063900Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0064448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0065033Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0065248Z 2025-08-14T21:35:31.0065365Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0065899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0066404Z layer_outputs = layer_module( 2025-08-14T21:35:31.0066912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0067301Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0067734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0068176Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0068615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0069057Z self_outputs = self.self( 2025-08-14T21:35:31.0069470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0069955Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0070513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0071109Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0071328Z 2025-08-14T21:35:31.0071440Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0071990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0072531Z layer_outputs = layer_module( 2025-08-14T21:35:31.0072913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0073318Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0073779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0074233Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0074691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0075467Z self_outputs = self.self( 2025-08-14T21:35:31.0075908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:31.0076487Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:31.0076750Z 2025-08-14T21:35:31.0076838Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0077075Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0077306Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0077526Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0077753Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0077979Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0078235Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0078809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0079346Z layer_outputs = layer_module( 2025-08-14T21:35:31.0079731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0080136Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0080599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0081063Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0081523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0081976Z self_outputs = self.self( 2025-08-14T21:35:31.0082465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:31.0082965Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:31.0083120Z 2025-08-14T21:35:31.0083217Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0083443Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0083702Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0084262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0084788Z layer_outputs = layer_module( 2025-08-14T21:35:31.0085172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0085584Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0086041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0086496Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0086955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0087409Z self_outputs = self.self( 2025-08-14T21:35:31.0087837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0088325Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0088925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0089561Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0089819Z 2025-08-14T21:35:31.0089904Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0090140Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0090396Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0090949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0091467Z layer_outputs = layer_module( 2025-08-14T21:35:31.0091845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0092238Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0092680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0093133Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0093580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0094026Z self_outputs = self.self( 2025-08-14T21:35:31.0094448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0094926Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0095464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0096077Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0096333Z 2025-08-14T21:35:31.0096443Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0096990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0097492Z layer_outputs = layer_module( 2025-08-14T21:35:31.0098840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0099243Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0099689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0100135Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0100575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0101023Z self_outputs = self.self( 2025-08-14T21:35:31.0101460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0101945Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0102484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0103318Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0103588Z 2025-08-14T21:35:31.0103703Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0104268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0104768Z layer_outputs = layer_module( 2025-08-14T21:35:31.0105137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0105522Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0105973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0106428Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0106896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0107327Z self_outputs = self.self( 2025-08-14T21:35:31.0107736Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0108202Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0108722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0109329Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0109578Z 2025-08-14T21:35:31.0109661Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0109888Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0110136Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0110681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0111179Z layer_outputs = layer_module( 2025-08-14T21:35:31.0111543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0111924Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0112356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0112792Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0113225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0113656Z self_outputs = self.self( 2025-08-14T21:35:31.0114061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:31.0114655Z attn_scores += diagonal_mask 2025-08-14T21:35:31.0114791Z 2025-08-14T21:35:31.0114915Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0115477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0116000Z layer_outputs = layer_module( 2025-08-14T21:35:31.0116385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0116794Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0117241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0117710Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0118168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0118623Z self_outputs = self.self( 2025-08-14T21:35:31.0119053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:31.0119511Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:31.0119656Z 2025-08-14T21:35:31.0119750Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0119983Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0120236Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0120792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0121320Z layer_outputs = layer_module( 2025-08-14T21:35:31.0121694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0122093Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0122547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0123007Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0123453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0123903Z self_outputs = self.self( 2025-08-14T21:35:31.0124337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0124837Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0125407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0126046Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:31.0126515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0126893Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0127069Z 2025-08-14T21:35:31.0127185Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0127745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0128272Z layer_outputs = layer_module( 2025-08-14T21:35:31.0128702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0129104Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0129557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0130092Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0130537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0130985Z self_outputs = self.self( 2025-08-14T21:35:31.0131421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0131906Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0132475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0133062Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:31.0133607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:31.0134107Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:31.0134472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0134834Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0134990Z 2025-08-14T21:35:31.0135105Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0135637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0136161Z layer_outputs = layer_module( 2025-08-14T21:35:31.0136538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0136928Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0137383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0137824Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0138266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0138712Z self_outputs = self.self( 2025-08-14T21:35:31.0139127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0139606Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0140152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0140734Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0140955Z 2025-08-14T21:35:31.0141067Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0141603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0142109Z layer_outputs = layer_module( 2025-08-14T21:35:31.0142462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0142843Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0143278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0143713Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0144142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0144568Z self_outputs = self.self( 2025-08-14T21:35:31.0145070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0145545Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0146114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0146722Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0146943Z 2025-08-14T21:35:31.0147065Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0147619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0148130Z layer_outputs = layer_module( 2025-08-14T21:35:31.0148494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0148885Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0149376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0149814Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0150250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0150682Z self_outputs = self.self( 2025-08-14T21:35:31.0151094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:31.0151646Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:31.0151898Z 2025-08-14T21:35:31.0151992Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0152214Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0152443Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0152666Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0152886Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0153100Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0153350Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0153945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0154475Z layer_outputs = layer_module( 2025-08-14T21:35:31.0154862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0155263Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0155724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0156182Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0156645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0157163Z self_outputs = self.self( 2025-08-14T21:35:31.0157587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:31.0158044Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:31.0158202Z 2025-08-14T21:35:31.0158285Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0158509Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0159069Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0159627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0160150Z layer_outputs = layer_module( 2025-08-14T21:35:31.0160619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0161001Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0161448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0161905Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0162345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0162796Z self_outputs = self.self( 2025-08-14T21:35:31.0163224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0163698Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0164229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0164862Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0165129Z 2025-08-14T21:35:31.0165214Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0165440Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0165686Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0166240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0166763Z layer_outputs = layer_module( 2025-08-14T21:35:31.0167139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0167532Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0167986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0168440Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0168965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0169415Z self_outputs = self.self( 2025-08-14T21:35:31.0169853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0170342Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0170880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0171519Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0171781Z 2025-08-14T21:35:31.0171904Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0172476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0172998Z layer_outputs = layer_module( 2025-08-14T21:35:31.0173380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0173776Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0174224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0174677Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0175134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0175580Z self_outputs = self.self( 2025-08-14T21:35:31.0176053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0176565Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0177099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0177724Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0177982Z 2025-08-14T21:35:31.0178098Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0178659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0179165Z layer_outputs = layer_module( 2025-08-14T21:35:31.0179528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0179905Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0180351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0180791Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0181218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0181648Z self_outputs = self.self( 2025-08-14T21:35:31.0182066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0182526Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0183042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0183679Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0183950Z 2025-08-14T21:35:31.0184037Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0184267Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0184512Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:35:31.0185072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0185582Z layer_outputs = layer_module(
2025-08-14T21:35:31.0185947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0186343Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0186781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0187221Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0187657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0188093Z self_outputs = self.self(
2025-08-14T21:35:31.0188525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
2025-08-14T21:35:31.0188972Z attn_scores += diagonal_mask
2025-08-14T21:35:31.0189103Z
2025-08-14T21:35:31.0189215Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0189767Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0190302Z layer_outputs = layer_module(
2025-08-14T21:35:31.0190687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0191122Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0191619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0192084Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0192531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0192985Z self_outputs = self.self(
2025-08-14T21:35:31.0193410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
2025-08-14T21:35:31.0193871Z attn_probs = nn.functional.softmax(
2025-08-14T21:35:31.0194014Z
2025-08-14T21:35:31.0194098Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0194325Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0194579Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0195168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0195694Z layer_outputs = layer_module(
2025-08-14T21:35:31.0196074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0196471Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0196920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0197375Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0197832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0198286Z self_outputs = self.self(
2025-08-14T21:35:31.0198705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0199213Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0199776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0200396Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
2025-08-14T21:35:31.0200854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:31.0201232Z return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:31.0201396Z
2025-08-14T21:35:31.0201515Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0202064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0202581Z layer_outputs = layer_module(
2025-08-14T21:35:31.0203146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0203542Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0203994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0204448Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0204900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0205354Z self_outputs = self.self(
2025-08-14T21:35:31.0205776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0206269Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0206913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0207541Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
2025-08-14T21:35:31.0208101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
2025-08-14T21:35:31.0208662Z chunked_hidden_states = nn.functional.pad(
2025-08-14T21:35:31.0209046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:31.0209420Z return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:31.0209593Z
2025-08-14T21:35:31.0209707Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0210266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0210805Z layer_outputs = layer_module(
2025-08-14T21:35:31.0211177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0211579Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0212039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0212495Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0212941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0213391Z self_outputs = self.self(
2025-08-14T21:35:31.0213831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0214317Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0214890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0215066Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-08-14T21:35:31.0215071Z
2025-08-14T21:35:31.0215183Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0215566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0215651Z layer_outputs = layer_module(
2025-08-14T21:35:31.0215883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0215978Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0216282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0216367Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0216680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0216758Z self_outputs = self.self(
2025-08-14T21:35:31.0217058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0217193Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0217577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0217744Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-08-14T21:35:31.0217748Z
2025-08-14T21:35:31.0217858Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0218282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0218398Z layer_outputs = layer_module(
2025-08-14T21:35:31.0218631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0218724Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0219020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0219100Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0219403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0219476Z self_outputs = self.self(
2025-08-14T21:35:31.0219773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-08-14T21:35:31.0219972Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-08-14T21:35:31.0219977Z
2025-08-14T21:35:31.0220060Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220150Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220229Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220308Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220394Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220472Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0220585Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0220955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0221028Z layer_outputs = layer_module(
2025-08-14T21:35:31.0221264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0221353Z return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0221645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0221731Z self_attn_outputs = self.attention(
2025-08-14T21:35:31.0222021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0222100Z self_outputs = self.self(
2025-08-14T21:35:31.0222390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
2025-08-14T21:35:31.0222478Z query_vectors = self.query(hidden_states)
2025-08-14T21:35:31.0222482Z
2025-08-14T21:35:31.0222568Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0222646Z cudagraph partition due to non gpu ops
2025-08-14T21:35:31.0222756Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:35:31.0307060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0307134Z layer_outputs = layer_module( 2025-08-14T21:35:31.0307371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0307471Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0307778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0307870Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0308174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0308250Z self_outputs = self.self( 2025-08-14T21:35:31.0308563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0308699Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0309079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0309228Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:31.0309562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:31.0309667Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:31.0309872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0309977Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0309988Z 2025-08-14T21:35:31.0310099Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0310477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0310561Z layer_outputs = layer_module( 2025-08-14T21:35:31.0310809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0310895Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0311207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0311286Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0311594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0311669Z self_outputs = self.self( 2025-08-14T21:35:31.0311981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0312109Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0312483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0312734Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0312739Z 2025-08-14T21:35:31.0312847Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0313227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0313311Z layer_outputs = layer_module( 2025-08-14T21:35:31.0313546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0313640Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0313942Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0314022Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0314338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0314412Z self_outputs = self.self( 2025-08-14T21:35:31.0314711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0314843Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0315217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0315385Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0315389Z 2025-08-14T21:35:31.0315499Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0315880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0315972Z layer_outputs = layer_module( 2025-08-14T21:35:31.0316210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0316304Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0316607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0316688Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0316998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0317072Z self_outputs = self.self( 2025-08-14T21:35:31.0317378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:31.0317582Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:31.0317589Z 2025-08-14T21:35:31.0317676Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0317768Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0317851Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0317932Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0318022Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0318102Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0318220Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0318601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0318676Z layer_outputs = layer_module( 2025-08-14T21:35:31.0318924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0319059Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0319405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0319495Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0319795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0319877Z self_outputs = self.self( 2025-08-14T21:35:31.0320175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:31.0320267Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:31.0320271Z 2025-08-14T21:35:31.0320359Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0320441Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0320549Z cudagraph partition due to non gpu ops. 
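Note: the "cudagraph partition due to non gpu ops" lines above appear to be Inductor partitioning diagnostics, and every attached traceback resolves to the same handful of call sites in Longformer's sliding-chunk attention: the query/key einsum at modeling_longformer.py:796, the diagonal-mask add at 541, the softmax at 579, the pads at 863 and 699, the probs/value einsum at 878, the reshape at 618, and the query projection at 509. The snippet below is only a minimal, self-contained sketch of that operator pattern, useful for seeing what those frames refer to; the function name, tensor shapes, the overlap value, and the simplified second einsum equation are assumptions made for illustration, not the benchmark harness or the transformers implementation.

import torch
import torch.nn.functional as F

def chunked_attention_sketch(query, key, value, diagonal_mask, window_overlap=2):
    # query, key, value: (batch, chunks, chunk_len, head_dim)
    # diagonal_mask:     (batch, chunks, chunk_len, chunk_len)
    # Chunked query/key scores, as in _sliding_chunks_query_key_matmul (line 796).
    attn_scores = torch.einsum("bcxd,bcyd->bcxy", query, key)
    # In-place diagonal-mask add and softmax (lines 541 and 579 in the tracebacks).
    attn_scores += diagonal_mask
    attn_probs = F.softmax(attn_scores, dim=-1)
    # Pad the value chunks along the chunk-length dim (line 863); cropped back
    # right away only to keep this sketch's shapes aligned.
    padded_value = F.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
    trimmed_value = padded_value[:, :, window_overlap:-window_overlap]
    # Probs/value contraction, analogous to (not identical with) the
    # "bcwd,bcdh->bcwh" einsum at line 878.
    context = torch.einsum("bcxy,bcyd->bcxd", attn_probs, trimmed_value)
    return context

# Eager call with toy shapes; wrapping the function in torch.compile(chunked_attention_sketch)
# is roughly how Inductor would see the same operator pattern in this job.
q = torch.randn(1, 4, 64, 32)
k = torch.randn(1, 4, 64, 32)
v = torch.randn(1, 4, 64, 32)
mask = torch.zeros(1, 4, 64, 64)
out = chunked_attention_sketch(q, k, v, mask)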
Found from : 2025-08-14T21:35:31.0320937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0321013Z layer_outputs = layer_module( 2025-08-14T21:35:31.0321257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0321342Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0321643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0321730Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0322031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0322112Z self_outputs = self.self( 2025-08-14T21:35:31.0322409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0322525Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0322901Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0323096Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0323100Z 2025-08-14T21:35:31.0323188Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0323270Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0323381Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0323766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0323844Z layer_outputs = layer_module( 2025-08-14T21:35:31.0324083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0324177Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0324479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0324567Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0324867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0324941Z self_outputs = self.self( 2025-08-14T21:35:31.0325245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0325354Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0325725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0326004Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0326009Z 2025-08-14T21:35:31.0326122Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0326504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0326580Z layer_outputs = layer_module( 2025-08-14T21:35:31.0326820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0326905Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0327202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0327287Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0327590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0327662Z self_outputs = self.self( 2025-08-14T21:35:31.0327965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0328073Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0328438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0328694Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0328701Z 2025-08-14T21:35:31.0328819Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0329202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0329286Z layer_outputs = layer_module( 2025-08-14T21:35:31.0329531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0329614Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0329916Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0330001Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0330300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0330382Z self_outputs = self.self( 2025-08-14T21:35:31.0330681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0330791Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0331170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0331364Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0331371Z 2025-08-14T21:35:31.0331461Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0331544Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0331656Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0332041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0332119Z layer_outputs = layer_module( 2025-08-14T21:35:31.0332357Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0332498Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0332831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0332923Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0333228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0333302Z self_outputs = self.self( 2025-08-14T21:35:31.0333608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:31.0333687Z attn_scores += diagonal_mask 2025-08-14T21:35:31.0333690Z 2025-08-14T21:35:31.0333801Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0334186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0334266Z layer_outputs = layer_module( 2025-08-14T21:35:31.0334515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0334600Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0334903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0334991Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0335294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0335375Z self_outputs = self.self( 2025-08-14T21:35:31.0335677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:31.0335762Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:31.0335769Z 2025-08-14T21:35:31.0335862Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0335946Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0336058Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0336446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0336522Z layer_outputs = layer_module( 2025-08-14T21:35:31.0336768Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0336852Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0337164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0337250Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0337561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0337649Z self_outputs = self.self( 2025-08-14T21:35:31.0337960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0338087Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0338473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0338657Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:31.0338868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0338970Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0338974Z 2025-08-14T21:35:31.0339081Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0339584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0339663Z layer_outputs = layer_module( 2025-08-14T21:35:31.0339906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0339999Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0340300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0340389Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0340703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0340778Z self_outputs = self.self( 2025-08-14T21:35:31.0341097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0341230Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0341619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0341762Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:31.0342097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:31.0342202Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:31.0342402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0342504Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0342517Z 2025-08-14T21:35:31.0342630Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0343014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0343099Z layer_outputs = layer_module( 2025-08-14T21:35:31.0343344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0343427Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0343730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0343809Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0344109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0344183Z self_outputs = self.self( 2025-08-14T21:35:31.0344482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0344614Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0344978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0345143Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0345147Z 2025-08-14T21:35:31.0345256Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0345620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0345703Z layer_outputs = layer_module( 2025-08-14T21:35:31.0345935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0346076Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0346414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0346495Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0346801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0346875Z self_outputs = self.self( 2025-08-14T21:35:31.0347183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0347309Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0347673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0347838Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0347844Z 2025-08-14T21:35:31.0347953Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0348317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0348400Z layer_outputs = layer_module( 2025-08-14T21:35:31.0348629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0348717Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0349007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0349086Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0349383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0349462Z self_outputs = self.self( 2025-08-14T21:35:31.0349759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:31.0349954Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:31.0349958Z 2025-08-14T21:35:31.0350040Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350135Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350214Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350293Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350378Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350457Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0350570Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0350936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0351013Z layer_outputs = layer_module( 2025-08-14T21:35:31.0351254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0351337Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0351632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0351719Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0352009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0352089Z self_outputs = self.self( 2025-08-14T21:35:31.0352377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:31.0352507Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:31.0352511Z 2025-08-14T21:35:31.0352631Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0352713Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0352819Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0353193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0353268Z layer_outputs = layer_module( 2025-08-14T21:35:31.0353519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0353604Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0353906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0353995Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0354304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0354387Z self_outputs = self.self( 2025-08-14T21:35:31.0354699Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0354807Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0355174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0355365Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0355369Z 2025-08-14T21:35:31.0355457Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0355542Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0355658Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0356040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0356114Z layer_outputs = layer_module( 2025-08-14T21:35:31.0356345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0356435Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0356736Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0356822Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0357126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0357201Z self_outputs = self.self( 2025-08-14T21:35:31.0357509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0357623Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0358011Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0358207Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0358211Z 2025-08-14T21:35:31.0358325Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0358711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0358788Z layer_outputs = layer_module( 2025-08-14T21:35:31.0359038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0359164Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0359509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0359599Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0359911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0359987Z self_outputs = self.self( 2025-08-14T21:35:31.0360306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0360417Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0360803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0360995Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0361004Z 2025-08-14T21:35:31.0361115Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0361504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0361582Z layer_outputs = layer_module( 2025-08-14T21:35:31.0361832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0361916Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0362226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0362314Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0362627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0362714Z self_outputs = self.self( 2025-08-14T21:35:31.0363026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0363133Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0363508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0363702Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0363706Z 2025-08-14T21:35:31.0363796Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0363881Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0363992Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0364375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0364456Z layer_outputs = layer_module( 2025-08-14T21:35:31.0364696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0364788Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0365100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0365187Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0365500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0365575Z self_outputs = self.self( 2025-08-14T21:35:31.0365884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-08-14T21:35:31.0366003Z attn_scores += diagonal_mask 2025-08-14T21:35:31.0366007Z 2025-08-14T21:35:31.0366170Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0366550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0366627Z layer_outputs = layer_module( 2025-08-14T21:35:31.0366872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0366958Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0367270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0367361Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0367659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0367746Z self_outputs = self.self( 2025-08-14T21:35:31.0368045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-08-14T21:35:31.0368132Z attn_probs = nn.functional.softmax( 2025-08-14T21:35:31.0368136Z 2025-08-14T21:35:31.0368227Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0368310Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0368420Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0369095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0369177Z layer_outputs = layer_module( 2025-08-14T21:35:31.0369423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0369508Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0369817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0369907Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0370206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0370290Z self_outputs = self.self( 2025-08-14T21:35:31.0370589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0370718Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0371102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0371290Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-08-14T21:35:31.0371509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0371615Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0371620Z 2025-08-14T21:35:31.0371730Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0372116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0372194Z layer_outputs = layer_module( 2025-08-14T21:35:31.0372432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0372527Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0372829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0372918Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0373316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0373389Z self_outputs = self.self( 2025-08-14T21:35:31.0373686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0373808Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0374183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0374326Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-08-14T21:35:31.0374660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-08-14T21:35:31.0374764Z chunked_hidden_states = nn.functional.pad( 2025-08-14T21:35:31.0374970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-08-14T21:35:31.0375080Z return torch._C._nn.pad(input, pad, mode, value) 2025-08-14T21:35:31.0375084Z 2025-08-14T21:35:31.0375193Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0375557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0375641Z layer_outputs = layer_module( 2025-08-14T21:35:31.0375875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0375958Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0376264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0376347Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0376651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0376725Z self_outputs = self.self( 2025-08-14T21:35:31.0377015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0377145Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0377515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0377682Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0377686Z 2025-08-14T21:35:31.0377795Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0378160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0378245Z layer_outputs = layer_module( 2025-08-14T21:35:31.0378474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0378562Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0378852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0378929Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0379227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0379300Z self_outputs = self.self( 2025-08-14T21:35:31.0379590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-08-14T21:35:31.0379792Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-08-14T21:35:31.0380157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-08-14T21:35:31.0380320Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-08-14T21:35:31.0380324Z 2025-08-14T21:35:31.0380431Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0380793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0380877Z layer_outputs = layer_module( 2025-08-14T21:35:31.0381105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0381194Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0381490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0381568Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0381867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0381939Z self_outputs = self.self( 2025-08-14T21:35:31.0382234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-08-14T21:35:31.0382425Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-08-14T21:35:31.0382429Z 2025-08-14T21:35:31.0382510Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0382599Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0382679Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0382758Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0382845Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0382925Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0383036Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0383399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0383473Z layer_outputs = layer_module( 2025-08-14T21:35:31.0383707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0383790Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0384079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0384168Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0384457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0384542Z self_outputs = self.self( 2025-08-14T21:35:31.0384828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-08-14T21:35:31.0384917Z query_vectors = self.query(hidden_states) 2025-08-14T21:35:31.0384920Z 2025-08-14T21:35:31.0385011Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0385090Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0385196Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0385565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0385640Z layer_outputs = layer_module( 2025-08-14T21:35:31.0385875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0385991Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0386321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0386410Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0386702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0386780Z self_outputs = self.self( 2025-08-14T21:35:31.0387069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0387181Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0387555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0387765Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0387771Z 2025-08-14T21:35:31.0387859Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0387938Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0388047Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0388418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0388491Z layer_outputs = layer_module( 2025-08-14T21:35:31.0388720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0388811Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0389104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0389190Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0389489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0389561Z self_outputs = self.self( 2025-08-14T21:35:31.0389859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0389966Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0390326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0390515Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0390519Z 2025-08-14T21:35:31.0390629Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:35:31.0390999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0391076Z layer_outputs = layer_module( 2025-08-14T21:35:31.0391314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0391397Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0391690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0391774Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0392067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0392139Z self_outputs = self.self( 2025-08-14T21:35:31.0392445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0392602Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0393015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0393205Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0393209Z 2025-08-14T21:35:31.0393319Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:35:31.0393695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-08-14T21:35:31.0393771Z layer_outputs = layer_module( 2025-08-14T21:35:31.0394009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:35:31.0394093Z return super().__call__(*args, **kwargs) 2025-08-14T21:35:31.0394388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-08-14T21:35:31.0394479Z self_attn_outputs = self.attention( 2025-08-14T21:35:31.0394772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-08-14T21:35:31.0394850Z self_outputs = self.self( 2025-08-14T21:35:31.0395141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-08-14T21:35:31.0395245Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-08-14T21:35:31.0395611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-08-14T21:35:31.0395796Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-08-14T21:35:31.0395803Z 2025-08-14T21:35:31.0395894Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0395978Z cudagraph partition due to non gpu ops 2025-08-14T21:35:31.0396088Z cudagraph partition due to non gpu ops. 
2025-08-14T21:35:31.0396463Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0396540Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0396769Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0396860Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0397157Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0397243Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0397536Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0397615Z     self_outputs = self.self(
2025-08-14T21:35:31.0397915Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
2025-08-14T21:35:31.0397993Z     attn_scores += diagonal_mask
2025-08-14T21:35:31.0397996Z
2025-08-14T21:35:31.0398106Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0398481Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0398556Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0398793Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0398873Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0399166Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0399316Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0399613Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0399690Z     self_outputs = self.self(
2025-08-14T21:35:31.0399983Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
2025-08-14T21:35:31.0400065Z     attn_probs = nn.functional.softmax(
2025-08-14T21:35:31.0400069Z
2025-08-14T21:35:31.0400346Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0400720Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0400798Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0401036Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0401117Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0401411Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0401495Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0401796Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0401876Z     self_outputs = self.self(
2025-08-14T21:35:31.0402186Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0402314Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0402945Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0403136Z     padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
2025-08-14T21:35:31.0403354Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:31.0403462Z     return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:31.0403466Z
2025-08-14T21:35:31.0403589Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0403959Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0404035Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0404269Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0404372Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0404674Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0404763Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0405064Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0405140Z     self_outputs = self.self(
2025-08-14T21:35:31.0405460Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0405587Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0405975Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0406253Z     chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
2025-08-14T21:35:31.0406598Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
2025-08-14T21:35:31.0406705Z     chunked_hidden_states = nn.functional.pad(
2025-08-14T21:35:31.0406912Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-08-14T21:35:31.0407019Z     return torch._C._nn.pad(input, pad, mode, value)
2025-08-14T21:35:31.0407031Z
2025-08-14T21:35:31.0407143Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0407521Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0407609Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0407855Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0407946Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0408261Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0408344Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0408714Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0408793Z     self_outputs = self.self(
2025-08-14T21:35:31.0409093Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-08-14T21:35:31.0409226Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-08-14T21:35:31.0409605Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-08-14T21:35:31.0409781Z     context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-08-14T21:35:31.0409786Z
2025-08-14T21:35:31.0412558Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:31.0412947Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-08-14T21:35:31.0413078Z     layer_outputs = layer_module(
2025-08-14T21:35:31.0413348Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:35:31.0413444Z     return super().__call__(*args, **kwargs)
2025-08-14T21:35:31.0413746Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-08-14T21:35:31.0413828Z     self_attn_outputs = self.attention(
2025-08-14T21:35:31.0414134Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-08-14T21:35:31.0414210Z     self_outputs = self.self(
2025-08-14T21:35:31.0414519Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-08-14T21:35:31.0414720Z     attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-08-14T21:35:31.0414727Z
2025-08-14T21:35:48.7829448Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:48.7830173Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1716, in torch_dynamo_resume_in_forward_at_1703
2025-08-14T21:35:48.7830735Z     prediction_scores = self.lm_head(sequence_output)
2025-08-14T21:35:48.7831195Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1333, in forward
2025-08-14T21:35:48.7831663Z     x = self.dense(features)
2025-08-14T21:35:48.7831783Z
2025-08-14T21:35:48.7832856Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:35:48.7833416Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1723, in torch_dynamo_resume_in_forward_at_1703
2025-08-14T21:35:48.7834064Z     masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:35:48.7834309Z
2025-08-14T21:35:50.4192857Z Compilation time (from dynamo_timed): 53.003539305
2025-08-14T21:35:50.4358037Z pass
2025-08-14T21:35:50.4358464Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:35:50.4359343Z TIMING: _recursive_pre_grad_passes:0.14282 _recursive_joint_graph_passes:1.06882 inductor_compile:31.41291 backend_compile:46.41063 gc:0.00551 entire_frame_compile:53.00354 _recursive_post_grad_passes:1.07036 async_compile.wait:4.75751 code_gen:26.15327 total_wall_time:53.00354
2025-08-14T21:35:50.4360354Z STATS: call_* op count: 1789 | FakeTensorMode.__torch_dispatch__:75519 | FakeTensor.__torch_dispatch__:8895 | ProxyTorchDispatchMode.__torch_dispatch__:20282
2025-08-14T21:35:50.4360900Z Dynamo produced 5 graphs covering 1789 ops with 4 graph breaks (1 unique)
2025-08-14T21:35:56.5073199Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:35:56.5074189Z   from pkg_resources import resource_filename
2025-08-14T21:35:57.2409497Z
2025-08-14T21:36:00.4804793Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:36:00.4808187Z loading model: 0it [00:03, ?it/s]
2025-08-14T21:36:00.4808599Z cpu eval BartForCausalLM
2025-08-14T21:36:02.1758026Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:36:02.5507429Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:36:02.9171180Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
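A minimal sketch, assuming a local PyTorch 2.x plus transformers install, of how graph and graph-break counts like the "Dynamo produced 5 graphs covering 1789 ops with 4 graph breaks (1 unique)" line above can be inspected with torch._dynamo.explain. The allenai/longformer-base-4096 checkpoint is an illustrative choice, not taken from this log, and this is not the benchmark harness used by this job:

    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer  # assumed installed, as in this job's environment

    # Load a small Longformer masked-LM checkpoint (illustrative model id).
    tok = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")
    model = AutoModelForMaskedLM.from_pretrained("allenai/longformer-base-4096").eval()
    inputs = tok("Paris is the <mask> of France.", return_tensors="pt")

    # torch._dynamo.explain runs the model once under Dynamo and reports how many
    # graphs were produced and why graph breaks occurred.
    explanation = torch._dynamo.explain(model)(**inputs)
    print(explanation.graph_count, explanation.graph_break_count)
    print(explanation.break_reasons)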
2025-08-14T21:36:13.9207447Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:36:13.9207888Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:36:13.9208266Z     return mod(**inputs)
2025-08-14T21:36:13.9208721Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward
2025-08-14T21:36:13.9209531Z     outputs = self.model.decoder(
2025-08-14T21:36:13.9209956Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:36:13.9210386Z     layer_outputs = decoder_layer(
2025-08-14T21:36:13.9210779Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:36:13.9211190Z     return super().__call__(*args, **kwargs)
2025-08-14T21:36:13.9211616Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward
2025-08-14T21:36:13.9212082Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:36:13.9212536Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:36:13.9212998Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:36:13.9213483Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:36:13.9214008Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:36:13.9214214Z
2025-08-14T21:36:13.9214346Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:36:13.9214748Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:36:13.9215100Z     return mod(**inputs)
2025-08-14T21:36:13.9215903Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward
2025-08-14T21:36:13.9216484Z     outputs = self.model.decoder(
2025-08-14T21:36:13.9216902Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:36:13.9217330Z     layer_outputs = decoder_layer(
2025-08-14T21:36:13.9217712Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:36:13.9218174Z     return super().__call__(*args, **kwargs)
2025-08-14T21:36:13.9218591Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward
2025-08-14T21:36:13.9219039Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:36:13.9219480Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:36:13.9219930Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:36:13.9220428Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:36:13.9220930Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:36:13.9221105Z
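A minimal sketch, assuming PyTorch 2.x, of the kind of torch.compile call that requests CUDA graphs via mode="reduce-overhead". On CPU tensors, as in the CPU-only runs above ("cpu eval BartForCausalLM"), there are no GPU ops for a CUDA graph to capture, which is the condition the repeated "cudagraph partition due to non gpu ops" messages describe. Whether those exact messages are printed depends on Inductor's configuration and logging, so the snippet is illustrative only, not the benchmark harness:

    import torch

    def attention_like(q, k, v):
        # Mixes matmul, softmax and a transpose, loosely mirroring the
        # sdpa_attention_forward frames in the traces above (simplified,
        # not the transformers implementation).
        scores = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        return (scores @ v).transpose(1, 2).contiguous()

    # mode="reduce-overhead" asks Inductor to use CUDA graphs where possible.
    compiled = torch.compile(attention_like, mode="reduce-overhead")
    q = k = v = torch.randn(2, 4, 128, 64)  # CPU tensors, matching the CPU benchmark runs above
    out = compiled(q, k, v)                  # compiles through Inductor's CPU backend
    print(out.shape)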
Found from : 2025-08-14T21:36:13.9225212Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9225566Z return mod(**inputs) 2025-08-14T21:36:13.9225973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9226395Z outputs = self.model.decoder( 2025-08-14T21:36:13.9226790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9227197Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9227570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9227950Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9228347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9228788Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9229227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9229660Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9230138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9230649Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9230901Z 2025-08-14T21:36:13.9231066Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9231445Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9231801Z return mod(**inputs) 2025-08-14T21:36:13.9232201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9232642Z outputs = self.model.decoder( 2025-08-14T21:36:13.9233051Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9233473Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9233838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9234214Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9234629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9235071Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9235506Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9235932Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9236414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9236909Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9237078Z 2025-08-14T21:36:13.9237169Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9237387Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9237606Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9237816Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9238027Z cudagraph partition due to non gpu ops 
2025-08-14T21:36:13.9238250Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9238472Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9238687Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9238905Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9239124Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9239334Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9239556Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9239777Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9239999Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9240211Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9240434Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9240687Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9241076Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9241433Z return mod(**inputs) 2025-08-14T21:36:13.9241833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9242250Z outputs = self.model.decoder( 2025-08-14T21:36:13.9242661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9243074Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9243449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9243832Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9244248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9244688Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9245120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9245592Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9246103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9246620Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9246820Z 2025-08-14T21:36:13.9246934Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9247327Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9247680Z return mod(**inputs) 2025-08-14T21:36:13.9248106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9248524Z outputs = self.model.decoder( 2025-08-14T21:36:13.9249036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9249458Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9249845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9250240Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9250661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9251097Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9251536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9251975Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9252448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9252961Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9253147Z 2025-08-14T21:36:13.9253234Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9253470Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9253691Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9253913Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9254132Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9254345Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9254567Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9254797Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9255000Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9255213Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9255426Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9255637Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9255843Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9256056Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9256267Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9256474Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9256721Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9257107Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9257450Z return mod(**inputs) 2025-08-14T21:36:13.9257849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9258272Z outputs = self.model.decoder( 2025-08-14T21:36:13.9258703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9259112Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9259489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9259931Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9260420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9260850Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9261285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9261733Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9262190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9262702Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9262904Z 2025-08-14T21:36:13.9263019Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9263414Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9263773Z return mod(**inputs) 2025-08-14T21:36:13.9264180Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9264611Z outputs = self.model.decoder( 2025-08-14T21:36:13.9265012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9265455Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9265822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9266207Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9266607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9267047Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9267481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9267920Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9268391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9268888Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9269059Z 2025-08-14T21:36:13.9269154Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9269384Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9269605Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9269823Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9270043Z cudagraph partition due to non gpu ops 
2025-08-14T21:36:13.9270252Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9270467Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9270687Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9270899Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9271118Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9271342Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9271560Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9271781Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9272002Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9272215Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9272437Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9272690Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9273083Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9273428Z return mod(**inputs) 2025-08-14T21:36:13.9273822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9274265Z outputs = self.model.decoder( 2025-08-14T21:36:13.9274670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9275220Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9275603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9276003Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9276420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9276871Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9277319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9277761Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9278251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9278782Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9278987Z 2025-08-14T21:36:13.9279115Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9279512Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9279872Z return mod(**inputs) 2025-08-14T21:36:13.9280275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9280703Z outputs = self.model.decoder( 2025-08-14T21:36:13.9281116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9281536Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9281918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9282313Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9282747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9283200Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9283646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9284083Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9284570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9285071Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9285249Z 2025-08-14T21:36:13.9285348Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9285579Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9285814Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9286045Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9286277Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9286509Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9286741Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9286965Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9287192Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9287417Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9287638Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9287866Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9288091Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9288316Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9288536Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9288762Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9289110Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9289499Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9289905Z return mod(**inputs) 2025-08-14T21:36:13.9290343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9290762Z outputs = self.model.decoder( 2025-08-14T21:36:13.9291174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9291601Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9291975Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9292365Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9292778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9293222Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9293658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9294093Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9294568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9295089Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9295288Z 2025-08-14T21:36:13.9295399Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9295787Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9296139Z return mod(**inputs) 2025-08-14T21:36:13.9296528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9296961Z outputs = self.model.decoder( 2025-08-14T21:36:13.9297379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9297805Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9298172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9298562Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9298968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9299400Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9299815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9300257Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9300738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9301228Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9301398Z 2025-08-14T21:36:13.9301485Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9301709Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9301931Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9302142Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9302367Z cudagraph partition due to non gpu ops 
2025-08-14T21:36:13.9302588Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9302967Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9303197Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9303423Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9303649Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9303864Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9304090Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9304312Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9304531Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9304866Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9305141Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9305382Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9305766Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9306156Z return mod(**inputs) 2025-08-14T21:36:13.9306550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9306961Z outputs = self.model.decoder( 2025-08-14T21:36:13.9307377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9307819Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9308194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9308600Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9309011Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9309452Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9309878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9310316Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9310794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9311311Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9311508Z 2025-08-14T21:36:13.9311619Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9312013Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9312364Z return mod(**inputs) 2025-08-14T21:36:13.9312755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9313176Z outputs = self.model.decoder( 2025-08-14T21:36:13.9313582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9313998Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9314384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9314776Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9315192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9315626Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9316058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9316500Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9316976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9317462Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9317641Z 2025-08-14T21:36:13.9317726Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9317956Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9318182Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9318399Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9318625Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9318845Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9319058Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9319280Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9319500Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9319759Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9320048Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9320271Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9320483Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9320704Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9320926Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9321147Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9321392Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9321780Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9322131Z return mod(**inputs) 2025-08-14T21:36:13.9322515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9322931Z outputs = self.model.decoder( 2025-08-14T21:36:13.9323339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9323758Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9324131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9324517Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9324931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9325366Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9325800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9326234Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9326710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9327219Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9327423Z 2025-08-14T21:36:13.9327538Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9327926Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9328276Z return mod(**inputs) 2025-08-14T21:36:13.9328666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9329194Z outputs = self.model.decoder( 2025-08-14T21:36:13.9329617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9330052Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9330440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9330854Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9331280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9331721Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9332160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9332600Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9333075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9333579Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9333762Z 2025-08-14T21:36:13.9333849Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9334083Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9334305Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9334536Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9334823Z cudagraph partition due to non gpu ops 
2025-08-14T21:36:13.9335071Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9335363Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9335580Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9335804Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9336026Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9336242Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9336464Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9336698Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9336916Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9337140Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9337361Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9337615Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:36:13.9338000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9338361Z return mod(**inputs) 2025-08-14T21:36:13.9338748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9339162Z outputs = self.model.decoder( 2025-08-14T21:36:13.9339571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9339988Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9340358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9340735Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9341140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9341570Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9341998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9342430Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9342899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:36:13.9343400Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:36:13.9343602Z 2025-08-14T21:36:13.9343713Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:36:13.9344094Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:36:13.9344484Z return mod(**inputs) 2025-08-14T21:36:13.9344888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-08-14T21:36:13.9345324Z outputs = self.model.decoder( 2025-08-14T21:36:13.9345777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:36:13.9346215Z layer_outputs = decoder_layer( 2025-08-14T21:36:13.9346595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:36:13.9347001Z return super().__call__(*args, **kwargs) 2025-08-14T21:36:13.9347420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:36:13.9347868Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:36:13.9348280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:36:13.9348703Z attn_output, attn_weights = attention_interface( 2025-08-14T21:36:13.9349174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:36:13.9349684Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:36:13.9349901Z 2025-08-14T21:36:13.9350023Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9350251Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9350473Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9350684Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9350907Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9351125Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9351335Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9351554Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9351773Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9351988Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9352200Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9352421Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9352641Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9352847Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9353065Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9353283Z cudagraph partition due to non gpu ops 2025-08-14T21:36:13.9353528Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:36:13.9359725Z cudagraph partition due to non gpu ops. Found from : (same decoder self-attention traceback as above, ending at sdpa_attention.py line 91, attn_output.transpose(1, 2).contiguous())
2025-08-14T21:36:13.9365910Z cudagraph partition due to non gpu ops (message repeated 16 times)
2025-08-14T21:36:13.9369563Z cudagraph partition due to non gpu ops. Found from : (same decoder self-attention traceback as above, ending at sdpa_attention.py line 81, torch.nn.functional.scaled_dot_product_attention)
2025-08-14T21:36:13.9375789Z cudagraph partition due to non gpu ops.
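Every traceback in this block starts from the harness's forward_pass at benchmarks/dynamo/huggingface.py:532, which simply keyword-expands the prepared inputs into the model. A hedged, self-contained sketch of that call shape (the checkpoint name below is an illustrative stand-in, not the benchmark's own loader):

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

def forward_pass(mod, inputs):
    # Same call shape as the benchmark harness: keyword-expand the prepared inputs.
    return mod(**inputs)

# "facebook/bart-base" is used purely for illustration.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
mod = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base").eval()
inputs = tokenizer("a short example sentence", return_tensors="pt")
with torch.no_grad():
    outputs = forward_pass(mod, inputs)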
Found from : (same decoder self-attention traceback as above, ending at sdpa_attention.py line 91, attn_output.transpose(1, 2).contiguous())
2025-08-14T21:36:13.9381857Z cudagraph partition due to non gpu ops (message repeated 17 times)
Found from : (same decoder self-attention traceback as above, ending at sdpa_attention.py line 81, torch.nn.functional.scaled_dot_product_attention)
2025-08-14T21:36:13.9391570Z cudagraph partition due to non gpu ops. Found from : (same decoder self-attention traceback as above, ending at sdpa_attention.py line 91, attn_output.transpose(1, 2).contiguous())
2025-08-14T21:36:13.9397636Z cudagraph partition due to non gpu ops (message repeated 8 times)
2025-08-14T21:36:13.9399393Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1923, in forward
    loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))

2025-08-14T21:36:24.1044928Z Compilation time (from dynamo_timed): 19.369274096
2025-08-14T21:36:24.1264024Z pass
2025-08-14T21:36:24.1264491Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:36:24.1265350Z TIMING: _recursive_pre_grad_passes:0.04063 _recursive_joint_graph_passes:0.42334 _recursive_post_grad_passes:0.08535 async_compile.wait:0.96583 code_gen:9.67954 inductor_compile:11.31633 backend_compile:16.88043 gc:0.00056 entire_frame_compile:19.36927 total_wall_time:19.36927
2025-08-14T21:36:24.1266335Z STATS: call_* op count: 374 | FakeTensorMode.__torch_dispatch__:27610 | FakeTensor.__torch_dispatch__:3334 | ProxyTorchDispatchMode.__torch_dispatch__:7489
2025-08-14T21:36:24.1266860Z Dynamo produced 1 graphs covering 374 ops with 0 graph breaks (0 unique)
2025-08-14T21:36:29.9556793Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
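The summary above reports a single Dynamo graph covering 374 ops with 0 graph breaks. As a hedged aside, a minimal way to assert that property for some compiled callable (the names model and example_inputs are placeholders, not the benchmark harness objects):

import torch

def check_single_graph(model, example_inputs):
    # fullgraph=True makes torch.compile raise instead of silently splitting the
    # captured program when Dynamo hits an unsupported construct (a graph break).
    compiled = torch.compile(model, fullgraph=True)
    with torch.no_grad():
        return compiled(**example_inputs)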
2025-08-14T21:36:29.9557774Z     from pkg_resources import resource_filename
2025-08-14T21:36:35.6833401Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:36:35.6833764Z loading model: 0it [00:05, ?it/s]
2025-08-14T21:36:35.6834035Z cpu eval BartForConditionalGeneration
2025-08-14T21:36:39.5110392Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:36:40.2683060Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:36:41.0130044Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:37:06.5441657Z cudagraph partition due to non gpu ops (message repeated 23 times)
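The warnings above come from the benchmark harness calling its GPU-cache-clearing helper while running on CPU. A generic, hedged sketch of the kind of device guard that avoids such a warning (this is not the harness's own empty_gpu_cache, just the common pattern):

import torch

def empty_device_cache(device: torch.device) -> None:
    # Only CUDA (and, where available, XPU) expose a cached allocator to release;
    # on CPU there is nothing to do, so the call can simply be skipped.
    if device.type == "cuda":
        torch.cuda.empty_cache()
    elif device.type == "xpu" and hasattr(torch, "xpu"):
        torch.xpu.empty_cache()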
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:37:06.5454913Z cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

2025-08-14T21:37:06.5462078Z cudagraph partition due to non gpu ops (message repeated 17 times)
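For context on the recurring "cudagraph partition due to non gpu ops" lines: this is an Inductor diagnostic emitted when part of a compiled region does not run on the GPU, so the region cannot be captured as a single CUDA graph and is split around that op. In a CPU-only run like this one every op is a non-GPU op, so the messages appear to be expected noise rather than a performance problem. A toy, hedged illustration of the underlying situation on a CUDA machine (not the benchmark code):

import torch

@torch.compile(mode="reduce-overhead")  # "reduce-overhead" enables CUDA graph capture
def mixed_device_step(x):
    y = x * 2                  # GPU op, capturable in a CUDA graph
    s = y.sum().cpu()          # leaves the GPU, so capture has to stop around this op
    return y + s.to(x.device)  # resumes on the GPU in a separate partition

# Example (GPU only): x = torch.randn(8, device="cuda"); mixed_device_step(x)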
Found from : 2025-08-14T21:37:06.5660435Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5660800Z return mod(**inputs) 2025-08-14T21:37:06.5661193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5661605Z outputs = self.model( 2025-08-14T21:37:06.5661984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward 2025-08-14T21:37:06.5662397Z encoder_outputs = self.encoder( 2025-08-14T21:37:06.5662801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward 2025-08-14T21:37:06.5663205Z layer_outputs = encoder_layer( 2025-08-14T21:37:06.5663580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5663974Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5664395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward 2025-08-14T21:37:06.5664815Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:37:06.5665245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5665679Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5666153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5666657Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5666854Z 2025-08-14T21:37:06.5666966Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5667348Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5667684Z return mod(**inputs) 2025-08-14T21:37:06.5668070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5668474Z outputs = self.model( 2025-08-14T21:37:06.5668866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward 2025-08-14T21:37:06.5669273Z encoder_outputs = self.encoder( 2025-08-14T21:37:06.5669685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward 2025-08-14T21:37:06.5670087Z layer_outputs = encoder_layer( 2025-08-14T21:37:06.5670454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5670826Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5671239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward 2025-08-14T21:37:06.5671672Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:37:06.5672172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5672610Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5673086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5673563Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5673728Z 2025-08-14T21:37:06.5673815Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5674041Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5674261Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5674472Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5674691Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5674907Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5675115Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5675336Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5675554Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5675770Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5675978Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5676193Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5676408Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5676616Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5676829Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5677045Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5677288Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5677682Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5678035Z return mod(**inputs) 2025-08-14T21:37:06.5678426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5678832Z outputs = self.model( 2025-08-14T21:37:06.5679226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward 2025-08-14T21:37:06.5679645Z encoder_outputs = self.encoder( 2025-08-14T21:37:06.5680043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward 2025-08-14T21:37:06.5680456Z layer_outputs = encoder_layer( 2025-08-14T21:37:06.5680821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5681198Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5681594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward 2025-08-14T21:37:06.5682015Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:37:06.5682432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5682854Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5683333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5683848Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5684045Z 2025-08-14T21:37:06.5684165Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5684550Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5684906Z return mod(**inputs) 2025-08-14T21:37:06.5685294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5685707Z outputs = self.model( 2025-08-14T21:37:06.5686092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward 2025-08-14T21:37:06.5686606Z encoder_outputs = self.encoder( 2025-08-14T21:37:06.5687022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward 2025-08-14T21:37:06.5687442Z layer_outputs = encoder_layer( 2025-08-14T21:37:06.5687833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5688238Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5688666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward 2025-08-14T21:37:06.5689189Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:37:06.5689636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5690077Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5690568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5691066Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5691249Z 2025-08-14T21:37:06.5691337Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5691571Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5691790Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5692014Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5692236Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5692453Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5692675Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5692896Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5693118Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5693334Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5693554Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5693780Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5693995Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5694219Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5694444Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5694657Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5694909Z cudagraph partition due to non gpu ops. 
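The partition reports above all bottom out in the same two frames of transformers' sdpa_attention_forward. As a hedged illustration of the mechanism only (this is not the benchmark harness and not inductor's partitioning code; the toy function and the .cpu() round-trip below are invented for this sketch), a non-GPU op inside a torch.compile'd region is the kind of thing that forces inductor to split CUDA-graph capture around it:

import torch
import torch.nn.functional as F

def toy_attention(q, k, v):
    # GPU work: the same SDPA call that appears in the frames above.
    out = F.scaled_dot_product_attention(q, k, v)
    # Hypothetical non-GPU op, added purely for illustration: pulling a scalar
    # back to the CPU in the middle of the region cannot live inside a single
    # CUDA graph, so the compiled region gets partitioned around it.
    checksum = out.float().mean().cpu()
    # Same layout fix-up as the sdpa_attention.py:91 frame above.
    return out.transpose(1, 2).contiguous(), checksum

if torch.cuda.is_available():
    q = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    # mode="reduce-overhead" is the documented way to ask torch.compile to use
    # CUDA graphs; whether the exact "cudagraph partition due to non gpu ops"
    # line is printed depends on the PyTorch build and inductor settings.
    compiled = torch.compile(toy_attention, mode="reduce-overhead")
    for _ in range(3):  # graph capture only kicks in after warm-up runs
        compiled(q, k, v)

Treat this as a sketch of the failure mode, not a guaranteed reproducer of this job's exact output.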
Found from : 2025-08-14T21:37:06.5695301Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5695651Z return mod(**inputs) 2025-08-14T21:37:06.5696035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5696446Z outputs = self.model( 2025-08-14T21:37:06.5696832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5697242Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5697655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5698070Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5698444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5698831Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5699242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5699694Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5700109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5700544Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5701003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5701586Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5701780Z 2025-08-14T21:37:06.5701894Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5702282Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5702830Z return mod(**inputs) 2025-08-14T21:37:06.5703228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5703639Z outputs = self.model( 2025-08-14T21:37:06.5704034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5704456Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5704861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5705292Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5705679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5706078Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5706489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5706932Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5707376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5707804Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5708282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5708769Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5708943Z 2025-08-14T21:37:06.5709039Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5709263Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5709492Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5709718Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5709934Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5710155Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5710374Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5710596Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5710812Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5711033Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5711264Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5711471Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5711716Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5712093Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5712436Z return mod(**inputs) 2025-08-14T21:37:06.5712814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5713221Z outputs = self.model( 2025-08-14T21:37:06.5713607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5714014Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5714425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5714841Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5715217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5715599Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5716003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5716605Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5717038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5717463Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5717928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5718428Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5718621Z 2025-08-14T21:37:06.5718734Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5719116Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5719467Z return mod(**inputs) 2025-08-14T21:37:06.5719850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5720267Z outputs = self.model( 2025-08-14T21:37:06.5720656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5721073Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5721475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5721897Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5722278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5722671Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5723089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5723541Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5723991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5724419Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5724893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5725382Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5725552Z 2025-08-14T21:37:06.5725644Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5725868Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5726094Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5726317Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5726532Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5726756Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5726979Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5727197Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5727421Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5727643Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5727867Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5728081Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5728302Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5728520Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5728734Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5729029Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5729285Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5729672Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5730030Z return mod(**inputs) 2025-08-14T21:37:06.5730425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5730919Z outputs = self.model( 2025-08-14T21:37:06.5731366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5731783Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5732192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5732612Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5732997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5733396Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5733815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5734248Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5734695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5735151Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5735627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5736142Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5736345Z 2025-08-14T21:37:06.5736459Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5736852Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5737200Z return mod(**inputs) 2025-08-14T21:37:06.5737604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5738009Z outputs = self.model( 2025-08-14T21:37:06.5738387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5738797Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5739194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5739601Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5739960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5740342Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5740749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5741175Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5741587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5742013Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5742484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5742957Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5743123Z 2025-08-14T21:37:06.5743207Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5743431Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5743653Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5743863Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5744083Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5744307Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5744522Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5744746Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5744976Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5745185Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5745401Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5745674Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5745951Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5746329Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5746668Z return mod(**inputs) 2025-08-14T21:37:06.5747046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5747444Z outputs = self.model( 2025-08-14T21:37:06.5747824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5748243Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5748638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5749036Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5749408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5749792Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5750186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5750622Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5751052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5751477Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5751938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5752455Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5752661Z 2025-08-14T21:37:06.5752776Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5753173Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5753521Z return mod(**inputs) 2025-08-14T21:37:06.5753918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5754319Z outputs = self.model( 2025-08-14T21:37:06.5754690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5755099Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5755504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5755922Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5756292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5756692Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5757113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5757563Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5757998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5758431Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5758902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5759382Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5759562Z 2025-08-14T21:37:06.5759648Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5759878Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5760107Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5760381Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5760636Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5760863Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5761079Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5761303Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5761523Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5761737Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5761958Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5762181Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5762395Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5762616Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5762837Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5763062Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5763306Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5763695Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5764056Z return mod(**inputs) 2025-08-14T21:37:06.5764443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5764855Z outputs = self.model( 2025-08-14T21:37:06.5765243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5765672Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5766077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5766500Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5766883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5767270Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5767693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5768144Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5768588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5769123Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5769607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5770124Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5770322Z 2025-08-14T21:37:06.5770445Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5770826Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5771179Z return mod(**inputs) 2025-08-14T21:37:06.5771578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5771986Z outputs = self.model( 2025-08-14T21:37:06.5772402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5772820Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5773237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5773657Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5774041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5774442Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5774835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5775262Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5775782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5776209Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5776672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5777152Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5777318Z 2025-08-14T21:37:06.5777412Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5777642Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5777856Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5778074Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5778293Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5778503Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5778718Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5778939Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5779152Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5779369Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5779590Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5779803Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5780066Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5780447Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5780788Z return mod(**inputs) 2025-08-14T21:37:06.5781162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5781563Z outputs = self.model( 2025-08-14T21:37:06.5781940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5782335Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5782736Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5783143Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5783511Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5783886Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5784291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5784727Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5785157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5785582Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5786047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5786553Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5786743Z 2025-08-14T21:37:06.5786853Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5787232Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5787572Z return mod(**inputs) 2025-08-14T21:37:06.5787950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5788353Z outputs = self.model( 2025-08-14T21:37:06.5788730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5789147Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5789532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5789989Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5790393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5790776Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5791176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5791616Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5792045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5792472Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5792931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5793410Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5793579Z 2025-08-14T21:37:06.5793674Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5793891Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5794115Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5794336Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5794553Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5794762Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5794975Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5795190Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5795400Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5795615Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5795828Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5796035Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5796249Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5796464Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5796669Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5796890Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5797138Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5797516Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5797852Z return mod(**inputs) 2025-08-14T21:37:06.5798232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5798635Z outputs = self.model( 2025-08-14T21:37:06.5799007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5799411Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5799807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5800209Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5800574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5800958Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5801366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5801793Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5802233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5802868Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5803372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5803874Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5804085Z 2025-08-14T21:37:06.5804200Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5804751Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5805114Z return mod(**inputs) 2025-08-14T21:37:06.5805496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5805911Z outputs = self.model( 2025-08-14T21:37:06.5806300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5806710Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5807124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5807549Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5807926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5808318Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5808739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5809244Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5809681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5810116Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5810592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5811085Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5811259Z 2025-08-14T21:37:06.5811348Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5811585Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5811814Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5812049Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5812271Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5812499Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5812720Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5812936Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5813156Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5813373Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5813588Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5813806Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5814055Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5814436Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5814783Z return mod(**inputs) 2025-08-14T21:37:06.5815170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5815582Z outputs = self.model( 2025-08-14T21:37:06.5815967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5816384Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5816794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5817204Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5817589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5817988Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5818389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5818816Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5819254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5819767Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5820246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5820751Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5820954Z 2025-08-14T21:37:06.5821067Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5821460Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5821812Z return mod(**inputs) 2025-08-14T21:37:06.5822187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5822593Z outputs = self.model( 2025-08-14T21:37:06.5822986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5823402Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5823811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5824232Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5824589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5824983Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5825385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5825815Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5826234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5826671Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5827156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5827655Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5827822Z 2025-08-14T21:37:06.5827905Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5828128Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5828347Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5828556Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5828774Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5828990Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5829196Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5829412Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5829625Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5829839Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5830046Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5830263Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5830479Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5830684Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5830895Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5831107Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5831342Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5831720Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5832065Z return mod(**inputs) 2025-08-14T21:37:06.5832454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5832857Z outputs = self.model( 2025-08-14T21:37:06.5833247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5833662Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5834133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5834541Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5834911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5835294Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5835693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5836126Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5836557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5836978Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5837448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5837959Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5838154Z 2025-08-14T21:37:06.5838271Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5838647Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5838726Z return mod(**inputs) 2025-08-14T21:37:06.5838990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5839073Z outputs = self.model( 2025-08-14T21:37:06.5839335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5839413Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5839680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5839762Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5840000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5840095Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5840356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5840466Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5840724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5840825Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5841132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5841244Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5841250Z 2025-08-14T21:37:06.5841345Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841426Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841507Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841594Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841673Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841750Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841836Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841913Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5841994Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5842079Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5842157Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5842244Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5842353Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:37:06.5842564Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:37:06.5842696Z     return mod(**inputs)
2025-08-14T21:37:06.5843004Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
2025-08-14T21:37:06.5843080Z     outputs = self.model(
2025-08-14T21:37:06.5843367Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
2025-08-14T21:37:06.5843444Z     decoder_outputs = self.decoder(
2025-08-14T21:37:06.5843713Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:37:06.5843790Z     layer_outputs = decoder_layer(
2025-08-14T21:37:06.5844022Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:37:06.5844113Z     return super().__call__(*args, **kwargs)
2025-08-14T21:37:06.5844370Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward
2025-08-14T21:37:06.5844491Z     hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:37:06.5844764Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:37:06.5844868Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:37:06.5845187Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:37:06.5845325Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:37:06.5845329Z 
2025-08-14T21:37:06.5845442Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:37:06.5845665Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:37:06.5845738Z     return mod(**inputs)
2025-08-14T21:37:06.5846013Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
2025-08-14T21:37:06.5846092Z     outputs = self.model(
2025-08-14T21:37:06.5846360Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
2025-08-14T21:37:06.5846447Z     decoder_outputs = self.decoder(
2025-08-14T21:37:06.5846716Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:37:06.5846795Z     layer_outputs = decoder_layer(
2025-08-14T21:37:06.5847041Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:37:06.5847128Z     return super().__call__(*args, **kwargs)
2025-08-14T21:37:06.5847400Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward
2025-08-14T21:37:06.5847517Z     hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:37:06.5847787Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:37:06.5847898Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:37:06.5848209Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:37:06.5848331Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:37:06.5848335Z 
2025-08-14T21:37:06.5848419Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5848503Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5848591Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5848671Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5848751Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5848919Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849007Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849149Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849277Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849362Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849453Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849534Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849617Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849707Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849788Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849871Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5849991Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:37:06.5850207Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:37:06.5850279Z     return mod(**inputs)
2025-08-14T21:37:06.5850562Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
2025-08-14T21:37:06.5850640Z     outputs = self.model(
2025-08-14T21:37:06.5850922Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
2025-08-14T21:37:06.5851002Z     decoder_outputs = self.decoder(
2025-08-14T21:37:06.5851284Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:37:06.5851370Z     layer_outputs = decoder_layer(
2025-08-14T21:37:06.5851608Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:37:06.5851696Z     return super().__call__(*args, **kwargs)
2025-08-14T21:37:06.5851972Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward
2025-08-14T21:37:06.5852081Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:37:06.5852353Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:37:06.5852463Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:37:06.5852779Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:37:06.5852924Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:37:06.5852929Z 
2025-08-14T21:37:06.5853040Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:37:06.5853264Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:37:06.5853336Z     return mod(**inputs)
2025-08-14T21:37:06.5853615Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
2025-08-14T21:37:06.5853696Z     outputs = self.model(
2025-08-14T21:37:06.5853976Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
2025-08-14T21:37:06.5854062Z     decoder_outputs = self.decoder(
2025-08-14T21:37:06.5854352Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-08-14T21:37:06.5854429Z     layer_outputs = decoder_layer(
2025-08-14T21:37:06.5854680Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:37:06.5854766Z     return super().__call__(*args, **kwargs)
2025-08-14T21:37:06.5855031Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward
2025-08-14T21:37:06.5855147Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:37:06.5855414Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-08-14T21:37:06.5855523Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:37:06.5855947Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:37:06.5856065Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:37:06.5856069Z 
2025-08-14T21:37:06.5856161Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856251Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856327Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856409Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856484Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856567Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856640Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856713Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856794Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856869Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5856948Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5857036Z cudagraph partition due to non gpu ops
2025-08-14T21:37:06.5857147Z cudagraph partition due to non gpu ops.
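The four traceback variants above all bottom out in the same two lines of transformers' SDPA integration: the call to torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py line 81) and the transpose(1, 2).contiguous() that follows it (line 91), reached through either self_attn or encoder_attn of the BART decoder layers. A minimal, self-contained sketch of that pattern is below; the tensor shapes and the standalone setting are illustrative assumptions, not values taken from this benchmark job.

import torch
import torch.nn.functional as F

# Illustrative shapes (assumption): batch 2, 12 heads, sequence 128, head_dim 64.
batch, heads, seq, head_dim = 2, 12, 128, 64
query = torch.randn(batch, heads, seq, head_dim)
key = torch.randn(batch, heads, seq, head_dim)
value = torch.randn(batch, heads, seq, head_dim)

# Corresponds to sdpa_attention.py line 81 in the tracebacks: the fused attention call.
attn_output = F.scaled_dot_product_attention(query, key, value)

# Corresponds to sdpa_attention.py line 91: swap the heads and sequence axes back to
# (batch, seq, heads, head_dim) and materialize a contiguous tensor.
attn_output = attn_output.transpose(1, 2).contiguous()
print(attn_output.shape)  # torch.Size([2, 128, 12, 64])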
Found from : 2025-08-14T21:37:06.5857364Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5857444Z return mod(**inputs) 2025-08-14T21:37:06.5857709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5857789Z outputs = self.model( 2025-08-14T21:37:06.5858065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5858142Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5858417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5858493Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5858735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5858829Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5859090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5859210Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5859519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5859617Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5859938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5860072Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5860076Z 2025-08-14T21:37:06.5860189Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5860405Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5860472Z return mod(**inputs) 2025-08-14T21:37:06.5860741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5860822Z outputs = self.model( 2025-08-14T21:37:06.5861222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5861308Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5861556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5861636Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5861855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5861935Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5862279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5862395Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5862668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5862770Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5863081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5863204Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5863208Z 2025-08-14T21:37:06.5863296Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863382Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863474Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863560Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863652Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863732Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863814Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863903Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5863990Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864069Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864153Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864233Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864310Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864396Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864474Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864559Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5864668Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5864880Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5864963Z return mod(**inputs) 2025-08-14T21:37:06.5865229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5865301Z outputs = self.model( 2025-08-14T21:37:06.5865573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5865651Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5865922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5866002Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5866243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5866337Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5866603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5866718Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5866993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5867095Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5867410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5867548Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5867552Z 2025-08-14T21:37:06.5867663Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5867896Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5867967Z return mod(**inputs) 2025-08-14T21:37:06.5868288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5868394Z outputs = self.model( 2025-08-14T21:37:06.5868654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5868737Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5869000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5869076Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5869320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5869406Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5869678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5869784Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5870057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5870168Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5870479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5870594Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5870605Z 2025-08-14T21:37:06.5870688Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5870771Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5870863Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5870956Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871037Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871124Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871203Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871285Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871373Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871453Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871538Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871617Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5871726Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5871944Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5872013Z return mod(**inputs) 2025-08-14T21:37:06.5872278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5872358Z outputs = self.model( 2025-08-14T21:37:06.5872619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5872696Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5872973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5873050Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5873294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5873377Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5873639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5873761Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5874027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5874132Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5874450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5874655Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5874659Z 2025-08-14T21:37:06.5874779Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5874994Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5875066Z return mod(**inputs) 2025-08-14T21:37:06.5875351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5875425Z outputs = self.model( 2025-08-14T21:37:06.5875722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5875799Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5876083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5876170Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5876426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5876517Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5876784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5876897Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5877177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5877278Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5877598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5877723Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5877730Z 2025-08-14T21:37:06.5877813Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5877918Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5877999Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878077Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878165Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878242Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878320Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878405Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878485Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878575Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878656Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878737Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878826Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878907Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5878987Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5879079Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5879196Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5879413Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5879492Z return mod(**inputs) 2025-08-14T21:37:06.5879771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5879853Z outputs = self.model( 2025-08-14T21:37:06.5880123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5880203Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5880482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5880561Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5880878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5880971Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5881241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5881352Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5881621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5881719Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5882041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5882175Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5882178Z 2025-08-14T21:37:06.5882293Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5882512Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5882584Z return mod(**inputs) 2025-08-14T21:37:06.5882862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5882938Z outputs = self.model( 2025-08-14T21:37:06.5883202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5883287Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5883554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5883640Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5883879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5883968Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5884242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5884346Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5884612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5884723Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5885034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5885157Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5885161Z 2025-08-14T21:37:06.5885243Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885327Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885417Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885500Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885592Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885674Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885754Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885841Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5885920Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5886000Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5886085Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5886165Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5886279Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5886503Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5886572Z return mod(**inputs) 2025-08-14T21:37:06.5886848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5886958Z outputs = self.model( 2025-08-14T21:37:06.5887277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5887365Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5887635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5887715Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5887959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5888046Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5888316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5888433Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5888698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5888884Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5889207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5889353Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5889360Z 2025-08-14T21:37:06.5889472Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5889690Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5889769Z return mod(**inputs) 2025-08-14T21:37:06.5890042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5890115Z outputs = self.model( 2025-08-14T21:37:06.5890400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5890486Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5890762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5890840Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5891080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5891174Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5891441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5891559Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5891834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5891937Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5892264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5892380Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5892384Z 2025-08-14T21:37:06.5892468Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892559Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892640Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892729Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892813Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892893Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5892983Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893064Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893145Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893234Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893315Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893441Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893566Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893648Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893729Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893816Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5893929Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5894155Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5894228Z return mod(**inputs) 2025-08-14T21:37:06.5894498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5894584Z outputs = self.model( 2025-08-14T21:37:06.5894862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5894943Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5895240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5895319Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5895566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5895651Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5895916Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5896036Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5896300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5896408Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5896749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5896895Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5896899Z 2025-08-14T21:37:06.5897018Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5897233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5897303Z return mod(**inputs) 2025-08-14T21:37:06.5897587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5897661Z outputs = self.model( 2025-08-14T21:37:06.5897990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5898071Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5898339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5898430Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5898680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5898773Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5899036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5899142Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5899413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5899515Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5899828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5899954Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5900000Z 2025-08-14T21:37:06.5900087Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900212Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900296Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900380Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900469Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900549Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900629Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900718Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900800Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900878Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5900966Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5901046Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5901167Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5901382Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5901457Z return mod(**inputs) 2025-08-14T21:37:06.5901788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5901862Z outputs = self.model( 2025-08-14T21:37:06.5902141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5902226Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5902505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5902590Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5903012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5903102Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5903381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5903518Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5903790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5903891Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5904199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5904344Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5904348Z 2025-08-14T21:37:06.5904458Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5904677Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5904757Z return mod(**inputs) 2025-08-14T21:37:06.5905039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5905126Z outputs = self.model( 2025-08-14T21:37:06.5905392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5905471Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5905743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5905821Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5906055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5906147Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5906410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5906535Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5906937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5907040Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5907351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5907463Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5907467Z 2025-08-14T21:37:06.5907557Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5907639Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5907719Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5907808Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5907888Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5907968Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908056Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908139Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908222Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908311Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908389Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908476Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908553Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908633Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908722Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908804Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5908917Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5909144Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5909216Z return mod(**inputs) 2025-08-14T21:37:06.5909492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5909571Z outputs = self.model( 2025-08-14T21:37:06.5909839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5909926Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5910192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5910273Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5910525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5910613Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5910876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5910978Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5911234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5911345Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5911644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5911778Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5911790Z 2025-08-14T21:37:06.5911896Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5912104Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5912179Z return mod(**inputs) 2025-08-14T21:37:06.5912441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5912515Z outputs = self.model( 2025-08-14T21:37:06.5912780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5912934Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5913212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5913290Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5913529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5913622Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5913890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5914003Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5914272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5914371Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5914695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5914811Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5914815Z 2025-08-14T21:37:06.5914899Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5914989Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915069Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915151Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915240Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915321Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915407Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915490Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915569Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915659Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915738Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915821Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5915942Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5916161Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5916232Z return mod(**inputs) 2025-08-14T21:37:06.5916518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5916589Z outputs = self.model( 2025-08-14T21:37:06.5916860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5916935Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5917198Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5917283Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5917520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5917613Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5917872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5917985Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5918250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5918350Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5918651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5918792Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5918795Z 2025-08-14T21:37:06.5918904Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5919187Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5919260Z return mod(**inputs) 2025-08-14T21:37:06.5919519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5919601Z outputs = self.model( 2025-08-14T21:37:06.5919868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5919955Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5920224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5920304Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5920548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5920634Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5920908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5921033Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5921296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5921416Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5921715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5921826Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5921830Z 2025-08-14T21:37:06.5921918Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5921998Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922077Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922168Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922249Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922334Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922413Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922491Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922575Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922752Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922832Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5922914Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5923003Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5923083Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5923162Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5923253Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5923365Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5923576Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5923655Z return mod(**inputs) 2025-08-14T21:37:06.5923923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5924004Z outputs = self.model( 2025-08-14T21:37:06.5924272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5924350Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5924627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5924706Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5924943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5925039Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5925305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5925548Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5925816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5925919Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5926243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5926381Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5926385Z 2025-08-14T21:37:06.5926503Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5926718Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5926791Z return mod(**inputs) 2025-08-14T21:37:06.5927068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5927149Z outputs = self.model( 2025-08-14T21:37:06.5927417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5927506Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5927771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5927861Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5928099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5928186Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5928462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5928569Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5928916Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5929027Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5929341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5929467Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5929471Z 2025-08-14T21:37:06.5929557Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5929641Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5929734Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5929815Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5929905Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5929987Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930067Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930160Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930246Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930328Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930418Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930499Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5930613Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5930841Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5930912Z return mod(**inputs) 2025-08-14T21:37:06.5931196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5931270Z outputs = self.model( 2025-08-14T21:37:06.5931539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5931624Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5931973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5932053Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5932299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5932384Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5932655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5932771Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5933035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5933143Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5933452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5933603Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5933607Z 2025-08-14T21:37:06.5933720Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5933934Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5934014Z return mod(**inputs) 2025-08-14T21:37:06.5934283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5934358Z outputs = self.model( 2025-08-14T21:37:06.5934632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5934711Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5934985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5935068Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5935312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5935407Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5935675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5935801Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5936068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5936170Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5936486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5936602Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5936609Z 2025-08-14T21:37:06.5936693Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5936787Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5936871Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5936962Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937043Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937123Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937214Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937295Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937376Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937467Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937548Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937629Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937718Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937798Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5937886Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5938017Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5938163Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5938388Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5938460Z return mod(**inputs) 2025-08-14T21:37:06.5938728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5938810Z outputs = self.model( 2025-08-14T21:37:06.5939076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5939162Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5939426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5939505Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5939755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5939842Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5940115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5940226Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5940480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5940585Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5940881Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5941013Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5941017Z 2025-08-14T21:37:06.5941131Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5941345Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5941422Z return mod(**inputs) 2025-08-14T21:37:06.5941683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5941757Z outputs = self.model( 2025-08-14T21:37:06.5942022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5942101Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5942358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5942442Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5942671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5942762Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5943022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-08-14T21:37:06.5943124Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:37:06.5943387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5943485Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5943787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5943906Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5943910Z 2025-08-14T21:37:06.5943990Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944079Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944157Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944272Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944387Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944467Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944545Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944631Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944709Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944794Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944873Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5944952Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5945069Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5945277Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5945345Z return mod(**inputs) 2025-08-14T21:37:06.5945611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5945687Z outputs = self.model( 2025-08-14T21:37:06.5945948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5946032Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5946291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5946373Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5946603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5946687Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5946955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5947070Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5947333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5947438Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5947739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:37:06.5947882Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:06.5947886Z 2025-08-14T21:37:06.5947995Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:06.5948212Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5948281Z return mod(**inputs) 2025-08-14T21:37:06.5948540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-08-14T21:37:06.5948618Z outputs = self.model( 2025-08-14T21:37:06.5948876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-08-14T21:37:06.5948961Z decoder_outputs = self.decoder( 2025-08-14T21:37:06.5949230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-08-14T21:37:06.5949307Z layer_outputs = decoder_layer( 2025-08-14T21:37:06.5949545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:06.5949629Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:06.5949885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-08-14T21:37:06.5950006Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:37:06.5950261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-08-14T21:37:06.5950362Z attn_output, attn_weights = attention_interface( 2025-08-14T21:37:06.5950737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:37:06.5950852Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:37:06.5950856Z 2025-08-14T21:37:06.5950945Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951026Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951105Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951193Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951271Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951351Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951437Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951514Z cudagraph partition due to non gpu ops 2025-08-14T21:37:06.5951631Z cudagraph partition due to non gpu ops. 
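The BART traces above all bottom out in transformers' SDPA attention path: the projected query/key/value tensors are handed to torch.nn.functional.scaled_dot_product_attention, and the result is transposed back to (batch, seq, heads, head_dim) and made contiguous, which is the exact line the partition messages point at. Below is a minimal sketch of that call pattern; the function name, shapes, and smoke test are illustrative assumptions, not the transformers implementation itself.

```python
import torch
import torch.nn.functional as F

def sdpa_attention_sketch(query, key, value, attn_mask=None, dropout_p=0.0, is_causal=False):
    # query/key/value: (batch, num_heads, seq_len, head_dim), as in the traceback above.
    attn_output = F.scaled_dot_product_attention(
        query, key, value,
        attn_mask=attn_mask,
        dropout_p=dropout_p,
        is_causal=is_causal,
    )
    # The line the log keeps pointing at: move the head dimension back next to
    # head_dim and force a contiguous layout, giving (batch, seq_len, num_heads, head_dim).
    return attn_output.transpose(1, 2).contiguous()

# Hypothetical shapes for a quick smoke test.
q = k = v = torch.randn(2, 8, 16, 64)
out = sdpa_attention_sketch(q, k, v)
assert out.shape == (2, 16, 8, 64)
```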
Found from : 2025-08-14T21:37:06.5951841Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:06.5951918Z return mod(**inputs) 2025-08-14T21:37:06.5952188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1497, in forward 2025-08-14T21:37:06.5952365Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1)) 2025-08-14T21:37:06.5952370Z 2025-08-14T21:37:20.1249276Z Compilation time (from dynamo_timed): 36.922505848 2025-08-14T21:37:20.1462310Z pass 2025-08-14T21:37:20.1462768Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:20.1463646Z TIMING: _recursive_pre_grad_passes:0.09766 _recursive_joint_graph_passes:0.91811 _recursive_post_grad_passes:0.19354 async_compile.wait:1.08219 code_gen:12.58757 inductor_compile:15.87733 backend_compile:30.43239 gc:0.00085 entire_frame_compile:36.92251 total_wall_time:36.92251 2025-08-14T21:37:20.1464688Z STATS: call_* op count: 982 | FakeTensorMode.__torch_dispatch__:70183 | FakeTensor.__torch_dispatch__:8243 | ProxyTorchDispatchMode.__torch_dispatch__:19216 2025-08-14T21:37:20.1465291Z Dynamo produced 1 graphs covering 982 ops with 0 graph breaks (0 unique) 2025-08-14T21:37:26.6693971Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:37:26.6694936Z from pkg_resources import resource_filename 2025-08-14T21:37:27.2868377Z 2025-08-14T21:37:28.7267690Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:37:28.7278081Z loading model: 0it [00:01, ?it/s] 2025-08-14T21:37:28.7278464Z cpu eval BertForMaskedLM 2025-08-14T21:37:29.2024459Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:29.3318097Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:29.4635960Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:40.7269301Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7269688Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7269935Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7270178Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7270467Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7270700Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7270928Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7271162Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7271390Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7271625Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7271856Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7272082Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7272789Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7273179Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7273415Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7273644Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7274073Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7274333Z cudagraph partition due to 
non gpu ops 2025-08-14T21:37:40.7274665Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7275030Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:37:40.7275482Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:40.7275864Z return mod(**inputs) 2025-08-14T21:37:40.7276300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1308, in forward 2025-08-14T21:37:40.7276741Z outputs = self.bert( 2025-08-14T21:37:40.7277160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward 2025-08-14T21:37:40.7277612Z encoder_outputs = self.encoder( 2025-08-14T21:37:40.7278033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward 2025-08-14T21:37:40.7278453Z layer_outputs = layer_module( 2025-08-14T21:37:40.7278851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:40.7279255Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:40.7279683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward 2025-08-14T21:37:40.7280122Z self_attention_outputs = self.attention( 2025-08-14T21:37:40.7280549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7280960Z return func(*args, **kwargs) 2025-08-14T21:37:40.7281371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward 2025-08-14T21:37:40.7281811Z self_outputs = self.self( 2025-08-14T21:37:40.7282222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7282623Z return func(*args, **kwargs) 2025-08-14T21:37:40.7283018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward 2025-08-14T21:37:40.7283508Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:40.7283717Z 2025-08-14T21:37:40.7283812Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7284044Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7284262Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7284486Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7284709Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7284928Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7285152Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7285377Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7285591Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7285814Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7286031Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7286243Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7286512Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7286761Z cudagraph partition due to non gpu ops. 
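The harness reports "Compilation time (from dynamo_timed): 36.92" and "Dynamo produced 1 graphs covering 982 ops with 0 graph breaks", i.e. the whole forward pass compiled as a single graph and the cost is paid on the first call. A rough way to observe the same effect outside the harness is to time the first compiled call against a later one; the toy model below is an assumption for illustration and is not the benchmark's own timing code.

```python
import time
import torch

# Toy stand-in for the benchmarked HuggingFace model (assumption for illustration).
model = torch.nn.Sequential(
    torch.nn.Linear(64, 64), torch.nn.ReLU(), torch.nn.Linear(64, 64)
).eval()
compiled = torch.compile(model)  # Inductor backend by default

x = torch.randn(8, 64)
with torch.no_grad():
    t0 = time.perf_counter()
    compiled(x)                      # first call triggers Dynamo tracing + Inductor compile
    first_call = time.perf_counter() - t0

    t0 = time.perf_counter()
    compiled(x)                      # later calls reuse the compiled graph
    steady_state = time.perf_counter() - t0

print(f"first call (compile + run): {first_call:.3f}s, steady state: {steady_state:.6f}s")
```

The gap between the two timings corresponds roughly to what the log attributes to backend_compile and inductor_compile in the TIMING breakdown.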
Found from : 2025-08-14T21:37:40.7287164Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:40.7287534Z return mod(**inputs) 2025-08-14T21:37:40.7287933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1308, in forward 2025-08-14T21:37:40.7288346Z outputs = self.bert( 2025-08-14T21:37:40.7290136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward 2025-08-14T21:37:40.7290558Z encoder_outputs = self.encoder( 2025-08-14T21:37:40.7290971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward 2025-08-14T21:37:40.7291393Z layer_outputs = layer_module( 2025-08-14T21:37:40.7291779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:40.7292182Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:40.7292597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward 2025-08-14T21:37:40.7293028Z self_attention_outputs = self.attention( 2025-08-14T21:37:40.7293455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7293864Z return func(*args, **kwargs) 2025-08-14T21:37:40.7294282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward 2025-08-14T21:37:40.7294699Z self_outputs = self.self( 2025-08-14T21:37:40.7295102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7295552Z return func(*args, **kwargs) 2025-08-14T21:37:40.7295996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward 2025-08-14T21:37:40.7296478Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:40.7296678Z 2025-08-14T21:37:40.7296764Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7296993Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7297220Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7297442Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7297659Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7297885Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7298104Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7298315Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7298534Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7298753Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7298964Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7299185Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7299404Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7299651Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:37:40.7402393Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:40.7402985Z return mod(**inputs) 2025-08-14T21:37:40.7403375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1308, in forward 2025-08-14T21:37:40.7403786Z outputs = self.bert( 2025-08-14T21:37:40.7404171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward 2025-08-14T21:37:40.7404596Z encoder_outputs = self.encoder( 2025-08-14T21:37:40.7405004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward 2025-08-14T21:37:40.7405418Z layer_outputs = layer_module( 2025-08-14T21:37:40.7405796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:37:40.7406182Z return super().__call__(*args, **kwargs) 2025-08-14T21:37:40.7406603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward 2025-08-14T21:37:40.7407026Z self_attention_outputs = self.attention( 2025-08-14T21:37:40.7407439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7407845Z return func(*args, **kwargs) 2025-08-14T21:37:40.7408240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward 2025-08-14T21:37:40.7408834Z self_outputs = self.self( 2025-08-14T21:37:40.7409284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:37:40.7409690Z return func(*args, **kwargs) 2025-08-14T21:37:40.7410092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward 2025-08-14T21:37:40.7410569Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:37:40.7410765Z 2025-08-14T21:37:40.7410854Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7411086Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7411315Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7411530Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7411756Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7411975Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7412199Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7412416Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7412640Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7412865Z cudagraph partition due to non gpu ops 2025-08-14T21:37:40.7413115Z cudagraph partition due to non gpu ops. 
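All of the "cudagraph partition due to non gpu ops" lines come from Inductor's CUDA-graph partitioning, and this job runs the models on CPU ("cpu eval BertForMaskedLM"), so there is no GPU work for a CUDA graph to capture; the partitioner records that it split around non-GPU ops and the benchmark still reports "pass". One common way CUDA graphs get requested is torch.compile's reduce-overhead mode; whether this job enables them that way or through an equivalent config is an assumption. A minimal CPU-only sketch of such a setup:

```python
import torch

model = torch.nn.Linear(32, 32).eval()   # stand-in model (assumption)
x = torch.randn(4, 32)                   # CPU tensor, as in this cpu_inductor job

# "reduce-overhead" asks Inductor to use CUDA graphs where it can. With CPU inputs
# there is no GPU work to capture, so CUDA graphs simply do not apply; the log lines
# above record Inductor partitioning around such non-GPU ops rather than failing.
compiled = torch.compile(model, mode="reduce-overhead")
with torch.no_grad():
    out = compiled(x)
print(out.shape)
```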
Found from : 2025-08-14T21:37:40.7413510Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:37:40.7413864Z return mod(**inputs) 2025-08-14T21:37:40.7414250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1328, in forward 2025-08-14T21:37:40.7414793Z masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1)) 2025-08-14T21:37:40.7415054Z 2025-08-14T21:37:49.9383242Z Compilation time (from dynamo_timed): 19.061421825 2025-08-14T21:37:49.9419456Z pass 2025-08-14T21:37:49.9419836Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:49.9420789Z TIMING: _recursive_pre_grad_passes:0.03687 _recursive_joint_graph_passes:0.43347 _recursive_post_grad_passes:0.08622 async_compile.wait:0.95424 code_gen:8.87714 inductor_compile:10.49561 backend_compile:16.04012 gc:0.00148 entire_frame_compile:19.06142 total_wall_time:19.06142 2025-08-14T21:37:49.9421803Z STATS: call_* op count: 291 | FakeTensorMode.__torch_dispatch__:26890 | FakeTensor.__torch_dispatch__:3240 | ProxyTorchDispatchMode.__torch_dispatch__:7211 2025-08-14T21:37:49.9423696Z Dynamo produced 1 graphs covering 291 ops with 0 graph breaks (0 unique) 2025-08-14T21:37:55.7341704Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:37:55.7342719Z from pkg_resources import resource_filename 2025-08-14T21:37:56.3395246Z 2025-08-14T21:37:57.6618827Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:37:57.6620415Z loading model: 0it [00:01, ?it/s] 2025-08-14T21:37:57.6621439Z cpu eval BertForQuestionAnswering 2025-08-14T21:37:58.0890385Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:58.2032593Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:37:58.3163848Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:38:09.3229126Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3229484Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3229710Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3229925Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3230150Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3230367Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3231025Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3231352Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3231575Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3231784Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3232004Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3232221Z cudagraph partition due to non gpu ops 2025-08-14T21:38:09.3232476Z cudagraph partition due to non gpu ops. 
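Both loss lines captured in the traces above (BartForConditionalGeneration's masked_lm_loss at modeling_bart.py:1497 and BertForMaskedLM's at modeling_bert.py:1328) flatten the logits to (batch*seq, vocab_size) and the labels to (batch*seq,) before applying cross-entropy. A sketch of that reshape-then-loss pattern, with illustrative sizes assumed:

```python
import torch
import torch.nn as nn

batch, seq_len, vocab_size = 2, 16, 30522        # illustrative sizes (assumption)
prediction_scores = torch.randn(batch, seq_len, vocab_size)
labels = torch.randint(0, vocab_size, (batch, seq_len))
labels[:, :4] = -100                             # -100 marks positions the loss ignores

loss_fct = nn.CrossEntropyLoss()                 # ignore_index defaults to -100
masked_lm_loss = loss_fct(
    prediction_scores.view(-1, vocab_size),      # (batch*seq, vocab_size)
    labels.view(-1),                             # (batch*seq,)
)
print(masked_lm_loss.item())
```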
Found from :
2025-08-14T21:38:09.3232904Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:09.3233288Z return mod(**inputs)
2025-08-14T21:38:09.3233728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1781, in forward
2025-08-14T21:38:09.3234172Z logits = self.qa_outputs(sequence_output)
2025-08-14T21:38:09.3234330Z
2025-08-14T21:38:09.3234417Z cudagraph partition due to non gpu ops
2025-08-14T21:38:09.3236072Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:38:09.3236474Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:09.3236834Z return mod(**inputs)
2025-08-14T21:38:09.3237233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1767, in forward
2025-08-14T21:38:09.3237654Z outputs = self.bert(
2025-08-14T21:38:09.3238042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward
2025-08-14T21:38:09.3238474Z encoder_outputs = self.encoder(
2025-08-14T21:38:09.3238906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward
2025-08-14T21:38:09.3239333Z layer_outputs = layer_module(
2025-08-14T21:38:09.3239721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:38:09.3240128Z return super().__call__(*args, **kwargs)
2025-08-14T21:38:09.3240558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward
2025-08-14T21:38:09.3240992Z self_attention_outputs = self.attention(
2025-08-14T21:38:09.3241415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:38:09.3241835Z return func(*args, **kwargs)
2025-08-14T21:38:09.3242270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward
2025-08-14T21:38:09.3242689Z self_outputs = self.self(
2025-08-14T21:38:09.3243089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:38:09.3243496Z return func(*args, **kwargs)
2025-08-14T21:38:09.3243900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward
2025-08-14T21:38:09.3244459Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:38:09.3244683Z
2025-08-14T21:38:09.3244771Z cudagraph partition due to non gpu ops
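For reference, the frame these partition messages keep pointing at (modeling_bert.py line 438) is a plain call to PyTorch's fused attention kernel. Below is a minimal, self-contained sketch of that call; the tensor shapes are illustrative assumptions, not the benchmark's actual sizes.

# Minimal sketch of the scaled_dot_product_attention call seen in the trace above.
# Shapes are illustrative assumptions, not the benchmark's actual sizes.
import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim = 2, 12, 128, 64
query = torch.randn(batch, heads, seq_len, head_dim)
key = torch.randn(batch, heads, seq_len, head_dim)
value = torch.randn(batch, heads, seq_len, head_dim)

# Fuses softmax(Q @ K^T / sqrt(head_dim)) @ V into one kernel.
attn_output = F.scaled_dot_product_attention(
    query, key, value, attn_mask=None, dropout_p=0.0, is_causal=False
)
print(attn_output.shape)  # torch.Size([2, 12, 128, 64])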
2025-08-14T21:38:09.3371957Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:38:09.3372356Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:09.3372709Z return mod(**inputs)
2025-08-14T21:38:09.3373095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1799, in forward
2025-08-14T21:38:09.3373552Z start_loss = loss_fct(start_logits, start_positions)
2025-08-14T21:38:09.3373720Z
2025-08-14T21:38:09.3373827Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:38:09.3374199Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:09.3374543Z return mod(**inputs)
2025-08-14T21:38:09.3374919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1800, in forward
2025-08-14T21:38:09.3375354Z end_loss = loss_fct(end_logits, end_positions)
2025-08-14T21:38:09.3375509Z
2025-08-14T21:38:18.0109278Z Compilation time (from dynamo_timed): 18.408644471
2025-08-14T21:38:18.0109628Z pass
2025-08-14T21:38:18.0109997Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:38:18.0110841Z TIMING: _recursive_pre_grad_passes:0.03771 _recursive_joint_graph_passes:0.41839 _recursive_post_grad_passes:0.08939 async_compile.wait:0.712 code_gen:8.47146 inductor_compile:10.05671 backend_compile:15.39252 gc:0.00046 entire_frame_compile:18.40864 total_wall_time:18.40864
2025-08-14T21:38:18.0111841Z STATS: call_* op count: 298 | FakeTensorMode.__torch_dispatch__:26749 | FakeTensor.__torch_dispatch__:3252 | ProxyTorchDispatchMode.__torch_dispatch__:7220
2025-08-14T21:38:18.0112447Z Dynamo produced 1 graphs covering 298 ops with 0 graph breaks (0 unique)
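The "Compilation time (from dynamo_timed)" line and the TIMING breakdown above come from the benchmark harness's own instrumentation. A rough external way to separate compile time from steady-state time for a compiled model on CPU is sketched below; this is not the harness's dynamo_timed machinery, and the tiny module is a hypothetical stand-in.

# Rough sketch: time the first call of a torch.compile'd module (compile + run)
# versus a later call (steady state). Not the harness's dynamo_timed instrumentation.
import time
import torch

class TinyModel(torch.nn.Module):  # hypothetical stand-in for a benchmark model
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(64, 64)

    def forward(self, x):
        return torch.relu(self.linear(x))

model = torch.compile(TinyModel().eval())
x = torch.randn(8, 64)

t0 = time.perf_counter()
with torch.no_grad():
    model(x)  # first call triggers Dynamo/Inductor compilation
first_call = time.perf_counter() - t0

t0 = time.perf_counter()
with torch.no_grad():
    model(x)  # compiled code is cached, so this is roughly steady state
second_call = time.perf_counter() - t0

print(f"first call: {first_call:.3f}s, second call: {second_call:.6f}s")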
2025-08-14T21:38:23.8605063Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:38:23.8606183Z from pkg_resources import resource_filename
2025-08-14T21:38:24.4784040Z
2025-08-14T21:38:45.0637021Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:38:45.0640072Z loading model: 0it [00:20, ?it/s]
2025-08-14T21:38:45.0641485Z cpu eval BlenderbotForCausalLM
2025-08-14T21:38:45.2694137Z Compilation time (from dynamo_timed): 0
2025-08-14T21:38:45.2694444Z pass_due_to_skip
2025-08-14T21:38:45.2707279Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:38:45.2707694Z TIMING: total_wall_time:0
2025-08-14T21:38:45.2707892Z STATS: call_* op count: 0
2025-08-14T21:38:45.2708170Z Dynamo produced 0 graphs covering 0 ops with 0 graph breaks (0 unique)
2025-08-14T21:38:50.2784221Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:38:50.2785238Z from pkg_resources import resource_filename
2025-08-14T21:38:50.8647788Z
2025-08-14T21:38:51.7871898Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:38:51.7872400Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:38:51.7875081Z cpu eval BlenderbotSmallForCausalLM
2025-08-14T21:38:51.9540165Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:38:52.0159712Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:38:52.0819746Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
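The UserWarning above is raised inside llvmlite's ffi.py, not by the benchmark code, so the fix belongs upstream. For context, the replacement the warning points toward is importlib.resources; a sketch of the equivalent lookup is below, where the package and resource names are hypothetical rather than llvmlite's actual ones.

# Sketch of replacing pkg_resources.resource_filename with importlib.resources.
# "somepackage" and "data.bin" are hypothetical names, not llvmlite's real resources.
from importlib.resources import files, as_file

resource = files("somepackage").joinpath("data.bin")
with as_file(resource) as path:
    print(path)  # a real filesystem path, usable like resource_filename's return value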
2025-08-14T21:38:59.9685073Z cudagraph partition due to non gpu ops
2025-08-14T21:38:59.9689707Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:38:59.9690153Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:59.9690526Z return mod(**inputs)
2025-08-14T21:38:59.9691041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward
2025-08-14T21:38:59.9691548Z outputs = self.model.decoder(
2025-08-14T21:38:59.9692049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
2025-08-14T21:38:59.9692587Z layer_outputs = decoder_layer(
2025-08-14T21:38:59.9693021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:38:59.9693447Z return super().__call__(*args, **kwargs)
2025-08-14T21:38:59.9694485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
2025-08-14T21:38:59.9695010Z hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:38:59.9695525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-08-14T21:38:59.9696047Z attn_output, attn_weights = attention_interface(
2025-08-14T21:38:59.9696538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:38:59.9697105Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:38:59.9697330Z
2025-08-14T21:38:59.9697461Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:38:59.9697865Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:59.9698238Z return mod(**inputs)
2025-08-14T21:38:59.9698706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward
2025-08-14T21:38:59.9699186Z outputs = self.model.decoder(
2025-08-14T21:38:59.9699670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
2025-08-14T21:38:59.9700156Z layer_outputs = decoder_layer(
2025-08-14T21:38:59.9700549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:38:59.9700954Z return super().__call__(*args, **kwargs)
2025-08-14T21:38:59.9701451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
2025-08-14T21:38:59.9701960Z hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:38:59.9702481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-08-14T21:38:59.9703239Z attn_output, attn_weights = attention_interface(
2025-08-14T21:38:59.9703737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:38:59.9704231Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:38:59.9704415Z
2025-08-14T21:38:59.9704510Z cudagraph partition due to non gpu ops
2025-08-14T21:38:59.9725178Z cudagraph partition due to non gpu ops.
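The BlenderbotSmall traces alternate between two frames of transformers' shared SDPA helper: the attention call itself (sdpa_attention.py line 81) and the layout fix-up right after it (line 91). A minimal sketch of that post-attention transpose/contiguous step follows; shapes and the final reshape are assumptions for illustration, not the helper's exact code.

# Sketch of the transpose + contiguous step that follows scaled_dot_product_attention
# in the traced sdpa_attention_forward frames. Shapes are assumed for illustration.
import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim = 2, 16, 64, 32
q = k = v = torch.randn(batch, heads, seq_len, head_dim)

attn_output = F.scaled_dot_product_attention(q, k, v)       # (batch, heads, seq, head_dim)
attn_output = attn_output.transpose(1, 2).contiguous()      # (batch, seq, heads, head_dim)
attn_output = attn_output.reshape(batch, seq_len, heads * head_dim)  # (batch, seq, hidden)
print(attn_output.shape)  # torch.Size([2, 64, 512])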
Found from : 2025-08-14T21:38:59.9725553Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9725900Z return mod(**inputs) 2025-08-14T21:38:59.9726344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9726803Z outputs = self.model.decoder( 2025-08-14T21:38:59.9727264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9727729Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9728111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9728507Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9729034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9729528Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9730024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9730519Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9730983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:38:59.9731481Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:38:59.9731677Z 2025-08-14T21:38:59.9731793Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:38:59.9732163Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9741780Z return mod(**inputs) 2025-08-14T21:38:59.9742393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9742868Z outputs = self.model.decoder( 2025-08-14T21:38:59.9743336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9743800Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9744177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9744568Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9745049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9745517Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9745983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9746448Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9746899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:38:59.9747383Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:38:59.9747563Z 2025-08-14T21:38:59.9747655Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9747883Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9748095Z 
cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9748314Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9748529Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9748748Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9749103Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9749370Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9749583Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9749795Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9750010Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9750221Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9750428Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9750640Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9750853Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9751059Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9751309Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:38:59.9751704Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9752055Z return mod(**inputs) 2025-08-14T21:38:59.9752508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9752987Z outputs = self.model.decoder( 2025-08-14T21:38:59.9753448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9753905Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9754282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9754670Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9755139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9755622Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9756108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9756604Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9757063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:38:59.9757569Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:38:59.9757773Z 2025-08-14T21:38:59.9757887Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:38:59.9758273Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9758617Z return mod(**inputs) 2025-08-14T21:38:59.9759070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9759552Z outputs = self.model.decoder( 2025-08-14T21:38:59.9760008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9760468Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9760840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9761235Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9761693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9762197Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9762692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9763191Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9763666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:38:59.9764204Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:38:59.9764426Z 2025-08-14T21:38:59.9764518Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9764753Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9764977Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9765206Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9765431Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9765649Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9765873Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9766099Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9766315Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9766537Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9766759Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9766975Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9767198Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9767424Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9767649Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9767862Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9768118Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:38:59.9768515Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9769052Z return mod(**inputs) 2025-08-14T21:38:59.9769518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9770001Z outputs = self.model.decoder( 2025-08-14T21:38:59.9770474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9770945Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9771326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9771730Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9772198Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9772701Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9773201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9773708Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9774176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:38:59.9774702Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:38:59.9774910Z 2025-08-14T21:38:59.9775024Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:38:59.9775423Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:38:59.9775771Z return mod(**inputs) 2025-08-14T21:38:59.9776257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward 2025-08-14T21:38:59.9776745Z outputs = self.model.decoder( 2025-08-14T21:38:59.9777208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:38:59.9777688Z layer_outputs = decoder_layer( 2025-08-14T21:38:59.9778072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:38:59.9778472Z return super().__call__(*args, **kwargs) 2025-08-14T21:38:59.9778939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:38:59.9779554Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:38:59.9780064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:38:59.9780561Z attn_output, attn_weights = attention_interface( 2025-08-14T21:38:59.9781027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:38:59.9781515Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:38:59.9781687Z 2025-08-14T21:38:59.9781782Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9782010Z cudagraph partition due to non gpu ops 2025-08-14T21:38:59.9782225Z 
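The partition points reported above all land on the two tensor ops inside transformers' sdpa_attention_forward: the torch.nn.functional.scaled_dot_product_attention call (line 81) and the transpose(1, 2).contiguous() on its output (line 91). A minimal sketch of that pattern, using made-up shapes rather than the actual BlenderbotSmall configuration, looks like this:

    import torch
    import torch.nn.functional as F

    # Hypothetical sizes; the real values come from the BlenderbotSmall config.
    batch, num_heads, seq_len, head_dim = 2, 8, 16, 32

    query = torch.randn(batch, num_heads, seq_len, head_dim)  # CPU tensors, as in this job
    key = torch.randn(batch, num_heads, seq_len, head_dim)
    value = torch.randn(batch, num_heads, seq_len, head_dim)

    # The op at sdpa_attention.py line 81: fused scaled dot-product attention.
    attn_output = F.scaled_dot_product_attention(query, key, value, attn_mask=None, dropout_p=0.0)

    # The op at line 91: put the head dimension back next to head_dim and force a
    # contiguous layout before the output projection reshapes it.
    attn_output = attn_output.transpose(1, 2).contiguous()
    print(attn_output.shape)  # torch.Size([2, 16, 8, 32])
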
Found from :
2025-08-14T21:38:59.9833761Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:38:59.9834100Z return mod(**inputs)
2025-08-14T21:38:59.9834528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1534, in forward
2025-08-14T21:38:59.9835065Z loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:38:59.9835272Z
2025-08-14T21:39:08.4005522Z Compilation time (from dynamo_timed): 15.092086637
2025-08-14T21:39:08.4034689Z pass
2025-08-14T21:39:08.4035266Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:08.4036330Z TIMING: _recursive_pre_grad_passes:0.02939 _recursive_joint_graph_passes:0.32811 _recursive_post_grad_passes:0.0686 async_compile.wait:0.85812 code_gen:8.16344 inductor_compile:9.50694 backend_compile:13.33157 gc:0.00054 entire_frame_compile:15.09209 total_wall_time:15.09209
2025-08-14T21:39:08.4037424Z STATS: call_* op count: 254 | FakeTensorMode.__torch_dispatch__:18848 | FakeTensor.__torch_dispatch__:2309 | ProxyTorchDispatchMode.__torch_dispatch__:5091
2025-08-14T21:39:08.4039266Z Dynamo produced 1 graphs covering 254 ops with 0 graph breaks (0 unique)
2025-08-14T21:39:14.0438721Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:39:14.0439781Z from pkg_resources import resource_filename
2025-08-14T21:39:14.7365644Z
2025-08-14T21:39:15.9363915Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:39:15.9366709Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:39:15.9367074Z cpu eval BlenderbotSmallForConditionalGeneration
2025-08-14T21:39:16.2230820Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:16.3322523Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:16.4453572Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:33.8646343Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8646730Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8646972Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8647201Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8647440Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8647674Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8647911Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8648137Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8648367Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8648596Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8649085Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8649323Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8649586Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8649828Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8650065Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8650317Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8650543Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8650781Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8651052Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:39:33.8651494Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:39:33.8651868Z return mod(**inputs)
2025-08-14T21:39:33.8652377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward
2025-08-14T21:39:33.8652884Z outputs = self.model(
2025-08-14T21:39:33.8653359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward
2025-08-14T21:39:33.8653866Z encoder_outputs = self.encoder(
2025-08-14T21:39:33.8654355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward
2025-08-14T21:39:33.8654841Z layer_outputs = encoder_layer(
2025-08-14T21:39:33.8655225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:39:33.8655715Z return super().__call__(*args, **kwargs)
2025-08-14T21:39:33.8656235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward
2025-08-14T21:39:33.8656744Z hidden_states, attn_weights = self.self_attn(
2025-08-14T21:39:33.8657270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-08-14T21:39:33.8657775Z attn_output, attn_weights = attention_interface(
2025-08-14T21:39:33.8659463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:39:33.8659997Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:39:33.8660217Z
2025-08-14T21:39:33.8660341Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:39:33.8660752Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:39:33.8661122Z return mod(**inputs)
2025-08-14T21:39:33.8661591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward
2025-08-14T21:39:33.8662085Z outputs = self.model(
2025-08-14T21:39:33.8662559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward
2025-08-14T21:39:33.8663060Z encoder_outputs = self.encoder(
2025-08-14T21:39:33.8663541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward
2025-08-14T21:39:33.8664033Z layer_outputs = encoder_layer(
2025-08-14T21:39:33.8664470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:39:33.8664875Z return super().__call__(*args, **kwargs)
2025-08-14T21:39:33.8665363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward
2025-08-14T21:39:33.8665856Z hidden_states, attn_weights = self.self_attn(
2025-08-14T21:39:33.8666358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-08-14T21:39:33.8666867Z attn_output, attn_weights = attention_interface(
2025-08-14T21:39:33.8667360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:39:33.8667859Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:39:33.8668047Z
2025-08-14T21:39:33.8668139Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8668380Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8668613Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8668841Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8669074Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8669308Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8669531Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8669761Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8669990Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8670210Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8670440Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8670676Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8670901Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8671132Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8671363Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8671591Z cudagraph partition due to non gpu ops
2025-08-14T21:39:33.8671850Z cudagraph partition due to non gpu ops.
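Every one of these stack traces enters through the benchmark harness's forward_pass (return mod(**inputs)), and the accuracy check above bottoms out in loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)). A rough, self-contained sketch of that compiled forward-plus-loss step, with a toy module standing in for BlenderbotSmallForConditionalGeneration (the toy module, its sizes, and the input dict are assumptions, not the benchmark's actual code):

    import torch
    import torch.nn as nn

    # Toy stand-in for the Hugging Face model used in this job (an assumption,
    # not the real BlenderbotSmallForConditionalGeneration).
    class ToySeq2Seq(nn.Module):
        def __init__(self, vocab_size=128, d_model=32):
            super().__init__()
            self.vocab_size = vocab_size
            self.embed = nn.Embedding(vocab_size, d_model)
            self.proj = nn.Linear(d_model, vocab_size)

        def forward(self, input_ids, labels):
            logits = self.proj(self.embed(input_ids))  # (batch, seq, vocab)
            loss_fct = nn.CrossEntropyLoss()
            # Same flattening pattern as modeling_blenderbot_small.py line 1534.
            loss = loss_fct(logits.view(-1, self.vocab_size), labels.view(-1))
            return loss, logits

    mod = torch.compile(ToySeq2Seq())  # inductor backend, as in this run

    inputs = {
        "input_ids": torch.randint(0, 128, (2, 16)),
        "labels": torch.randint(0, 128, (2, 16)),
    }
    loss, _ = mod(**inputs)  # mirrors `return mod(**inputs)` in huggingface.py line 532
    print(float(loss))
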
Found from : 2025-08-14T21:39:33.8814172Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8814531Z return mod(**inputs) 2025-08-14T21:39:33.8814986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8815450Z outputs = self.model( 2025-08-14T21:39:33.8815896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8816365Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8816820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8817292Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8817677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8818084Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8818548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8819042Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8819538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8820028Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8820495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8821086Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8821281Z 2025-08-14T21:39:33.8821401Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8821777Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8822122Z return mod(**inputs) 2025-08-14T21:39:33.8822561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8823020Z outputs = self.model( 2025-08-14T21:39:33.8823453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8823926Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8824385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8824876Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8825243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8825627Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8826106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8826600Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8827083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8827568Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8828031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8828505Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8828686Z 2025-08-14T21:39:33.8828770Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8828995Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8829215Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8829429Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8829647Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8829864Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8830075Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8830291Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8830512Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8830719Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8830931Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8831147Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8831390Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8831790Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8832153Z return mod(**inputs) 2025-08-14T21:39:33.8832594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8833048Z outputs = self.model( 2025-08-14T21:39:33.8833500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8833980Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8834442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8834921Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8835307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8835725Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8836217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8836732Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8837249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8837769Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8838249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8838771Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8838973Z 2025-08-14T21:39:33.8839098Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8839501Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8839866Z return mod(**inputs) 2025-08-14T21:39:33.8840325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8840812Z outputs = self.model( 2025-08-14T21:39:33.8841267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8841769Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8842246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8842730Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8843109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8843518Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8844010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8844540Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8845077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8845591Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8846085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8846587Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8846771Z 2025-08-14T21:39:33.8846861Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8847098Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8847338Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8847567Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8847802Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8848031Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8848253Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8848481Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8848709Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8849011Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8849240Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8849461Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8849685Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8849899Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8850123Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8850349Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8850595Z cudagraph partition due to non gpu ops. 
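All of these partition reasons point at the same two statements in transformers' SDPA integration: the torch.nn.functional.scaled_dot_product_attention call at sdpa_attention.py line 81 and the attn_output.transpose(1, 2).contiguous() that follows it at line 91. As a rough illustration only, the sketch below reproduces that call pattern in a self-contained toy module compiled with torch.compile; the module, its names, and its shapes are hypothetical, it is not the benchmark harness, and compiling it is not claimed to emit the same partition messages.

# Minimal sketch of the attention pattern the traces above point at.
# ToySdpaAttention and its shapes are hypothetical; only the two commented
# statements correspond to the frames reported in this log
# (sdpa_attention.py lines 81 and 91).
import torch
import torch.nn.functional as F

class ToySdpaAttention(torch.nn.Module):
    def __init__(self, embed_dim=64, num_heads=4):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = torch.nn.Linear(embed_dim, 3 * embed_dim)
        self.out = torch.nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        bsz, seq, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split heads: (bsz, seq, embed) -> (bsz, heads, seq, head_dim).
        q, k, v = (t.view(bsz, seq, self.num_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        # Corresponds to the frame at sdpa_attention.py:81 in the log.
        attn_output = F.scaled_dot_product_attention(q, k, v)
        # Corresponds to the frame at sdpa_attention.py:91 in the log.
        attn_output = attn_output.transpose(1, 2).contiguous()
        return self.out(attn_output.reshape(bsz, seq, -1))

compiled = torch.compile(ToySdpaAttention())
print(compiled(torch.randn(2, 16, 64)).shape)

Whether and how Inductor partitions a cudagraph around these ops depends on its internal rules; the sketch is only meant to make the traced call path concrete.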
Found from : 2025-08-14T21:39:33.8851038Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8851451Z return mod(**inputs) 2025-08-14T21:39:33.8851904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8852382Z outputs = self.model( 2025-08-14T21:39:33.8852840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8853316Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8853783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8854259Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8854626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8855014Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8855478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8855971Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8856502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8856994Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8857484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8858009Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8858207Z 2025-08-14T21:39:33.8858327Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8858713Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8859069Z return mod(**inputs) 2025-08-14T21:39:33.8859513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8859976Z outputs = self.model( 2025-08-14T21:39:33.8860415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8860883Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8861345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8861799Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8862175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8862570Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8863064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8863561Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8864060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8864560Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8865027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8865502Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8865684Z 2025-08-14T21:39:33.8865772Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8866004Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8866224Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8866513Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8866771Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8866996Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8867212Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8867445Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8867660Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8867868Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8868083Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8868302Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8868548Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8868944Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8869307Z return mod(**inputs) 2025-08-14T21:39:33.8869741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8870204Z outputs = self.model( 2025-08-14T21:39:33.8870642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8871105Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8871567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8872049Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8872429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8872818Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8873287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8873804Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8874336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8874835Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8875315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8875833Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8876029Z 2025-08-14T21:39:33.8876155Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8876538Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8876894Z return mod(**inputs) 2025-08-14T21:39:33.8877346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8877822Z outputs = self.model( 2025-08-14T21:39:33.8878268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8878746Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8879219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8879691Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8880061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8880624Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8881118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8881617Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8882227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8882729Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8883190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8883670Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8883849Z 2025-08-14T21:39:33.8883936Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8884167Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8884393Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8884611Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8884836Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8885060Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8885273Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8885496Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8885716Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8885928Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8886144Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8886366Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8886576Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8886793Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8887014Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8887233Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8887475Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8887865Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8888215Z return mod(**inputs) 2025-08-14T21:39:33.8888661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8889203Z outputs = self.model( 2025-08-14T21:39:33.8889666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8890141Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8890613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8891076Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8891444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8891817Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8892286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8892776Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8893269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8893749Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8894219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8894720Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8894913Z 2025-08-14T21:39:33.8895032Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8895406Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8895757Z return mod(**inputs) 2025-08-14T21:39:33.8896212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8896738Z outputs = self.model( 2025-08-14T21:39:33.8897223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8897687Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8898144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8898599Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8898971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8899348Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8899809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8900288Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8900775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8901257Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8901713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8902187Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8902364Z 2025-08-14T21:39:33.8902450Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8902910Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8903131Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8903355Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8903576Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8903792Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8904018Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8904248Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8904473Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8904688Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8904910Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8905130Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8905379Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8905774Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8906129Z return mod(**inputs) 2025-08-14T21:39:33.8906573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8907052Z outputs = self.model( 2025-08-14T21:39:33.8907506Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8907991Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8908461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8908941Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8909322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8909719Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8910193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8910712Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8911221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8911716Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8912429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8912948Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8913146Z 2025-08-14T21:39:33.8913268Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8913656Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8914011Z return mod(**inputs) 2025-08-14T21:39:33.8914468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8914943Z outputs = self.model( 2025-08-14T21:39:33.8915390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8915871Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8916353Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8916824Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8917207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8917603Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8918082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8918595Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8919109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8919616Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8920103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8920572Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8920747Z 2025-08-14T21:39:33.8920831Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8921057Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8921273Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8921493Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8921712Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8921927Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8922138Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8922353Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8922567Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8922776Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8922989Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8923212Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8923425Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8923647Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8923867Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8924077Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8924330Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8924722Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8925078Z return mod(**inputs) 2025-08-14T21:39:33.8925526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8925995Z outputs = self.model( 2025-08-14T21:39:33.8926449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8926980Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8927482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8927965Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8928340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8928775Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8929272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8929770Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8930281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8930754Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8931226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8931730Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8931927Z 2025-08-14T21:39:33.8932052Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8932437Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8932792Z return mod(**inputs) 2025-08-14T21:39:33.8933247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8933724Z outputs = self.model( 2025-08-14T21:39:33.8934166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8934629Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8935087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8935545Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8935921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8936322Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8936791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8937300Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8937803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8938307Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8938782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8939270Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8939447Z 2025-08-14T21:39:33.8939535Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8939762Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8939983Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8940202Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8940421Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8940634Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8940856Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8941073Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8941285Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8941504Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8941725Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8941998Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8942277Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8942670Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8943022Z return mod(**inputs) 2025-08-14T21:39:33.8943466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8943939Z outputs = self.model( 2025-08-14T21:39:33.8944387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8944861Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8945325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8945804Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8946192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8946582Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8947062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8947572Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8948077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8948569Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8949045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8949552Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8949749Z 2025-08-14T21:39:33.8949874Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8950256Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8950608Z return mod(**inputs) 2025-08-14T21:39:33.8951055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8951528Z outputs = self.model( 2025-08-14T21:39:33.8951967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8952429Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8952884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8953383Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8953756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8954140Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8954679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8955173Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8955680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8956171Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8956636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8957101Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8957277Z 2025-08-14T21:39:33.8957407Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8957663Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8957878Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8958097Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8958313Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8958523Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8958742Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8958960Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8959174Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8959383Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8959598Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8959812Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8960022Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8960237Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8960452Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8960664Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8960912Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8961292Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8961633Z return mod(**inputs) 2025-08-14T21:39:33.8962073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8962533Z outputs = self.model( 2025-08-14T21:39:33.8963084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8963544Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8964003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8964478Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8964869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8965251Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8965728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8966229Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8966719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8967216Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8967689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8968214Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8968413Z 2025-08-14T21:39:33.8968525Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8969005Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8969361Z return mod(**inputs) 2025-08-14T21:39:33.8969812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8970278Z outputs = self.model( 2025-08-14T21:39:33.8970731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8971212Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8971679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8972147Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8972626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8973024Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8973496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.8973996Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.8974491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8974995Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8975463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8975955Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8976128Z 2025-08-14T21:39:33.8976225Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8976459Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8976676Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8976903Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8977121Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8977333Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8977554Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8977773Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8977985Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8978203Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8978423Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8978635Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8978883Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8979272Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8979627Z return mod(**inputs) 2025-08-14T21:39:33.8980075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8980549Z outputs = self.model( 2025-08-14T21:39:33.8980997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8981452Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8981909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8982373Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8982742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8983126Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8983594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8984090Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8984589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8985115Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8985588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.8986092Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.8986295Z 2025-08-14T21:39:33.8986411Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8986782Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8987127Z return mod(**inputs) 2025-08-14T21:39:33.8987646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8988104Z outputs = self.model( 2025-08-14T21:39:33.8988546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8989009Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.8989470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.8989946Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.8990334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.8990734Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.8991195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.8991707Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.8992208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.8992706Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.8993175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.8993660Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.8993834Z 2025-08-14T21:39:33.8993919Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8994145Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8994357Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8994574Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8994791Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8995002Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8995220Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8995437Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8995643Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8995855Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8996067Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8996282Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8996488Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8996704Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8996919Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8997130Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.8997374Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.8997763Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.8998112Z return mod(**inputs) 2025-08-14T21:39:33.8998584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.8999047Z outputs = self.model( 2025-08-14T21:39:33.8999490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.8999952Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.9000410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.9000878Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.9001263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.9001657Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.9002179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.9002919Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.9003418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.9003919Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.9004412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.9005039Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.9005236Z 2025-08-14T21:39:33.9005351Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.9005746Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.9006103Z return mod(**inputs) 2025-08-14T21:39:33.9006566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.9007034Z outputs = self.model( 2025-08-14T21:39:33.9007487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.9007961Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.9008424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.9008960Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.9009348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.9009744Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.9010223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-08-14T21:39:33.9010741Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:39:33.9011246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.9011752Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.9012227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.9012725Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.9012900Z 2025-08-14T21:39:33.9012998Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9013222Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9013454Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9013679Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9013902Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9014124Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9014350Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9014576Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9014793Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9015017Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9015242Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9015461Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9015716Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.9016121Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.9016468Z return mod(**inputs) 2025-08-14T21:39:33.9016912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.9017392Z outputs = self.model( 2025-08-14T21:39:33.9017963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.9018479Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.9018959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.9019425Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.9019793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.9020178Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.9020643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.9021165Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.9021682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.9022177Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.9022653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:39:33.9023163Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:39:33.9023359Z 2025-08-14T21:39:33.9023472Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:39:33.9023868Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:39:33.9024223Z return mod(**inputs) 2025-08-14T21:39:33.9024673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-08-14T21:39:33.9025137Z outputs = self.model( 2025-08-14T21:39:33.9025591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-08-14T21:39:33.9026070Z decoder_outputs = self.decoder( 2025-08-14T21:39:33.9026540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-08-14T21:39:33.9027008Z layer_outputs = decoder_layer( 2025-08-14T21:39:33.9027374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:39:33.9027758Z return super().__call__(*args, **kwargs) 2025-08-14T21:39:33.9028212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-08-14T21:39:33.9028704Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:39:33.9029196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-08-14T21:39:33.9029684Z attn_output, attn_weights = attention_interface( 2025-08-14T21:39:33.9030139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:39:33.9030609Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:39:33.9030780Z 2025-08-14T21:39:33.9030865Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9031086Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9031297Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9031514Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9031728Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9031936Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9032152Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9032367Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9032573Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9032837Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9033098Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9033308Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9033522Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9033737Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9033951Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9034158Z cudagraph partition due to non gpu ops 2025-08-14T21:39:33.9034405Z cudagraph partition due to non gpu ops. 
Found from (2025-08-14T21:39:33.9034793Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:39:33.9041819Z cudagraph partition due to non gpu ops.
Found from (2025-08-14T21:39:33.9042203Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:39:33.9048256Z cudagraph partition due to non gpu ops (×13).
Found from: same decoder cross-attention trace as above (scaled_dot_product_attention, sdpa_attention.py:81).
2025-08-14T21:39:33.9052779Z cudagraph partition due to non gpu ops.
Found from: same decoder cross-attention trace as above (transpose/contiguous, sdpa_attention.py:91).
2025-08-14T21:39:33.9056221Z cudagraph partition due to non gpu ops (×17).
Found from: same decoder self-attention trace as above (scaled_dot_product_attention, sdpa_attention.py:81).
2025-08-14T21:39:33.9060927Z cudagraph partition due to non gpu ops.
Found from: same decoder self-attention trace as above (transpose/contiguous, sdpa_attention.py:91).
2025-08-14T21:39:33.9064124Z cudagraph partition due to non gpu ops (×13).
Found from: same decoder cross-attention trace as above (scaled_dot_product_attention, sdpa_attention.py:81).
2025-08-14T21:39:33.9068484Z cudagraph partition due to non gpu ops.
Found from: same decoder cross-attention trace as above (transpose/contiguous, sdpa_attention.py:91).
2025-08-14T21:39:33.9071684Z cudagraph partition due to non gpu ops (×9).
Found from (2025-08-14T21:39:33.9072645Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1398, in forward
    masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:39:44.0673158Z Compilation time (from dynamo_timed): 26.352214824
2025-08-14T21:39:44.0687501Z pass
2025-08-14T21:39:44.0688105Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:44.0689312Z TIMING: _recursive_pre_grad_passes:0.06716 _recursive_joint_graph_passes:0.6389 _recursive_post_grad_passes:0.12438 async_compile.wait:0.91733 code_gen:10.11868 inductor_compile:12.43465 backend_compile:22.14558 gc:0.00074 entire_frame_compile:26.35221 total_wall_time:26.35221
2025-08-14T21:39:44.0690318Z STATS: call_* op count: 654 | FakeTensorMode.__torch_dispatch__:47172 | FakeTensor.__torch_dispatch__:5567 | ProxyTorchDispatchMode.__torch_dispatch__:12894
2025-08-14T21:39:44.0690886Z Dynamo produced 1 graphs covering 654 ops with 0 graph breaks (0 unique)
2025-08-14T21:39:49.9155743Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:39:49.9161687Z   from pkg_resources import resource_filename
2025-08-14T21:39:52.4162153Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:39:52.4164310Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:39:52.4164565Z cpu eval CamemBert
2025-08-14T21:39:52.8907255Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:53.0224380Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:39:53.1657569Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:40:04.3672135Z cudagraph partition due to non gpu ops.
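The compile-time summary above is produced by dynamo_timed inside the benchmark harness; as a rough hand-rolled approximation (not the harness's method, and with a toy stand-in model assumed purely for illustration), first-call latency of a torch.compile'd module on CPU can be measured like this:

import time
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 8)).eval()  # toy stand-in model
compiled = torch.compile(model)  # default backend ("inductor")

x = torch.randn(4, 64)
t0 = time.perf_counter()
with torch.no_grad():
    compiled(x)                  # first call triggers Dynamo tracing + Inductor codegen
first_call = time.perf_counter() - t0

t0 = time.perf_counter()
with torch.no_grad():
    compiled(x)                  # steady-state call, compilation already cached
steady = time.perf_counter() - t0

print(f"first call (includes compile): {first_call:.3f}s, steady-state: {steady:.6f}s")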
Found from (2025-08-14T21:40:04.3672644Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward
    outputs = self.roberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 886, in forward
    embedding_output = self.embeddings(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 90, in forward
    position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1590, in create_position_ids_from_input_ids
    mask = input_ids.ne(padding_idx).int()
2025-08-14T21:40:04.3677553Z cudagraph partition due to non gpu ops (×13).
Found from (2025-08-14T21:40:04.3680746Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward
    outputs = self.roberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 886, in forward
    embedding_output = self.embeddings(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 90, in forward
    position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1591, in create_position_ids_from_input_ids
    incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
2025-08-14T21:40:04.3686129Z cudagraph partition due to non gpu ops.
Found from: same create_position_ids_from_input_ids (torch.cumsum) trace as above.
2025-08-14T21:40:04.3691500Z cudagraph partition due to non gpu ops (×8).
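The two CamemBert embedding traces above point at the masked-cumsum step of create_position_ids_from_input_ids; the snippet below re-creates just those two lines on a made-up batch (this is not the full transformers helper, and padding_idx=1 is an assumption for the example):

import torch

padding_idx = 1                      # assumed RoBERTa-style pad token id, illustration only
past_key_values_length = 0
input_ids = torch.tensor([[5, 7, 9, 1, 1],
                          [4, 2, 1, 1, 1]])

mask = input_ids.ne(padding_idx).int()
incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
print(incremental_indices)  # pad positions stay 0: [[1, 2, 3, 0, 0], [1, 2, 0, 0, 0]]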
Found from (2025-08-14T21:40:04.3693486Z):
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward
    outputs = self.roberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:40:04.3701902Z cudagraph partition due to non gpu ops (×14).
Found from : 2025-08-14T21:40:04.3705477Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3705828Z return mod(**inputs) 2025-08-14T21:40:04.3706257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3706693Z outputs = self.roberta( 2025-08-14T21:40:04.3707109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3707528Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3707947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3708377Z layer_outputs = layer_module( 2025-08-14T21:40:04.3708746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3709138Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3709581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3710038Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3710443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3710855Z return func(*args, **kwargs) 2025-08-14T21:40:04.3711273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3711707Z self_outputs = self.self( 2025-08-14T21:40:04.3712089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3712624Z return func(*args, **kwargs) 2025-08-14T21:40:04.3713055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3713548Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3713755Z 2025-08-14T21:40:04.3713840Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3714071Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3714377Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3714595Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3714818Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3715042Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3715254Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3715477Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3715699Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3715917Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3716141Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3716360Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3716579Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3716829Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3717219Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3717572Z return mod(**inputs) 2025-08-14T21:40:04.3717989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3718430Z outputs = self.roberta( 2025-08-14T21:40:04.3718847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3719288Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3719720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3720161Z layer_outputs = layer_module( 2025-08-14T21:40:04.3720540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3720927Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3721377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3721830Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3722247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3722652Z return func(*args, **kwargs) 2025-08-14T21:40:04.3723077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3723522Z self_outputs = self.self( 2025-08-14T21:40:04.3723909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3724319Z return func(*args, **kwargs) 2025-08-14T21:40:04.3724731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3725218Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3725410Z 2025-08-14T21:40:04.3725494Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3725715Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3725932Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3726139Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3726354Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3726567Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3726780Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3727038Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3727290Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3727507Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3727714Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3727925Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3728140Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3728375Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3729024Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3729388Z return mod(**inputs) 2025-08-14T21:40:04.3729813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3730246Z outputs = self.roberta( 2025-08-14T21:40:04.3730656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3731126Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3731542Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3731984Z layer_outputs = layer_module( 2025-08-14T21:40:04.3732351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3732732Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3733158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3733596Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3733996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3734375Z return func(*args, **kwargs) 2025-08-14T21:40:04.3734793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3735215Z self_outputs = self.self( 2025-08-14T21:40:04.3735593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3735972Z return func(*args, **kwargs) 2025-08-14T21:40:04.3736385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3736868Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3737058Z 2025-08-14T21:40:04.3737147Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3737362Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3737580Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3737797Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3738005Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3738224Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3738444Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3738652Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3738868Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3739083Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3739299Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3739507Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3739720Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3739963Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3740335Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3740675Z return mod(**inputs) 2025-08-14T21:40:04.3741076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3741470Z outputs = self.roberta( 2025-08-14T21:40:04.3741938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3742361Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3742781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3743172Z layer_outputs = layer_module( 2025-08-14T21:40:04.3743515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3743874Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3744291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3744724Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3745122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3745513Z return func(*args, **kwargs) 2025-08-14T21:40:04.3745919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3746328Z self_outputs = self.self( 2025-08-14T21:40:04.3746686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3747053Z return func(*args, **kwargs) 2025-08-14T21:40:04.3747433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3747960Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3748155Z 2025-08-14T21:40:04.3748251Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3748481Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3748699Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3748919Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3749138Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3749346Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3749562Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3749778Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3749984Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3750184Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3750386Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3750581Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3750782Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3751011Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3751360Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3751694Z return mod(**inputs) 2025-08-14T21:40:04.3752107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3752513Z outputs = self.roberta( 2025-08-14T21:40:04.3752892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3753319Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3753737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3754169Z layer_outputs = layer_module( 2025-08-14T21:40:04.3754524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3754909Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3755352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3755834Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3756302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3756704Z return func(*args, **kwargs) 2025-08-14T21:40:04.3757125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3757549Z self_outputs = self.self( 2025-08-14T21:40:04.3757937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3758339Z return func(*args, **kwargs) 2025-08-14T21:40:04.3758752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3759247Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3759447Z 2025-08-14T21:40:04.3759532Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3759761Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3759980Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3760203Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3760423Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3760636Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3760854Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3761076Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3761290Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3761512Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3761731Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3761953Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3762168Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3762422Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3762813Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3763162Z return mod(**inputs) 2025-08-14T21:40:04.3763587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3764015Z outputs = self.roberta( 2025-08-14T21:40:04.3764434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3764864Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3765295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3765742Z layer_outputs = layer_module( 2025-08-14T21:40:04.3766110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3766501Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3766960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3767408Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3767813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3768215Z return func(*args, **kwargs) 2025-08-14T21:40:04.3768641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3769165Z self_outputs = self.self( 2025-08-14T21:40:04.3769562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3769970Z return func(*args, **kwargs) 2025-08-14T21:40:04.3770376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3770881Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3771071Z 2025-08-14T21:40:04.3771186Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3771404Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3771611Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3771809Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3772064Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3772289Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3772498Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3772711Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3772926Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3773136Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3773360Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3773579Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3773792Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3774046Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:40:04.3774457Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:40:04.3774798Z return mod(**inputs) 2025-08-14T21:40:04.3775183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-08-14T21:40:04.3775591Z outputs = self.roberta( 2025-08-14T21:40:04.3775979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-08-14T21:40:04.3776381Z encoder_outputs = self.encoder( 2025-08-14T21:40:04.3776783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-08-14T21:40:04.3777189Z layer_outputs = layer_module( 2025-08-14T21:40:04.3777539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:40:04.3777894Z return super().__call__(*args, **kwargs) 2025-08-14T21:40:04.3778309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-08-14T21:40:04.3778781Z self_attention_outputs = self.attention( 2025-08-14T21:40:04.3779186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3779593Z return func(*args, **kwargs) 2025-08-14T21:40:04.3780020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-08-14T21:40:04.3780459Z self_outputs = self.self( 2025-08-14T21:40:04.3780834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:40:04.3781208Z return func(*args, **kwargs) 2025-08-14T21:40:04.3781604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-08-14T21:40:04.3782101Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:40:04.3782305Z 2025-08-14T21:40:04.3782388Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3782613Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3782822Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3783019Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3783228Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3783444Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3783653Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3783879Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3784084Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3784288Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3784486Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3784689Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3784936Z cudagraph partition due to non gpu ops 2025-08-14T21:40:04.3785202Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward
    outputs = self.roberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops (message repeated 14 times)
[the same "Found from" traceback was emitted three more times, followed by 14, 14, and 11 further "cudagraph partition due to non gpu ops" messages]
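For reference, the frame at the bottom of the traceback above is a plain call into PyTorch's fused attention API. A minimal, self-contained sketch of that call (tensor shapes are assumptions, not the model's actual ones) looks like this:

    # Illustrative only -- shapes are assumptions, not taken from the benchmark.
    import torch
    import torch.nn.functional as F

    q = torch.randn(1, 12, 128, 64)  # (batch, heads, seq_len, head_dim)
    k = torch.randn(1, 12, 128, 64)
    v = torch.randn(1, 12, 128, 64)
    attn_output = F.scaled_dot_product_attention(q, k, v)  # same API as modeling_camembert.py line 389
    print(attn_output.shape)  # torch.Size([1, 12, 128, 64])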
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1059, in forward
    masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))

2025-08-14T21:40:13.4404571Z Compilation time (from dynamo_timed): 18.923755548
2025-08-14T21:40:13.4442977Z pass
2025-08-14T21:40:13.4444475Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:40:13.4445354Z TIMING: _recursive_pre_grad_passes:0.03678 _recursive_joint_graph_passes:0.43348 _recursive_post_grad_passes:0.08762 async_compile.wait:0.84008 code_gen:8.43611 inductor_compile:10.41864 backend_compile:15.94227 gc:0.00014 entire_frame_compile:18.92376 total_wall_time:18.92376
2025-08-14T21:40:13.4446448Z STATS: call_* op count: 299 | FakeTensorMode.__torch_dispatch__:27085 | FakeTensor.__torch_dispatch__:3312 | ProxyTorchDispatchMode.__torch_dispatch__:7246
2025-08-14T21:40:13.4447054Z Dynamo produced 1 graphs covering 299 ops with 0 graph breaks (0 unique)
2025-08-14T21:40:19.4402173Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:40:19.4403471Z   from pkg_resources import resource_filename
2025-08-14T21:40:29.2965945Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:40:29.2966248Z loading model: 0it [00:09, ?it/s]
2025-08-14T21:40:29.2966497Z cpu eval DebertaV2ForMaskedLM
2025-08-14T21:40:29.4274565Z Compilation time (from dynamo_timed): 0
2025-08-14T21:40:29.4277777Z pass_due_to_skip
2025-08-14T21:40:29.4278665Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:40:29.4279048Z TIMING: total_wall_time:0
2025-08-14T21:40:29.4279248Z STATS: call_* op count: 0
2025-08-14T21:40:29.4279537Z Dynamo produced 0 graphs covering 0 ops with 0 graph breaks (0 unique)
2025-08-14T21:40:34.5009592Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:40:34.5010618Z   from pkg_resources import resource_filename
2025-08-14T21:40:42.7326901Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:40:42.7327252Z loading model: 0it [00:07, ?it/s]
2025-08-14T21:40:42.7327505Z cpu eval DebertaV2ForQuestionAnswering
2025-08-14T21:40:46.1401852Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:40:47.0749115Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:40:48.1046516Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:41:11.1895280Z cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1244, in forward
    logits = self.qa_outputs(sequence_output)

cudagraph partition due to non gpu ops (message repeated 5 times)
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward
    query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
    return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
    attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))

cudagraph partition due to non gpu ops.
[the same line-248 traceback was reported a second time, followed by 3 further "cudagraph partition due to non gpu ops" messages]
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
    value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
    return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward
    context_layer = torch.bmm(

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward
    context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1))

cudagraph partition due to non gpu ops (message repeated 11 times)
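The DeBERTa-v2 frames above all bottom out in ordinary tensor ops. As a rough sketch (all sizes and the scale factor below are assumptions, not the model's real configuration), the line-248 matmul that recurs in these traces is just a batched, scaled dot product:

    # Illustrative sketch of the op at modeling_deberta_v2.py line 248; shapes are assumptions.
    import torch

    bsz_times_heads, seq_len, head_dim = 12, 128, 64
    query_layer = torch.randn(bsz_times_heads, seq_len, head_dim)
    key_layer = torch.randn(bsz_times_heads, seq_len, head_dim)
    scale = torch.sqrt(torch.tensor(head_dim, dtype=torch.float32))
    attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
    print(attention_scores.shape)  # torch.Size([12, 128, 128])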
[the same six "Found from" tracebacks (modeling_deberta_v2.py lines 236, 248, 248, 238, 268 and 272) were emitted three more times, each followed by the same runs of "cudagraph partition due to non gpu ops" messages]
Found from : 2025-08-14T21:41:11.2120196Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2120548Z return mod(**inputs) 2025-08-14T21:41:11.2120978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2121418Z outputs = self.deberta( 2025-08-14T21:41:11.2121830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2122278Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2122714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2123169Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2123564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2123959Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2124407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2124883Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2125345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2125788Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2126240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2126800Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2127407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2127954Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2128153Z 2025-08-14T21:41:11.2128275Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2128663Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2129159Z return mod(**inputs) 2025-08-14T21:41:11.2129597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2130042Z outputs = self.deberta( 2025-08-14T21:41:11.2130461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2130898Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2131330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2131771Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2132171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2132569Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2133014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2133460Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2133963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2134442Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2134879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2135457Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2135750Z 2025-08-14T21:41:11.2135863Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2136257Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2136624Z return mod(**inputs) 2025-08-14T21:41:11.2137020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2137449Z outputs = self.deberta( 2025-08-14T21:41:11.2137862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2138282Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2138718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2139171Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2139566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2139949Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2140396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2140840Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2141284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2141732Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2142169Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2142750Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2143030Z 2025-08-14T21:41:11.2143118Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2143351Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2143630Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2144019Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2144368Z return mod(**inputs) 2025-08-14T21:41:11.2144787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2145227Z outputs = self.deberta( 2025-08-14T21:41:11.2145640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2146075Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2146504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2146958Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2147345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2147738Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2148175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2148630Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2149154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2149601Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2150041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2150603Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2151211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2151754Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2151955Z 2025-08-14T21:41:11.2152075Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2152459Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2152820Z return mod(**inputs) 2025-08-14T21:41:11.2153251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2153690Z outputs = self.deberta( 2025-08-14T21:41:11.2154101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2154540Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2154975Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2155424Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2155827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2156225Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2156666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2157120Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2157578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2158017Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2158456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2158887Z context_layer = torch.bmm( 2025-08-14T21:41:11.2159023Z 2025-08-14T21:41:11.2159137Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2159533Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2159879Z return mod(**inputs) 2025-08-14T21:41:11.2160300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2160746Z outputs = self.deberta( 2025-08-14T21:41:11.2161164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2161600Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2162035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2162491Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2162894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2163279Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2163726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2164223Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2164711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2165155Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2165606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2166171Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2166429Z 2025-08-14T21:41:11.2166518Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2166751Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2166983Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2167202Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2167426Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2167649Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2167875Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2168089Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2168309Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2168531Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2168910Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2169319Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2169677Z return mod(**inputs) 2025-08-14T21:41:11.2170094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2170537Z outputs = self.deberta( 2025-08-14T21:41:11.2170959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2171403Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2171838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2172299Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2172703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2173100Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2173535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2173980Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2174428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2174851Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2175283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2175838Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2176422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2176938Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2177139Z 2025-08-14T21:41:11.2177251Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2177632Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2177973Z return mod(**inputs) 2025-08-14T21:41:11.2178372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2178796Z outputs = self.deberta( 2025-08-14T21:41:11.2179251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2179707Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2180125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2180565Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2180949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2181322Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2181750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2182195Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2182638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2183060Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2183485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2184052Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2184329Z 2025-08-14T21:41:11.2184445Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2184817Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2185159Z return mod(**inputs) 2025-08-14T21:41:11.2185563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2185980Z outputs = self.deberta( 2025-08-14T21:41:11.2186631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2187081Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2187502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2187937Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2188323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2188706Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2189123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2189573Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2190014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2190447Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2190872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2191436Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2191718Z 2025-08-14T21:41:11.2191804Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2192030Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2192274Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2192655Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2193008Z return mod(**inputs) 2025-08-14T21:41:11.2193407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2193831Z outputs = self.deberta( 2025-08-14T21:41:11.2194279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2194743Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2195157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2195597Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2195985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2196360Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2196795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2197240Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2197686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2198116Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2198540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2199096Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2199700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2200229Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2200435Z 2025-08-14T21:41:11.2200546Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2200930Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2201272Z return mod(**inputs) 2025-08-14T21:41:11.2201673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2202101Z outputs = self.deberta( 2025-08-14T21:41:11.2202517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2203142Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2203592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2204045Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2204444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2204828Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2205280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2205748Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2206204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2206639Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2207091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2207536Z context_layer = torch.bmm( 2025-08-14T21:41:11.2207663Z 2025-08-14T21:41:11.2207777Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2208172Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2208528Z return mod(**inputs) 2025-08-14T21:41:11.2209078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2209647Z outputs = self.deberta( 2025-08-14T21:41:11.2210115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2210555Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2210978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2211427Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2211825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2212217Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2212652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2213111Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2213570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2214013Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2214443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2214989Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2215224Z 2025-08-14T21:41:11.2215314Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2215520Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2215730Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2215936Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2216140Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2216338Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2216541Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2216743Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2216941Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2217147Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2217380Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2217729Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2218055Z return mod(**inputs) 2025-08-14T21:41:11.2218446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2218866Z outputs = self.deberta( 2025-08-14T21:41:11.2219263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2219686Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2220107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2220544Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2220906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2221263Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2221665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2222078Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2222498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2222899Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2223296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2223799Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2224436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2224934Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2225119Z 2025-08-14T21:41:11.2225241Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2225622Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2225975Z return mod(**inputs) 2025-08-14T21:41:11.2226391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2226819Z outputs = self.deberta( 2025-08-14T21:41:11.2227230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2227637Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2228039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2228453Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2228829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2229224Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2229661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2230105Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2230558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2230994Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2231426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2232006Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2232292Z 2025-08-14T21:41:11.2232408Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2232800Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2233142Z return mod(**inputs) 2025-08-14T21:41:11.2233558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2233988Z outputs = self.deberta( 2025-08-14T21:41:11.2234399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2234825Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2235260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2235710Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2236099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2236490Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2236924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2237375Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2237820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2238257Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2238691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2239347Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2239638Z 2025-08-14T21:41:11.2239726Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2239957Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2240223Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2240594Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2240938Z return mod(**inputs) 2025-08-14T21:41:11.2241343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2241763Z outputs = self.deberta( 2025-08-14T21:41:11.2242162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2242588Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2243012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2243441Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2243826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2244208Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2244661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2245105Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2245573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2246020Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2246465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2247021Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2247622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2248159Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2248358Z 2025-08-14T21:41:11.2248478Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2248974Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2249338Z return mod(**inputs) 2025-08-14T21:41:11.2249762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2250206Z outputs = self.deberta( 2025-08-14T21:41:11.2250635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2251079Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2251525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2252010Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2252409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2252811Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2253258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2253709Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2254264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2254702Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2255137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2255571Z context_layer = torch.bmm( 2025-08-14T21:41:11.2255698Z 2025-08-14T21:41:11.2255819Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2256202Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2256555Z return mod(**inputs) 2025-08-14T21:41:11.2256970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2257395Z outputs = self.deberta( 2025-08-14T21:41:11.2257813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2258256Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2258674Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2259105Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2259491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2259874Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2260301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2260758Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2261206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2261648Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2262065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2262608Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2262866Z 2025-08-14T21:41:11.2262953Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2263175Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2263392Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2263617Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2263837Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2264049Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2264271Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2264500Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2264705Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2264923Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2265174Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2265556Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2265889Z return mod(**inputs) 2025-08-14T21:41:11.2266297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2266715Z outputs = self.deberta( 2025-08-14T21:41:11.2267112Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2267537Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2267956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2268395Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2268941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2269324Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2269759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2270209Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2270664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2271112Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2271553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2272093Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2272691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2273239Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2273438Z 2025-08-14T21:41:11.2273562Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2273948Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2274300Z return mod(**inputs) 2025-08-14T21:41:11.2274723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2275160Z outputs = self.deberta( 2025-08-14T21:41:11.2275575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2275999Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2276426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2276872Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2277270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2277667Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2278109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2278561Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2279017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2279461Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2279900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2280477Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2280773Z 2025-08-14T21:41:11.2280888Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2281280Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2281624Z return mod(**inputs) 2025-08-14T21:41:11.2282042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2282478Z outputs = self.deberta( 2025-08-14T21:41:11.2282902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2283331Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2283764Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2284341Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2284741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2285125Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2285566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2286026Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2286472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2286915Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2287365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2287950Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2288230Z 2025-08-14T21:41:11.2288317Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2288550Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2288920Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2289321Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2289668Z return mod(**inputs) 2025-08-14T21:41:11.2290093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2290534Z outputs = self.deberta( 2025-08-14T21:41:11.2290950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2291395Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2291843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2292308Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2292703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2293107Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2293557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2294047Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2294502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2294945Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2295385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2295955Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2296563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2297107Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2297306Z 2025-08-14T21:41:11.2297430Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2297820Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2298163Z return mod(**inputs) 2025-08-14T21:41:11.2298571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2298998Z outputs = self.deberta( 2025-08-14T21:41:11.2299486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2299913Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2300339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2300787Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2301188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2301583Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2302027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2302480Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2303116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2303570Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2304004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2304442Z context_layer = torch.bmm( 2025-08-14T21:41:11.2304581Z 2025-08-14T21:41:11.2304696Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2305093Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2305439Z return mod(**inputs) 2025-08-14T21:41:11.2305861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2306297Z outputs = self.deberta( 2025-08-14T21:41:11.2306719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2307158Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2307606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2308063Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2308454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2308847Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2309300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2309760Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2310208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2310651Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2311092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2311656Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2311915Z 2025-08-14T21:41:11.2312005Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2312236Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2312466Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2312683Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2312907Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2313129Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2313342Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2313562Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2313781Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2314000Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2314344Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2314801Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2315166Z return mod(**inputs) 2025-08-14T21:41:11.2315584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2316029Z outputs = self.deberta( 2025-08-14T21:41:11.2316452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2316899Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2317344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2317801Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2318204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2318596Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2319045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2319517Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2319976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2320417Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2320856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2321418Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2322016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2322552Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2322761Z 2025-08-14T21:41:11.2322875Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:41:11.2323282Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:41:11.2323632Z return mod(**inputs)
2025-08-14T21:41:11.2324044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-08-14T21:41:11.2324483Z outputs = self.deberta(
2025-08-14T21:41:11.2324905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-08-14T21:41:11.2325338Z encoder_outputs = self.encoder(
2025-08-14T21:41:11.2325771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-08-14T21:41:11.2326231Z output_states, attn_weights = layer_module(
2025-08-14T21:41:11.2326634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:41:11.2327019Z return super().__call__(*args, **kwargs)
2025-08-14T21:41:11.2327472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-08-14T21:41:11.2327932Z attention_output, att_matrix = self.attention(
2025-08-14T21:41:11.2328401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-08-14T21:41:11.2328950Z self_output, att_matrix = self.self(
2025-08-14T21:41:11.2329408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-08-14T21:41:11.2330088Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-08-14T21:41:11.2330378Z
2025-08-14T21:41:11.2330496Z cudagraph partition due to non gpu ops
Found from :
2025-08-14T21:41:11.2330899Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:41:11.2331263Z return mod(**inputs)
2025-08-14T21:41:11.2331693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-08-14T21:41:11.2332127Z outputs = self.deberta(
2025-08-14T21:41:11.2332558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-08-14T21:41:11.2333012Z encoder_outputs = self.encoder(
2025-08-14T21:41:11.2333451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-08-14T21:41:11.2333907Z output_states, attn_weights = layer_module(
2025-08-14T21:41:11.2334318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:41:11.2334720Z return super().__call__(*args, **kwargs)
2025-08-14T21:41:11.2335174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-08-14T21:41:11.2335641Z attention_output, att_matrix = self.attention(
2025-08-14T21:41:11.2336087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-08-14T21:41:11.2336523Z self_output, att_matrix = self.self(
2025-08-14T21:41:11.2336963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-08-14T21:41:11.2337534Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-08-14T21:41:11.2337821Z
2025-08-14T21:41:11.2337913Z cudagraph partition due to non gpu ops
2025-08-14T21:41:11.2338146Z cudagraph partition due to non gpu ops
2025-08-14T21:41:11.2338398Z cudagraph partition due to non gpu ops
Found from :
2025-08-14T21:41:11.2338786Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:41:11.2339136Z return mod(**inputs)
2025-08-14T21:41:11.2339544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-08-14T21:41:11.2339976Z outputs = self.deberta(
2025-08-14T21:41:11.2340388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-08-14T21:41:11.2340820Z encoder_outputs = self.encoder(
2025-08-14T21:41:11.2341239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-08-14T21:41:11.2341693Z output_states, attn_weights = layer_module(
2025-08-14T21:41:11.2342085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:41:11.2342462Z return super().__call__(*args, **kwargs)
2025-08-14T21:41:11.2342896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-08-14T21:41:11.2343342Z attention_output, att_matrix = self.attention(
2025-08-14T21:41:11.2343790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-08-14T21:41:11.2344220Z self_output, att_matrix = self.self(
2025-08-14T21:41:11.2344653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
2025-08-14T21:41:11.2345283Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
2025-08-14T21:41:11.2345607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-08-14T21:41:11.2345751Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-08-14T21:41:11.2345755Z
2025-08-14T21:41:11.2345865Z cudagraph partition due to non gpu ops
Found from :
2025-08-14T21:41:11.2346075Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:41:11.2346151Z return mod(**inputs)
2025-08-14T21:41:11.2346444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-08-14T21:41:11.2346524Z outputs = self.deberta(
2025-08-14T21:41:11.2346810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-08-14T21:41:11.2346894Z encoder_outputs = self.encoder(
2025-08-14T21:41:11.2347186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-08-14T21:41:11.2347278Z output_states, attn_weights = layer_module(
2025-08-14T21:41:11.2347511Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:41:11.2347606Z return super().__call__(*args, **kwargs)
2025-08-14T21:41:11.2347906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-08-14T21:41:11.2348012Z attention_output, att_matrix = self.attention(
2025-08-14T21:41:11.2348294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-08-14T21:41:11.2348379Z self_output, att_matrix = self.self(
2025-08-14T21:41:11.2348671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward
2025-08-14T21:41:11.2348748Z context_layer = torch.bmm(
2025-08-14T21:41:11.2348752Z
2025-08-14T21:41:11.2348871Z cudagraph partition due to non gpu ops
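Editor's note: the "Found from" traces above mark the framework code at which Inductor partitioned a cudagraph because the captured region contained operations that do not run on the GPU. The sketch below is a minimal, hypothetical illustration of that kind of pattern, not code taken from the benchmark harness; the shapes, the scale construction, and the use of mode="reduce-overhead" are illustrative assumptions. In DeBERTa-v2's attention the scale tensor is built on the CPU, which is one plausible source of the reported "non gpu ops".

import torch

# Hypothetical sketch: a CUDA matmul whose scale factor lives on the CPU,
# the kind of host-side ("non gpu") dependency that can force Inductor to
# partition a captured cudagraph around it.
def attention_scores(q, k):
    # 0-dim CPU tensor, mirroring how DeBERTa-v2 computes its attention scale
    scale = torch.sqrt(torch.tensor(q.size(-1), dtype=torch.float))
    return torch.bmm(q, k.transpose(-1, -2) / scale.to(dtype=q.dtype))

if torch.cuda.is_available():
    q = torch.randn(8, 128, 64, device="cuda")
    k = torch.randn(8, 128, 64, device="cuda")
    compiled = torch.compile(attention_scores, mode="reduce-overhead")
    print(compiled(q, k).shape)  # the CPU-resident scale may cause a graph partition

Keeping such scalars on the same device as the activations (for example, computing the scale as a plain Python float or with q.new_tensor) is one way to avoid the host-side dependency; whether that removes the partition in this benchmark is not verified here.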
Found from : 2025-08-14T21:41:11.2436379Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2436458Z return mod(**inputs) 2025-08-14T21:41:11.2436766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2436837Z outputs = self.deberta( 2025-08-14T21:41:11.2437135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2437213Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2437494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2437593Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2437823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2437914Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2438197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2438293Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2438582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2438668Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2438958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2439157Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2439477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2439621Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2439624Z 2025-08-14T21:41:11.2439732Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2439943Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2440011Z return mod(**inputs) 2025-08-14T21:41:11.2440303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2440382Z outputs = self.deberta( 2025-08-14T21:41:11.2440676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2440752Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2441053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2441144Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2441383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2441465Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2441749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2441936Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2442218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2442298Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2442598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2442822Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2442826Z 2025-08-14T21:41:11.2442945Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2443156Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2443226Z return mod(**inputs) 2025-08-14T21:41:11.2443538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2443617Z outputs = self.deberta( 2025-08-14T21:41:11.2443919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2443996Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2444293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2444396Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2444634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2444727Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2445025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2445126Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2445425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2445512Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2445798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2446033Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2446037Z 2025-08-14T21:41:11.2446120Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2446214Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2446326Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2446545Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2446624Z return mod(**inputs) 2025-08-14T21:41:11.2446924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2446998Z outputs = self.deberta( 2025-08-14T21:41:11.2447309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2447387Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2447688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2447781Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2448023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2448117Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2448406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2448580Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2448991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2449081Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2449389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2449595Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2449946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2450085Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2450090Z 2025-08-14T21:41:11.2450202Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2450432Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2450513Z return mod(**inputs) 2025-08-14T21:41:11.2450809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2450891Z outputs = self.deberta( 2025-08-14T21:41:11.2451183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2451268Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2451568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2451661Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2451907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2452000Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2452298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2452397Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2452702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2452794Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2453100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2453179Z context_layer = torch.bmm( 2025-08-14T21:41:11.2453192Z 2025-08-14T21:41:11.2453303Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2453525Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2453606Z return mod(**inputs) 2025-08-14T21:41:11.2453896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2453970Z outputs = self.deberta( 2025-08-14T21:41:11.2454258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2454336Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2454636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2454725Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2454954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2455046Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2455400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2455499Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2455793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2455873Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2456176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2456371Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2456375Z 2025-08-14T21:41:11.2456457Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456546Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456625Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456705Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456794Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456876Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2456960Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2457036Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2457113Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2457197Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2457305Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2457512Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2457588Z return mod(**inputs) 2025-08-14T21:41:11.2457880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2457958Z outputs = self.deberta( 2025-08-14T21:41:11.2458241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2458325Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2458616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2458707Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2458937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2459028Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2459309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2459414Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2459712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2459797Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2460096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2460294Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2460625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2460760Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2460764Z 2025-08-14T21:41:11.2460872Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2461089Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2461157Z return mod(**inputs) 2025-08-14T21:41:11.2461465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2461618Z outputs = self.deberta( 2025-08-14T21:41:11.2461899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2461983Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2462264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2462355Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2462591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2462673Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2462960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2463056Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2463343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2463441Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2463720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2463945Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2463949Z 2025-08-14T21:41:11.2464056Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2464263Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2464339Z return mod(**inputs) 2025-08-14T21:41:11.2464623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2464697Z outputs = self.deberta( 2025-08-14T21:41:11.2464988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2465062Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2465350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2465439Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2465665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2465755Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2466035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2466139Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2466422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2466504Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2466791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2467005Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2467008Z 2025-08-14T21:41:11.2467100Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2467183Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2467291Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2467506Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2467576Z return mod(**inputs) 2025-08-14T21:41:11.2467861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2468018Z outputs = self.deberta( 2025-08-14T21:41:11.2468303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2468385Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2468665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2468757Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2468992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2469075Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2469355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2469460Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2469747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2469835Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2470116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2470316Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2470646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2470783Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2470787Z 2025-08-14T21:41:11.2470900Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2471105Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2471179Z return mod(**inputs) 2025-08-14T21:41:11.2471475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2471546Z outputs = self.deberta( 2025-08-14T21:41:11.2471828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2471910Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2472190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2472288Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2472514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2472597Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2472891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2472986Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2473272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2473353Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2473650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2473734Z context_layer = torch.bmm( 2025-08-14T21:41:11.2473738Z 2025-08-14T21:41:11.2473844Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2474051Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2474126Z return mod(**inputs) 2025-08-14T21:41:11.2474484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2474565Z outputs = self.deberta( 2025-08-14T21:41:11.2474845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2474920Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2475207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2475297Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2475535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2475618Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2475897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2476007Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2476289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2476371Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2476677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2476872Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2476876Z 2025-08-14T21:41:11.2476969Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477051Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477131Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477218Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477298Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477380Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477468Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477547Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477634Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477711Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2477819Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2478033Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2478100Z return mod(**inputs) 2025-08-14T21:41:11.2478387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2478466Z outputs = self.deberta( 2025-08-14T21:41:11.2478766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2478848Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2479135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2479225Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2479460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2479542Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2479821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2479926Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2480227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2480316Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2480625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2480860Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2481200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2481328Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2481332Z 2025-08-14T21:41:11.2481441Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2481638Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2481702Z return mod(**inputs) 2025-08-14T21:41:11.2481984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2482052Z outputs = self.deberta( 2025-08-14T21:41:11.2482334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2482412Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2482696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2482794Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2483023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2483108Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2483401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2483498Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2483788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2483875Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2484159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2484386Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2484390Z 2025-08-14T21:41:11.2484497Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2484713Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2484780Z return mod(**inputs) 2025-08-14T21:41:11.2485068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2485147Z outputs = self.deberta( 2025-08-14T21:41:11.2485434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2485513Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2485805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2485896Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2486135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2486219Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2486502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2486606Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2487016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2487195Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2487487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2487707Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2487712Z 2025-08-14T21:41:11.2487805Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2487890Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2488002Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2488223Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2488295Z return mod(**inputs) 2025-08-14T21:41:11.2488597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2488673Z outputs = self.deberta( 2025-08-14T21:41:11.2489257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2489351Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2489646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2489748Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2489987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2490074Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2490372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2490472Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2490773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2490860Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2491127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2491327Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2491631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2491761Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2491765Z 2025-08-14T21:41:11.2491879Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2492076Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2492158Z return mod(**inputs) 2025-08-14T21:41:11.2492450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2492522Z outputs = self.deberta( 2025-08-14T21:41:11.2492814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2492892Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2493173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2493274Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2493507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2493597Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2493880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2494142Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2494415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2494491Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2494761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2494832Z context_layer = torch.bmm( 2025-08-14T21:41:11.2494835Z 2025-08-14T21:41:11.2494938Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2495141Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2495206Z return mod(**inputs) 2025-08-14T21:41:11.2495474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2495553Z outputs = self.deberta( 2025-08-14T21:41:11.2495817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2495899Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2496163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2496248Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2496475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2496555Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2496829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2496923Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2497195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2497285Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2497563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2497757Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2497768Z 2025-08-14T21:41:11.2497860Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2497937Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498021Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498095Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498169Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498250Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498328Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498404Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498487Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498561Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2498667Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2498864Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2498928Z return mod(**inputs) 2025-08-14T21:41:11.2499212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2499283Z outputs = self.deberta( 2025-08-14T21:41:11.2499562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2499646Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2499926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2500088Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2500321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2500404Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2500692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2500790Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2501073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2501162Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2501444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2501653Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2501968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2502096Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2502110Z 2025-08-14T21:41:11.2502212Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2502421Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2502494Z return mod(**inputs) 2025-08-14T21:41:11.2502985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2503064Z outputs = self.deberta( 2025-08-14T21:41:11.2503356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2503440Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2503733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2503825Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2504055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2504147Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2504429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2504526Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2504815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2504901Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2505194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2505413Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2505417Z 2025-08-14T21:41:11.2505526Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2505746Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2505815Z return mod(**inputs) 2025-08-14T21:41:11.2506110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2506181Z outputs = self.deberta( 2025-08-14T21:41:11.2506463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2506631Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2506965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2507057Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2507293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2507377Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2507663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2507760Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2508038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2508128Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2508415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2508637Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2508641Z 2025-08-14T21:41:11.2508725Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2508808Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2508926Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2509135Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2509205Z return mod(**inputs) 2025-08-14T21:41:11.2509497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2509567Z outputs = self.deberta( 2025-08-14T21:41:11.2509858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2509937Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2510220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2510319Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2510549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2510640Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2510920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2511016Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2511300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2511384Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2511666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2511875Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2512197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2512338Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2512342Z 2025-08-14T21:41:11.2512452Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2512657Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2512736Z return mod(**inputs) 2025-08-14T21:41:11.2513059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2513171Z outputs = self.deberta( 2025-08-14T21:41:11.2513457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2513535Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2513854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2513946Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2514177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2514269Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2514584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2514692Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2514979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2515061Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2515352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2515429Z context_layer = torch.bmm( 2025-08-14T21:41:11.2515433Z 2025-08-14T21:41:11.2515548Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2515756Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2515825Z return mod(**inputs) 2025-08-14T21:41:11.2516120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2516190Z outputs = self.deberta( 2025-08-14T21:41:11.2516480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2516563Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2516847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2516945Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2517177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2517259Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2517551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2517647Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2517950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2518038Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2518326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2518536Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2518540Z 2025-08-14T21:41:11.2518626Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2518719Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2518800Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2518882Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2518971Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2519052Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2519132Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2524216Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2524395Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2524518Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2524648Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2524873Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2524947Z return mod(**inputs) 2025-08-14T21:41:11.2525266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2525342Z outputs = self.deberta( 2025-08-14T21:41:11.2525637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2525727Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2526017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2526167Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2526415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2526504Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2526795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2526905Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2527195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2527287Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2527579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2527785Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2528133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2528278Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2528283Z 2025-08-14T21:41:11.2528403Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2528618Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2528691Z return mod(**inputs) 2025-08-14T21:41:11.2529094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2529171Z outputs = self.deberta( 2025-08-14T21:41:11.2529465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2529559Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2529859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2529960Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2530190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2530276Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2530567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2530665Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2530945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2531034Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2531475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2531711Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2531716Z 2025-08-14T21:41:11.2531825Z cudagraph partition due to non gpu ops. 
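Here the attributed line is the batched attention-score matmul at modeling_deberta_v2.py:248. A runnable sketch of that expression with made-up shapes and a simple sqrt(head_dim) scale (the library computes its own scale factor, so treat the scale here as an assumption):

import torch

batch_heads, seq_len, head_dim = 24, 128, 64  # assumed sizes

query_layer = torch.randn(batch_heads, seq_len, head_dim)
key_layer = torch.randn(batch_heads, seq_len, head_dim)

# Scale kept as a tensor so the .to(dtype=...) from the traced line is preserved.
scale = torch.sqrt(torch.tensor(float(head_dim)))

attention_scores = torch.bmm(
    query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)
)
print(attention_scores.shape)  # torch.Size([24, 128, 128])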
Found from : 2025-08-14T21:41:11.2532033Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2532109Z return mod(**inputs) 2025-08-14T21:41:11.2532396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2532472Z outputs = self.deberta( 2025-08-14T21:41:11.2532752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2532828Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2533121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2533212Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2533449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2533545Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2533840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2533940Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2534219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2534314Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2534577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2534788Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2534799Z 2025-08-14T21:41:11.2534877Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2534953Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2535062Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2535255Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2535319Z return mod(**inputs) 2025-08-14T21:41:11.2535594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2535660Z outputs = self.deberta( 2025-08-14T21:41:11.2535931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2536007Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2536290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2536391Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2536628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2536714Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2537012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2537111Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2537409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2537493Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2537873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2538093Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2538416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2538555Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2538559Z 2025-08-14T21:41:11.2538661Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2538857Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2538933Z return mod(**inputs) 2025-08-14T21:41:11.2539206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2539286Z outputs = self.deberta( 2025-08-14T21:41:11.2539557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2539631Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2539904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2539990Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2540209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2540295Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2540564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2540661Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2540932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2541009Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2541283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2541354Z context_layer = torch.bmm( 2025-08-14T21:41:11.2541358Z 2025-08-14T21:41:11.2541466Z cudagraph partition due to non gpu ops. 
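This trace stops at a torch.bmm call that the traceback shows only partially (modeling_deberta_v2.py:268). A hedged sketch, assuming the operands are softmax-normalised attention probabilities and the value layer:

import torch

batch_heads, seq_len, head_dim = 24, 128, 64  # assumed sizes

# Assumed operands for the truncated bmm at modeling_deberta_v2.py:268 (a sketch,
# not the library source): normalised scores against the value projection.
attention_probs = torch.softmax(torch.randn(batch_heads, seq_len, seq_len), dim=-1)
value_layer = torch.randn(batch_heads, seq_len, head_dim)

context_layer = torch.bmm(attention_probs, value_layer)
print(context_layer.shape)  # torch.Size([24, 128, 64])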
Found from : 2025-08-14T21:41:11.2541663Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2541727Z return mod(**inputs) 2025-08-14T21:41:11.2542018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2542089Z outputs = self.deberta( 2025-08-14T21:41:11.2542370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2542460Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2542739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2542835Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2543063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2543148Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2543436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2543531Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2543811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2543952Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2544261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2544467Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2544471Z 2025-08-14T21:41:11.2544556Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2544638Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2544727Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2544806Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2544893Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2544972Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2545050Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2545136Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2545215Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2545295Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2545414Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2545620Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2545689Z return mod(**inputs) 2025-08-14T21:41:11.2545985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2546055Z outputs = self.deberta( 2025-08-14T21:41:11.2546346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2546425Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2546706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2546804Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2547040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2547125Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2547411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2547509Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2547796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2547873Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2548138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2548334Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2548642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2548779Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2548783Z 2025-08-14T21:41:11.2548889Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2549096Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2549174Z return mod(**inputs) 2025-08-14T21:41:11.2549461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2549540Z outputs = self.deberta( 2025-08-14T21:41:11.2549821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2549898Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2550245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2550353Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2550582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2550673Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2550954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2551059Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2551338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2551418Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2551707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2551931Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2551935Z 2025-08-14T21:41:11.2552051Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2552258Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2552326Z return mod(**inputs) 2025-08-14T21:41:11.2552621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2552691Z outputs = self.deberta( 2025-08-14T21:41:11.2552974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2553059Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2553338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2553443Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2553672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2553756Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2554046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2554142Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2554429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2554508Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2554787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2555014Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2555018Z 2025-08-14T21:41:11.2555102Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2555192Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2555301Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2555507Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2555582Z return mod(**inputs) 2025-08-14T21:41:11.2555868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2555939Z outputs = self.deberta( 2025-08-14T21:41:11.2556227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2556302Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2556669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2556761Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2556990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2557081Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2557361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2557457Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2557779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2557860Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2558176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2558383Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2558703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2558849Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2558853Z 2025-08-14T21:41:11.2558960Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2559176Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2559245Z return mod(**inputs) 2025-08-14T21:41:11.2559559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2559639Z outputs = self.deberta( 2025-08-14T21:41:11.2559929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2560011Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2560295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2560384Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2560620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2560704Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2561013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2561117Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2561431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2561527Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2561807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2561883Z context_layer = torch.bmm( 2025-08-14T21:41:11.2561887Z 2025-08-14T21:41:11.2562001Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2562208Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2562286Z return mod(**inputs) 2025-08-14T21:41:11.2562594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2562667Z outputs = self.deberta( 2025-08-14T21:41:11.2562965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2563081Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2563428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2563532Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2563782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2563872Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2564189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2564283Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2564603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2564684Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2564978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2565188Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2565192Z 2025-08-14T21:41:11.2565275Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565367Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565448Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565530Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565621Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565701Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565782Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565871Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2565951Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2566039Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2566153Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2566371Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2566450Z return mod(**inputs) 2025-08-14T21:41:11.2566748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2566820Z outputs = self.deberta( 2025-08-14T21:41:11.2567123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2567200Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2567500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2567593Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2567830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2567932Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2568225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2568331Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2568622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2568704Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2569137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2569340Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2569671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2569911Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2569916Z 2025-08-14T21:41:11.2570029Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2570263Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2570332Z return mod(**inputs) 2025-08-14T21:41:11.2570627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2570708Z outputs = self.deberta( 2025-08-14T21:41:11.2570994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2571080Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2571366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2571462Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2571704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2571789Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2572070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2572177Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2572463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2572551Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2572834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2573054Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2573062Z 2025-08-14T21:41:11.2573179Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2573390Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2573466Z return mod(**inputs) 2025-08-14T21:41:11.2573758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2573828Z outputs = self.deberta( 2025-08-14T21:41:11.2574117Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2574194Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2574476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2574575Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2574811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2574903Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2575188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2575283Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2575575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2575654Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2575946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2576163Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2576209Z 2025-08-14T21:41:11.2576334Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2576428Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2576538Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2576755Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2576824Z return mod(**inputs) 2025-08-14T21:41:11.2577114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2577193Z outputs = self.deberta( 2025-08-14T21:41:11.2577491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2577568Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2577862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2577957Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2578198Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2578282Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2578565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2578671Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2578971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2579052Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2579347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2579555Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2579872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2580005Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2580008Z 2025-08-14T21:41:11.2580110Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2580313Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2580379Z return mod(**inputs) 2025-08-14T21:41:11.2580661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2580728Z outputs = self.deberta( 2025-08-14T21:41:11.2580997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2581084Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2581352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2581444Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2581668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2581746Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2582024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2582121Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2582405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2582521Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2582852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2582933Z context_layer = torch.bmm( 2025-08-14T21:41:11.2582936Z 2025-08-14T21:41:11.2583039Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2583235Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2583308Z return mod(**inputs) 2025-08-14T21:41:11.2583578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2583651Z outputs = self.deberta( 2025-08-14T21:41:11.2583946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2584021Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2584331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2584420Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2584652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2584743Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2585046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2585151Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2585449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2585529Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2585827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2586019Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2586023Z 2025-08-14T21:41:11.2586110Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586187Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586262Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586345Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586419Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586493Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586575Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586649Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586724Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586806Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2586908Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2587122Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2587195Z return mod(**inputs) 2025-08-14T21:41:11.2587482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2587560Z outputs = self.deberta( 2025-08-14T21:41:11.2587841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2587918Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2588206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2588298Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2588545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2588631Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2589019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2589126Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2589408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2589494Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2589777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2589974Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2590308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2590439Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2590446Z 2025-08-14T21:41:11.2590557Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2590751Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2590815Z return mod(**inputs) 2025-08-14T21:41:11.2591094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2591164Z outputs = self.deberta( 2025-08-14T21:41:11.2591446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2591530Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2591812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2591913Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2592150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2592233Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2592523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2592619Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2592902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2592991Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2593273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2593496Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2593502Z 2025-08-14T21:41:11.2593612Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2593819Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2593896Z return mod(**inputs) 2025-08-14T21:41:11.2594185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2594262Z outputs = self.deberta( 2025-08-14T21:41:11.2594545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2594621Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2594910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2595000Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2595261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2595396Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2595678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2595782Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2596064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2596145Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2596434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2596647Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2596651Z 2025-08-14T21:41:11.2596743Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2596828Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2596943Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2597164Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2597235Z return mod(**inputs) 2025-08-14T21:41:11.2597541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2597620Z outputs = self.deberta( 2025-08-14T21:41:11.2597902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2597998Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2598297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2598390Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2598632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2598715Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2599006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2599104Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2599402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2599494Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2599792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2600000Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2600330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2600467Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2600470Z 2025-08-14T21:41:11.2600586Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2600790Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2600858Z return mod(**inputs) 2025-08-14T21:41:11.2601167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2601237Z outputs = self.deberta( 2025-08-14T21:41:11.2601538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2601644Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2601979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2602079Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2602310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2602400Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2602854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2602958Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2603250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2603333Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2603624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2603717Z context_layer = torch.bmm( 2025-08-14T21:41:11.2603721Z 2025-08-14T21:41:11.2603834Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2604054Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2604126Z return mod(**inputs) 2025-08-14T21:41:11.2604422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2604505Z outputs = self.deberta( 2025-08-14T21:41:11.2604809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2604894Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2605197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2605296Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2605544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2605629Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2605916Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2606025Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2606329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2606421Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2606726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2606929Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2606937Z 2025-08-14T21:41:11.2607030Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607114Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607204Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607285Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607367Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607452Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607532Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607613Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607701Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607781Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2607893Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2608115Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2608304Z return mod(**inputs) 2025-08-14T21:41:11.2608662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2608821Z outputs = self.deberta( 2025-08-14T21:41:11.2609122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2609211Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2609502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2609596Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2609840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2609926Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2610225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2610333Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2610633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2610725Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2611007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2611214Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2611538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2611675Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2611681Z 2025-08-14T21:41:11.2611800Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2612009Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2612087Z return mod(**inputs) 2025-08-14T21:41:11.2612374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2612449Z outputs = self.deberta( 2025-08-14T21:41:11.2612745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2612823Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2613119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2613217Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2613448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2613544Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2613827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2613924Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2614212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2614294Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2614577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2614802Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2614806Z 2025-08-14T21:41:11.2614915Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2615217Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2615287Z return mod(**inputs) 2025-08-14T21:41:11.2615578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2615656Z outputs = self.deberta( 2025-08-14T21:41:11.2615938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2616020Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2616305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2616394Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2616634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2616721Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2617014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2617112Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2617397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2617486Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2617774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2617991Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2618002Z 2025-08-14T21:41:11.2618085Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2618166Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2618287Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2618496Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2618565Z return mod(**inputs) 2025-08-14T21:41:11.2618863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2618934Z outputs = self.deberta( 2025-08-14T21:41:11.2619224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2619300Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2619584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2619682Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2619915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2620004Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2620294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2620389Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2620680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2620760Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2621042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2621254Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2621576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2621792Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2621796Z 2025-08-14T21:41:11.2621905Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2622112Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2622187Z return mod(**inputs) 2025-08-14T21:41:11.2622474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2622544Z outputs = self.deberta( 2025-08-14T21:41:11.2622833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2622908Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2623197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2623299Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2623516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2623603Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2623870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2623968Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2624231Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2624307Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2624579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2624654Z context_layer = torch.bmm( 2025-08-14T21:41:11.2624657Z 2025-08-14T21:41:11.2624759Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2624962Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2625029Z return mod(**inputs) 2025-08-14T21:41:11.2625324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2625394Z outputs = self.deberta( 2025-08-14T21:41:11.2625676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2625759Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2626041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2626139Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2626374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2626457Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2626746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2626840Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2627123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2627211Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2627495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2627698Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2627741Z 2025-08-14T21:41:11.2627825Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2627941Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628031Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628112Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628191Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628282Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628361Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628450Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628528Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628608Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2628727Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2628934Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2629003Z return mod(**inputs) 2025-08-14T21:41:11.2629303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2629376Z outputs = self.deberta( 2025-08-14T21:41:11.2629669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2629741Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2630009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2630108Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2630336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2630421Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2630711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2630811Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2631101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2631181Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2631462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2631668Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2631988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2632132Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2632137Z 2025-08-14T21:41:11.2632244Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2632453Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2632533Z return mod(**inputs) 2025-08-14T21:41:11.2632824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2632903Z outputs = self.deberta( 2025-08-14T21:41:11.2633184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2633261Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2633556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2633646Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2633877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2634064Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2634382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2634489Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2634769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2634849Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2635138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2635355Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2635359Z 2025-08-14T21:41:11.2635473Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2635678Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2635748Z return mod(**inputs) 2025-08-14T21:41:11.2636044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2636116Z outputs = self.deberta( 2025-08-14T21:41:11.2636398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2636482Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2636759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2636855Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2637083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2637165Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2637463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2637560Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2637850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2637931Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2638248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2638469Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2638473Z 2025-08-14T21:41:11.2638555Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2638636Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2638748Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2638964Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2639040Z return mod(**inputs) 2025-08-14T21:41:11.2639330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2639400Z outputs = self.deberta( 2025-08-14T21:41:11.2639692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2639767Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2640088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2640178Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2640408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2640544Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2640856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2640953Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2641242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2641322Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2641610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2641809Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2642127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2642275Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2642281Z 2025-08-14T21:41:11.2642389Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2642601Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2642670Z return mod(**inputs) 2025-08-14T21:41:11.2642971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2643052Z outputs = self.deberta( 2025-08-14T21:41:11.2643350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2643425Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2643729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2643823Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2644063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2644147Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2644429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2644536Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2644833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2644921Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2645218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2645292Z context_layer = torch.bmm( 2025-08-14T21:41:11.2645297Z 2025-08-14T21:41:11.2645416Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2645623Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2645691Z return mod(**inputs) 2025-08-14T21:41:11.2645980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2646050Z outputs = self.deberta( 2025-08-14T21:41:11.2646353Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2646430Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2646724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2646821Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2647051Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2647222Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2647517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2647615Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2647926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2648008Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2648304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2648514Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2648518Z 2025-08-14T21:41:11.2648604Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2648700Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2648894Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2648985Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649078Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649160Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649242Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649335Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649415Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649506Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2649618Z cudagraph partition due to non gpu ops. 
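Editor's note: the last two traces end on complementary frames, the bmm that mixes attention probabilities with values and the view that restores the per-head layout. The sketch below combines both with invented shapes; it is an illustration, not the library code.

```python
# Illustrative sketch of the context-layer computation named in the traces:
# bmm(attention_probs, value_layer) followed by a view back to
# (batch, heads, seq, head_dim). Shapes are made up.
import torch

batch, num_heads, seq_len, head_dim = 2, 12, 128, 64
attention_probs = torch.softmax(torch.randn(batch * num_heads, seq_len, seq_len), dim=-1)
value_layer = torch.randn(batch * num_heads, seq_len, head_dim)

context_layer = torch.bmm(attention_probs, value_layer)      # (batch*heads, seq, head_dim)
context_layer = context_layer.view(-1, num_heads, context_layer.size(-2), context_layer.size(-1))
print(context_layer.shape)  # torch.Size([2, 12, 128, 64])
```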
Found from : 2025-08-14T21:41:11.2649832Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2649913Z return mod(**inputs) 2025-08-14T21:41:11.2650218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2650293Z outputs = self.deberta( 2025-08-14T21:41:11.2650588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2650665Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2650958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2651050Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2651281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2651376Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2651657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2651753Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2652047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2652130Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2652420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-08-14T21:41:11.2652618Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-08-14T21:41:11.2652935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2653080Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2653084Z 2025-08-14T21:41:11.2653193Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2653408Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2653520Z return mod(**inputs) 2025-08-14T21:41:11.2653848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2653930Z outputs = self.deberta( 2025-08-14T21:41:11.2654216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2654298Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2654580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2654670Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2654905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2654988Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2655273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2655381Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2655660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2655749Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2656029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2656244Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2656248Z 2025-08-14T21:41:11.2656363Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2656568Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2656646Z return mod(**inputs) 2025-08-14T21:41:11.2656936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2657006Z outputs = self.deberta( 2025-08-14T21:41:11.2657302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2657378Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2657668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2657765Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2657998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2658089Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2658378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2658475Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2658757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2658833Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2659115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-08-14T21:41:11.2659319Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-08-14T21:41:11.2659322Z 2025-08-14T21:41:11.2659401Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2659490Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2659598Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2659812Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2659928Z return mod(**inputs) 2025-08-14T21:41:11.2660259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2660340Z outputs = self.deberta( 2025-08-14T21:41:11.2660622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2660698Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2660987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2661079Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2661325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2661404Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2661675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2661779Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2662058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2662139Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2662429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-08-14T21:41:11.2662628Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-08-14T21:41:11.2662955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-08-14T21:41:11.2663094Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-08-14T21:41:11.2663100Z 2025-08-14T21:41:11.2663211Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2663440Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2663507Z return mod(**inputs) 2025-08-14T21:41:11.2663807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2663878Z outputs = self.deberta( 2025-08-14T21:41:11.2664158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2664243Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2664525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2664614Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2664857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2664942Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2665233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2665325Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2665589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2665675Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2665944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-08-14T21:41:11.2666022Z context_layer = torch.bmm( 2025-08-14T21:41:11.2666026Z 2025-08-14T21:41:11.2666126Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:11.2666393Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2666469Z return mod(**inputs) 2025-08-14T21:41:11.2666740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-08-14T21:41:11.2666809Z outputs = self.deberta( 2025-08-14T21:41:11.2667080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-08-14T21:41:11.2667151Z encoder_outputs = self.encoder( 2025-08-14T21:41:11.2667428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-08-14T21:41:11.2667514Z output_states, attn_weights = layer_module( 2025-08-14T21:41:11.2667743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:11.2667839Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:11.2668125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-08-14T21:41:11.2668229Z attention_output, att_matrix = self.attention( 2025-08-14T21:41:11.2668509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-08-14T21:41:11.2668589Z self_output, att_matrix = self.self( 2025-08-14T21:41:11.2668877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-08-14T21:41:11.2669071Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-08-14T21:41:11.2669074Z 2025-08-14T21:41:11.2669164Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669246Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669328Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669415Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669496Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669575Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669663Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669742Z cudagraph partition due to non gpu ops 2025-08-14T21:41:11.2669850Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:41:11.2670064Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2670131Z return mod(**inputs) 2025-08-14T21:41:11.2670424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1262, in forward 2025-08-14T21:41:11.2670536Z start_loss = loss_fct(start_logits, start_positions) 2025-08-14T21:41:11.2670540Z 2025-08-14T21:41:11.2670645Z cudagraph partition due to non gpu ops. 
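Editor's note: the final trace in this entry ends on the question-answering head's start-position loss, and the next entry ends on the matching end-position loss. Below is a minimal sketch of that span-prediction loss pattern with made-up shapes; the CrossEntropyLoss choice and the averaging are assumptions for illustration, not a quote of the model code.

```python
# Illustrative sketch of the start/end span loss the traces end on.
import torch
import torch.nn as nn

batch, seq_len = 2, 128
start_logits = torch.randn(batch, seq_len)
end_logits = torch.randn(batch, seq_len)
start_positions = torch.randint(0, seq_len, (batch,))
end_positions = torch.randint(0, seq_len, (batch,))

loss_fct = nn.CrossEntropyLoss()
start_loss = loss_fct(start_logits, start_positions)
end_loss = loss_fct(end_logits, end_positions)
total_loss = (start_loss + end_loss) / 2
print(float(total_loss))
```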
Found from : 2025-08-14T21:41:11.2670860Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:11.2670932Z return mod(**inputs) 2025-08-14T21:41:11.2671222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1263, in forward 2025-08-14T21:41:11.2671324Z end_loss = loss_fct(end_logits, end_positions) 2025-08-14T21:41:11.2671328Z 2025-08-14T21:41:25.0447098Z Compilation time (from dynamo_timed): 35.147737998 2025-08-14T21:41:25.0447431Z pass 2025-08-14T21:41:25.0447759Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:25.0448616Z TIMING: _recursive_pre_grad_passes:0.10015 _recursive_joint_graph_passes:0.95929 _recursive_post_grad_passes:0.34227 async_compile.wait:0.83701 code_gen:12.88227 inductor_compile:16.13412 backend_compile:29.38918 gc:0.00071 entire_frame_compile:35.14774 total_wall_time:35.14774 2025-08-14T21:41:25.0451153Z STATS: call_* op count: 1089 | FakeTensorMode.__torch_dispatch__:63404 | FakeTensor.__torch_dispatch__:8370 | ProxyTorchDispatchMode.__torch_dispatch__:16949 2025-08-14T21:41:25.0451783Z Dynamo produced 1 graphs covering 1089 ops with 0 graph breaks (0 unique) 2025-08-14T21:41:31.6038782Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:41:31.6039757Z from pkg_resources import resource_filename 2025-08-14T21:41:32.2426672Z 2025-08-14T21:41:33.0251134Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:41:33.0251448Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:41:33.0258339Z cpu eval DistilBertForMaskedLM 2025-08-14T21:41:33.3896672Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:33.4571784Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:33.5191514Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:40.2556811Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2557169Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2557404Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2557636Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2557872Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2558101Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2558331Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2558561Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2558788Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2559019Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2559250Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2559510Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2559757Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2560050Z cudagraph partition due to non gpu ops. 
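Editor's note: this entry also reports the compile-time accounting ("Compilation time (from dynamo_timed)", the TIMING breakdown, and the Dynamo graph/op counts). Those numbers come from PyTorch's internal instrumentation; the sketch below is not that instrumentation, only a rough illustration of the split it reports, i.e. the first call to a compiled module pays the compile cost while later calls measure steady-state time.

```python
# Illustrative sketch (assumption: not the harness's dynamo_timed accounting).
import time
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 8)).eval()
compiled = torch.compile(model)   # default Inductor backend, CPU tensors here
x = torch.randn(4, 64)

with torch.no_grad():
    t0 = time.perf_counter()
    compiled(x)                    # first call: Dynamo trace + Inductor codegen
    first_call = time.perf_counter() - t0

    t0 = time.perf_counter()
    for _ in range(10):
        compiled(x)                # warm calls: compiled code only
    steady = (time.perf_counter() - t0) / 10

print(f"first-call wall time {first_call:.3f}s, steady-state {steady * 1e3:.2f}ms")
```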
Found from : 2025-08-14T21:41:40.2560476Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2560854Z return mod(**inputs) 2025-08-14T21:41:40.2561339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2561823Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2562339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2562806Z return self.transformer( 2025-08-14T21:41:40.2563262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2563729Z layer_outputs = layer_module( 2025-08-14T21:41:40.2564113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2564520Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2564987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2565441Z sa_output = self.attention( 2025-08-14T21:41:40.2565887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2566406Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2566608Z 2025-08-14T21:41:40.2566702Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2566963Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2567187Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2567798Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2568078Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2568398Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2568628Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2569091Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2569318Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2569540Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2569767Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2569985Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2570208Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2570472Z cudagraph partition due to non gpu ops. 
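Editor's note: the DistilBERT traces all end on torch.nn.functional.scaled_dot_product_attention. A minimal sketch of that call with invented shapes and no attention mask follows.

```python
# Illustrative sketch of the fused attention call named in the DistilBERT traces.
import torch
import torch.nn.functional as F

batch, num_heads, seq_len, head_dim = 2, 12, 128, 64
q = torch.randn(batch, num_heads, seq_len, head_dim)
k = torch.randn(batch, num_heads, seq_len, head_dim)
v = torch.randn(batch, num_heads, seq_len, head_dim)

# Computes softmax(q @ k^T / sqrt(head_dim)) @ v in one fused op.
attn_output = F.scaled_dot_product_attention(q, k, v)
print(attn_output.shape)  # torch.Size([2, 12, 128, 64])
```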
Found from : 2025-08-14T21:41:40.2570870Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2571240Z return mod(**inputs) 2025-08-14T21:41:40.2571684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2572173Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2572617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2573061Z return self.transformer( 2025-08-14T21:41:40.2573495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2573942Z layer_outputs = layer_module( 2025-08-14T21:41:40.2574315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2574715Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2575164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2575616Z sa_output = self.attention( 2025-08-14T21:41:40.2576048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2576556Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2576758Z 2025-08-14T21:41:40.2576852Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2577081Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2577304Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2577526Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2577748Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2577963Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2578183Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2578402Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2578618Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2578840Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2579065Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2579284Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2579512Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2579771Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:40.2580167Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2580515Z return mod(**inputs) 2025-08-14T21:41:40.2580943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2581395Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2581828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2582277Z return self.transformer( 2025-08-14T21:41:40.2582709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2583202Z layer_outputs = layer_module( 2025-08-14T21:41:40.2583612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2584014Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2584465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2584904Z sa_output = self.attention( 2025-08-14T21:41:40.2585339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2585850Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2586051Z 2025-08-14T21:41:40.2586148Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2586369Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2586593Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2586821Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2587039Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2587264Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2587486Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2587708Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2587926Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2588140Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2588352Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2588564Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2588783Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2589034Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:40.2589419Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2589776Z return mod(**inputs) 2025-08-14T21:41:40.2590203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2590663Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2591088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2591523Z return self.transformer( 2025-08-14T21:41:40.2591952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2592386Z layer_outputs = layer_module( 2025-08-14T21:41:40.2592767Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2593162Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2593614Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2594051Z sa_output = self.attention( 2025-08-14T21:41:40.2594494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2594999Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2595209Z 2025-08-14T21:41:40.2595302Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2595523Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2595751Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2595974Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2596190Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2596411Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2596634Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2596849Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2597071Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2597293Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2597509Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2597774Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2598028Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2598285Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:40.2598677Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2599045Z return mod(**inputs) 2025-08-14T21:41:40.2599470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2599912Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2600355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2600802Z return self.transformer( 2025-08-14T21:41:40.2601232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2601675Z layer_outputs = layer_module( 2025-08-14T21:41:40.2602066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2602467Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2603233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2603686Z sa_output = self.attention( 2025-08-14T21:41:40.2604122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2604625Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2604822Z 2025-08-14T21:41:40.2604907Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2605139Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2605368Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2605589Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2605816Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2606043Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2606297Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2606510Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2606732Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2606954Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2607167Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2607387Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2607606Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2607864Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:40.2608251Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2608604Z return mod(**inputs) 2025-08-14T21:41:40.2609097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-08-14T21:41:40.2609551Z dlbrt_output = self.distilbert( 2025-08-14T21:41:40.2609996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:41:40.2610439Z return self.transformer( 2025-08-14T21:41:40.2610871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:41:40.2611309Z layer_outputs = layer_module( 2025-08-14T21:41:40.2611688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:41:40.2612085Z return super().__call__(*args, **kwargs) 2025-08-14T21:41:40.2612530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:41:40.2612972Z sa_output = self.attention( 2025-08-14T21:41:40.2613561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:41:40.2614073Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:41:40.2614270Z 2025-08-14T21:41:40.2614357Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2614588Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2614814Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2615031Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2615253Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2615475Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2615696Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2615910Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2616135Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2616356Z cudagraph partition due to non gpu ops 2025-08-14T21:41:40.2616603Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:41:40.2617007Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:41:40.2617363Z return mod(**inputs) 2025-08-14T21:41:40.2617782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 843, in forward 2025-08-14T21:41:40.2618375Z mlm_loss = self.mlm_loss_fct(prediction_logits.view(-1, prediction_logits.size(-1)), labels.view(-1)) 2025-08-14T21:41:40.2618651Z 2025-08-14T21:41:48.1895318Z Compilation time (from dynamo_timed): 13.446372597 2025-08-14T21:41:48.1895640Z pass 2025-08-14T21:41:48.1903791Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:48.1904831Z TIMING: _recursive_pre_grad_passes:0.02121 _recursive_joint_graph_passes:0.29106 _recursive_post_grad_passes:0.05954 async_compile.wait:0.81796 code_gen:7.63724 inductor_compile:8.79862 backend_compile:11.84141 gc:0.00032 entire_frame_compile:13.44637 total_wall_time:13.44637 2025-08-14T21:41:48.1906056Z STATS: call_* op count: 155 | FakeTensorMode.__torch_dispatch__:14301 | FakeTensor.__torch_dispatch__:1794 | ProxyTorchDispatchMode.__torch_dispatch__:3779 2025-08-14T21:41:48.1906601Z Dynamo produced 1 graphs covering 155 ops with 0 graph breaks (0 unique) 2025-08-14T21:41:53.7784313Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:41:53.7788836Z from pkg_resources import resource_filename 2025-08-14T21:41:54.4389751Z 2025-08-14T21:41:55.0379104Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:41:55.0383499Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:41:55.0383793Z cpu eval DistilBertForQuestionAnswering 2025-08-14T21:41:55.3531336Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:55.4151232Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:41:55.4864229Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:42:02.1454291Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1454610Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1454842Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1455065Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1455296Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1455547Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1455814Z cudagraph partition due to non gpu ops. 
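Editor's note: this entry ends the DistilBertForMaskedLM run and starts the DistilBertForQuestionAnswering run, and it repeats the recurring warning about calling empty_gpu_cache on a CPU device. The helper below is hypothetical (it is not the benchmark's empty_gpu_cache); it only sketches the device guard that the warning implies.

```python
# Hypothetical helper sketching the guard behind the repeated warning:
# only CUDA/XPU devices have an allocator cache to drop, so a CPU run warns.
import warnings
import torch

def empty_accelerator_cache(device: str) -> None:
    if device == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()
    else:
        warnings.warn(f"no accelerator cache to empty for device: {device}")

empty_accelerator_cache("cpu")  # mirrors the warning seen throughout this log
```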
Found from : 2025-08-14T21:42:02.1456230Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:42:02.1457078Z return mod(**inputs) 2025-08-14T21:42:02.1457780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1043, in forward 2025-08-14T21:42:02.1458315Z logits = self.qa_outputs(hidden_states) # (bs, max_query_len, 2) 2025-08-14T21:42:02.1458515Z 2025-08-14T21:42:02.1458626Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1458861Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1459096Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1459317Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1459528Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1459744Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1459961Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1460202Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:02.1460602Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:42:02.1460968Z return mod(**inputs) 2025-08-14T21:42:02.1461422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-08-14T21:42:02.1461904Z distilbert_output = self.distilbert( 2025-08-14T21:42:02.1462394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-08-14T21:42:02.1462869Z return self.transformer( 2025-08-14T21:42:02.1463319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-08-14T21:42:02.1463765Z layer_outputs = layer_module( 2025-08-14T21:42:02.1464145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:02.1464540Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:02.1465039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-08-14T21:42:02.1465505Z sa_output = self.attention( 2025-08-14T21:42:02.1465955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-08-14T21:42:02.1466479Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:42:02.1466704Z 2025-08-14T21:42:02.1466792Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1467034Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1467256Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1467466Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1467684Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1467901Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1468115Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1468337Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1468560Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1468778Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1469004Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1469228Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1469451Z cudagraph partition due to non gpu ops 2025-08-14T21:42:02.1469700Z cudagraph partition due to non gpu ops. 
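Editor's note: the first trace in this entry ends on the QA output head, a Linear layer producing two logits per token. The sketch below illustrates that head and the usual split into start/end scores; sizes and the split/squeeze steps are assumptions for illustration.

```python
# Illustrative sketch of the qa_outputs head named in the trace above.
import torch
import torch.nn as nn

batch, seq_len, hidden = 2, 128, 768
hidden_states = torch.randn(batch, seq_len, hidden)

qa_outputs = nn.Linear(hidden, 2)
logits = qa_outputs(hidden_states)                  # (bs, max_query_len, 2)
start_logits, end_logits = logits.split(1, dim=-1)
start_logits = start_logits.squeeze(-1)             # (bs, max_query_len)
end_logits = end_logits.squeeze(-1)
print(start_logits.shape, end_logits.shape)
```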
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward
    distilbert_output = self.distilbert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward
    return self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward
    sa_output = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1061, in forward
    start_loss = loss_fct(start_logits, start_positions)

cudagraph partition due to non gpu ops.
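The trace just above points at modeling_distilbert.py line 1061, where the start-position loss is computed; the matching end-position loss (line 1062) appears in the next block. A minimal sketch of that span loss, assuming the usual cross-entropy-and-average convention and illustrative shapes:

import torch
import torch.nn as nn

# Hypothetical reproduction of the span loss referenced at lines 1061-1062.
bs, seq_len = 8, 128
start_logits = torch.randn(bs, seq_len)
end_logits = torch.randn(bs, seq_len)
start_positions = torch.randint(0, seq_len, (bs,))
end_positions = torch.randint(0, seq_len, (bs,))

loss_fct = nn.CrossEntropyLoss()
start_loss = loss_fct(start_logits, start_positions)
end_loss = loss_fct(end_logits, end_positions)
total_loss = (start_loss + end_loss) / 2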
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1062, in forward
    end_loss = loss_fct(end_logits, end_positions)

2025-08-14T21:42:09.8717439Z Compilation time (from dynamo_timed): 13.208792778
2025-08-14T21:42:09.8717961Z pass
2025-08-14T21:42:09.8721577Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:42:09.8722494Z TIMING: _recursive_pre_grad_passes:0.0215 _recursive_joint_graph_passes:0.28784 _recursive_post_grad_passes:0.06241 async_compile.wait:0.74844 code_gen:7.46389 inductor_compile:8.63171 backend_compile:11.6474 gc:0.00025 entire_frame_compile:13.20879 total_wall_time:13.20879
2025-08-14T21:42:09.8723638Z STATS: call_* op count: 163 | FakeTensorMode.__torch_dispatch__:14171 | FakeTensor.__torch_dispatch__:1806 | ProxyTorchDispatchMode.__torch_dispatch__:3793
2025-08-14T21:42:09.8724306Z Dynamo produced 1 graphs covering 163 ops with 0 graph breaks (0 unique)
2025-08-14T21:42:15.5232370Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:42:15.5233441Z from pkg_resources import resource_filename
2025-08-14T21:42:18.5136315Z loading model: 0it [00:00, ?it/s]`loss_type=None` was set in the config but it is unrecognised.Using the default loss: `ForCausalLMLoss`.
2025-08-14T21:42:18.5137097Z WARNING:transformers.modeling_utils:`loss_type=None` was set in the config but it is unrecognised.Using the default loss: `ForCausalLMLoss`.
2025-08-14T21:42:18.5471413Z loading model: 0it [00:02, ?it/s]
2025-08-14T21:42:18.5471735Z cpu eval DistillGPT2
2025-08-14T21:42:18.9774915Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:42:19.1151497Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:42:19.3017358Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
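The TIMING line above breaks the 13.2 s compile down by phase. A hypothetical helper (parse_timing is not part of the benchmark suite) for turning such a line into a dict, shown here on an abridged copy of that line:

import re

timing_line = (
    "TIMING: _recursive_pre_grad_passes:0.0215 _recursive_joint_graph_passes:0.28784 "
    "inductor_compile:8.63171 backend_compile:11.6474 "
    "entire_frame_compile:13.20879 total_wall_time:13.20879"
)

def parse_timing(line):
    # Pull "<phase>:<seconds>" pairs out of the line; keys containing dots
    # (e.g. async_compile.wait) would be truncated by \w+, fine for a sketch.
    return {k: float(v) for k, v in re.findall(r"(\w+):([\d.]+)", line)}

phases = parse_timing(timing_line)
assert phases["entire_frame_compile"] == phases["total_wall_time"] == 13.20879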
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
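Every GPT-2 trace in this run bottoms out in transformers/pytorch_utils.py line 116, the Conv1D projection, which is a linear layer written as torch.addmm with the weight stored as (in_features, out_features). A minimal sketch with illustrative sizes (conv1d_forward is a hypothetical stand-alone version):

import torch

def conv1d_forward(x, weight, bias):
    # Same shape bookkeeping as the traced line: flatten to 2D, addmm, restore shape.
    size_out = x.size()[:-1] + (weight.size(-1),)
    out = torch.addmm(bias, x.view(-1, x.size(-1)), weight)
    return out.view(size_out)

hidden = 768
x = torch.randn(2, 5, hidden)             # (batch, seq, hidden)
weight = torch.randn(hidden, 3 * hidden)  # c_attn projects to q, k and v at once
bias = torch.randn(3 * hidden)
q, k, v = conv1d_forward(x, weight, bias).split(hidden, dim=2)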
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops.
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops.
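The two traces above land in transformers/integrations/sdpa_attention.py at lines 81 and 91: the fused scaled_dot_product_attention call and the transpose back to (batch, seq, heads, head_dim). A minimal sketch with illustrative shapes:

import torch
import torch.nn.functional as F

bs, n_heads, seq, head_dim = 2, 12, 64, 64
q = torch.randn(bs, n_heads, seq, head_dim)
k = torch.randn(bs, n_heads, seq, head_dim)
v = torch.randn(bs, n_heads, seq, head_dim)

# Fused attention on (batch, heads, seq, head_dim)...
attn_output = F.scaled_dot_product_attention(q, k, v, is_causal=True)
# ...then back to (batch, seq, heads, head_dim) and flattened for the output projection.
attn_output = attn_output.transpose(1, 2).contiguous()
attn_output = attn_output.reshape(bs, seq, n_heads * head_dim)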
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
    hidden_states = self.act(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

cudagraph partition due to non gpu ops.
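The activation frame above (transformers/activations.py line 47) is the tanh approximation of GELU used by GPT-2. As a sanity check, it should agree with torch.nn.functional.gelu(x, approximate="tanh") (the approximate keyword is available in PyTorch 1.12 and later) to floating-point tolerance:

import math
import torch
import torch.nn.functional as F

def gelu_new(x):
    # Same formula as the traced line.
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))

x = torch.randn(1024)
assert torch.allclose(gelu_new(x), F.gelu(x, approximate="tanh"), atol=1e-6)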
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
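Lines 365-367 of modeling_gpt2.py, which the last few traces walk through, are the MLP's c_fc, activation and c_proj steps. A compact, hypothetical stand-in (MLPSketch is not the transformers class; nn.Linear replaces Conv1D, which computes the same thing with the weight transposed):

import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPSketch(nn.Module):
    def __init__(self, hidden=768, inner=4 * 768):
        super().__init__()
        self.c_fc = nn.Linear(hidden, inner)    # stand-in for Conv1D c_fc
        self.c_proj = nn.Linear(inner, hidden)  # stand-in for Conv1D c_proj

    def forward(self, hidden_states):
        hidden_states = self.c_fc(hidden_states)
        hidden_states = F.gelu(hidden_states, approximate="tanh")
        return self.c_proj(hidden_states)

out = MLPSketch()(torch.randn(2, 64, 768))  # (batch, seq, hidden) in and out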
Found from : 2025-08-14T21:42:27.3993391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.3993824Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.3994245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.3994645Z outputs = block( 2025-08-14T21:42:27.3994993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.3995389Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.3995792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.3996198Z return func(*args, **kwargs) 2025-08-14T21:42:27.3996597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.3997027Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.3997441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.3997972Z return func(*args, **kwargs) 2025-08-14T21:42:27.3998373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:42:27.3998907Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:42:27.3999407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.3999838Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4000023Z 2025-08-14T21:42:27.4000144Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4000585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4001021Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4001444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4001855Z outputs = block( 2025-08-14T21:42:27.4002203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4002596Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4003320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4003775Z return func(*args, **kwargs) 2025-08-14T21:42:27.4004179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4004624Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4005053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4005466Z return func(*args, **kwargs) 2025-08-14T21:42:27.4005875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:42:27.4006397Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:42:27.4006885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4007299Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4007486Z 2025-08-14T21:42:27.4007574Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4007802Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4008027Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4008258Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4008516Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4009027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4009463Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4009899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4010304Z outputs = block( 2025-08-14T21:42:27.4010656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4011044Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4011441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4011835Z return func(*args, **kwargs) 2025-08-14T21:42:27.4012215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4012637Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4013978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4014421Z return func(*args, **kwargs) 2025-08-14T21:42:27.4014817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:42:27.4015254Z attn_output, attn_weights = attention_interface( 2025-08-14T21:42:27.4015730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:42:27.4016237Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:42:27.4016442Z 2025-08-14T21:42:27.4016557Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4017005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4017434Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4017851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4018253Z outputs = block( 2025-08-14T21:42:27.4018600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4018983Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4019389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4019789Z return func(*args, **kwargs) 2025-08-14T21:42:27.4020184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4020597Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4021008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4021400Z return func(*args, **kwargs) 2025-08-14T21:42:27.4021778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:42:27.4022208Z attn_output, attn_weights = attention_interface( 2025-08-14T21:42:27.4022679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:42:27.4023169Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:42:27.4023344Z 2025-08-14T21:42:27.4023460Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4023906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4024336Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4024750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4025147Z outputs = block( 2025-08-14T21:42:27.4025497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4025888Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4026289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4026690Z return func(*args, **kwargs) 2025-08-14T21:42:27.4027084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4027508Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4027913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4028302Z return func(*args, **kwargs) 2025-08-14T21:42:27.4028677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:42:27.4029142Z attn_output = self.c_proj(attn_output) 2025-08-14T21:42:27.4029515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4029935Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4030116Z 2025-08-14T21:42:27.4030234Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4030666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4031097Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4031514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4031920Z outputs = block( 2025-08-14T21:42:27.4032250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4032632Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4033033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4033415Z return func(*args, **kwargs) 2025-08-14T21:42:27.4033803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4034217Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4034623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4035004Z return func(*args, **kwargs) 2025-08-14T21:42:27.4035389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:42:27.4035798Z attn_output = self.c_proj(attn_output) 2025-08-14T21:42:27.4036175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4036593Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4036781Z 2025-08-14T21:42:27.4036891Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4037329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4037736Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4038140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4038536Z outputs = block( 2025-08-14T21:42:27.4038874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4039244Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4039635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4040030Z return func(*args, **kwargs) 2025-08-14T21:42:27.4040408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4040840Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4041268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:42:27.4041672Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:42:27.4042039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4042448Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4042632Z 2025-08-14T21:42:27.4042740Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4043173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4043683Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4044090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4044482Z outputs = block( 2025-08-14T21:42:27.4044884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4045279Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4045677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4046070Z return func(*args, **kwargs) 2025-08-14T21:42:27.4046468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4046947Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4047401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:42:27.4047830Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:42:27.4048212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4048641Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4048909Z 2025-08-14T21:42:27.4049032Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4049475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4049926Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4050362Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4050759Z outputs = block( 2025-08-14T21:42:27.4051103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4051490Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4051885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4052278Z return func(*args, **kwargs) 2025-08-14T21:42:27.4052657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4053095Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4053521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:42:27.4053923Z hidden_states = self.act(hidden_states) 2025-08-14T21:42:27.4054292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:42:27.4054774Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:42:27.4055016Z 2025-08-14T21:42:27.4055135Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4055568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4055984Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4056390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4056779Z outputs = block( 2025-08-14T21:42:27.4057108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4057492Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4057888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4058330Z return func(*args, **kwargs) 2025-08-14T21:42:27.4058755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4059187Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4059611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:42:27.4073687Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:42:27.4074156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4074610Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4074801Z 2025-08-14T21:42:27.4074916Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4075375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4075830Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4076262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4076668Z outputs = block( 2025-08-14T21:42:27.4077019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4077402Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4077813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4078223Z return func(*args, **kwargs) 2025-08-14T21:42:27.4078617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4079063Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4079503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:42:27.4079930Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:42:27.4080307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4080753Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4080945Z 2025-08-14T21:42:27.4081075Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4081524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4081942Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4082354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4082754Z outputs = block( 2025-08-14T21:42:27.4083093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4083490Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4083892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4084294Z return func(*args, **kwargs) 2025-08-14T21:42:27.4084678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4085102Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4085513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4085904Z return func(*args, **kwargs) 2025-08-14T21:42:27.4086303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:42:27.4086993Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:42:27.4087804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4088261Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4088474Z 2025-08-14T21:42:27.4088590Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4089123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4089567Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4089991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4090391Z outputs = block( 2025-08-14T21:42:27.4090734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4091118Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4091526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4091925Z return func(*args, **kwargs) 2025-08-14T21:42:27.4092318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4092734Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4093145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4093543Z return func(*args, **kwargs) 2025-08-14T21:42:27.4093934Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:42:27.4094446Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:42:27.4094943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4095369Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4095552Z 2025-08-14T21:42:27.4095645Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4095886Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4096116Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4096341Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4096595Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4097059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4097492Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4097912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4098325Z outputs = block( 2025-08-14T21:42:27.4098688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4099087Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4099492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4099906Z return func(*args, **kwargs) 2025-08-14T21:42:27.4100310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4100737Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4101160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4101565Z return func(*args, **kwargs) 2025-08-14T21:42:27.4101969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:42:27.4102459Z attn_output, attn_weights = attention_interface( 2025-08-14T21:42:27.4103251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:42:27.4103788Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:42:27.4103990Z 2025-08-14T21:42:27.4104112Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:42:27.4104561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4104998Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4105426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4105838Z outputs = block(
2025-08-14T21:42:27.4106203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4106614Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4107038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4107446Z return func(*args, **kwargs)
2025-08-14T21:42:27.4107886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:42:27.4108335Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:42:27.4108760Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4109177Z return func(*args, **kwargs)
2025-08-14T21:42:27.4109590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:42:27.4110051Z attn_output, attn_weights = attention_interface(
2025-08-14T21:42:27.4110544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:42:27.4111040Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:42:27.4111221Z
2025-08-14T21:42:27.4111333Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:42:27.4111780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4112199Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4112633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4113050Z outputs = block(
2025-08-14T21:42:27.4113387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4113789Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4114200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4114613Z return func(*args, **kwargs)
2025-08-14T21:42:27.4114999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:42:27.4115423Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:42:27.4115835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4116235Z return func(*args, **kwargs)
2025-08-14T21:42:27.4116621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:42:27.4117050Z attn_output = self.c_proj(attn_output)
2025-08-14T21:42:27.4117443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4117877Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4118138Z
2025-08-14T21:42:27.4118300Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:42:27.4118753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4119183Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4119597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4120003Z outputs = block(
2025-08-14T21:42:27.4120365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4120770Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4121184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4121600Z return func(*args, **kwargs)
2025-08-14T21:42:27.4122005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:42:27.4122423Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:42:27.4122840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4123245Z return func(*args, **kwargs)
2025-08-14T21:42:27.4123646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:42:27.4124074Z attn_output = self.c_proj(attn_output)
2025-08-14T21:42:27.4124462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4124907Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4125093Z
2025-08-14T21:42:27.4125207Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:42:27.4125672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4126113Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4126543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4126950Z outputs = block(
2025-08-14T21:42:27.4127305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4127706Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4128124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4128533Z return func(*args, **kwargs)
2025-08-14T21:42:27.4129008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:42:27.4129472Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:42:27.4129909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:42:27.4130333Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:42:27.4130722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4131149Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4131337Z
2025-08-14T21:42:27.4131451Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:42:27.4131904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4132341Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4132754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4133210Z outputs = block(
2025-08-14T21:42:27.4133593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4133987Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4134392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4134797Z return func(*args, **kwargs)
2025-08-14T21:42:27.4135202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:42:27.4135651Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:42:27.4136085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:42:27.4136513Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:42:27.4136900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4137334Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4137530Z
2025-08-14T21:42:27.4137643Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:42:27.4138097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4138529Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4138947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4139354Z outputs = block(
2025-08-14T21:42:27.4139706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4140100Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4140516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4140916Z return func(*args, **kwargs)
2025-08-14T21:42:27.4141312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:42:27.4141743Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:42:27.4142172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:42:27.4142585Z hidden_states = self.act(hidden_states)
2025-08-14T21:42:27.4142955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:42:27.4143432Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:42:27.4143689Z
2025-08-14T21:42:27.4143799Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:42:27.4144242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4144669Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4145070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4145466Z outputs = block(
2025-08-14T21:42:27.4145809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4146185Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4146588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4146983Z return func(*args, **kwargs)
2025-08-14T21:42:27.4147375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:42:27.4147807Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:42:27.4148318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:42:27.4148761Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:42:27.4149130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4149558Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4149749Z
2025-08-14T21:42:27.4149860Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:42:27.4150298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4150709Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4151116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4151508Z outputs = block(
2025-08-14T21:42:27.4151851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4152223Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4152617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4153011Z return func(*args, **kwargs)
2025-08-14T21:42:27.4153387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:42:27.4153816Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:42:27.4154239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:42:27.4154653Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:42:27.4155021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4155444Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4155620Z
2025-08-14T21:42:27.4155737Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:42:27.4156164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4156580Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4156984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4157377Z outputs = block(
2025-08-14T21:42:27.4157714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4158107Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4158513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4158919Z return func(*args, **kwargs)
2025-08-14T21:42:27.4159313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
2025-08-14T21:42:27.4159763Z hidden_states = residual + feed_forward_hidden_states
2025-08-14T21:42:27.4159946Z
2025-08-14T21:42:27.4160062Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:42:27.4160485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward
2025-08-14T21:42:27.4160904Z transformer_outputs = self.transformer(
2025-08-14T21:42:27.4161308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:42:27.4161703Z outputs = block(
2025-08-14T21:42:27.4162034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:42:27.4162435Z return super().__call__(*args, **kwargs)
2025-08-14T21:42:27.4162894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4163290Z return func(*args, **kwargs)
2025-08-14T21:42:27.4163672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:42:27.4164102Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:42:27.4164521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:42:27.4164924Z return func(*args, **kwargs)
2025-08-14T21:42:27.4165324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:42:27.4165868Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:42:27.4166371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:42:27.4166797Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:42:27.4166988Z
2025-08-14T21:42:27.4167101Z cudagraph partition due to non gpu ops.
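The activation frame that recurs in these traces (transformers/activations.py, line 47) is the tanh approximation of GELU used by GPT-2's MLP. Written out as a formula, this simply restates the code shown in the trace; it is not additional output from this job:

    \mathrm{gelu}(x) \approx 0.5\,x\left(1 + \tanh\!\left(\sqrt{2/\pi}\,\left(x + 0.044715\,x^{3}\right)\right)\right)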
Found from : 2025-08-14T21:42:27.4334883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4335317Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4335738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4336138Z outputs = block( 2025-08-14T21:42:27.4336487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4336881Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4337328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4337732Z return func(*args, **kwargs) 2025-08-14T21:42:27.4338129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4338554Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4338978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4339382Z return func(*args, **kwargs) 2025-08-14T21:42:27.4339785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:42:27.4340221Z attn_output, attn_weights = attention_interface( 2025-08-14T21:42:27.4340694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:42:27.4341196Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:42:27.4341376Z 2025-08-14T21:42:27.4341487Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4341932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4342353Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4342852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4343304Z outputs = block( 2025-08-14T21:42:27.4343656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4344047Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4344518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4344933Z return func(*args, **kwargs) 2025-08-14T21:42:27.4345335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4345766Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4346195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4346597Z return func(*args, **kwargs) 2025-08-14T21:42:27.4347000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:42:27.4347411Z attn_output = self.c_proj(attn_output) 2025-08-14T21:42:27.4347782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4348200Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4348382Z 2025-08-14T21:42:27.4348493Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4348936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4349367Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4349635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4349702Z outputs = block( 2025-08-14T21:42:27.4349953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4350039Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4350300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4350374Z return func(*args, **kwargs) 2025-08-14T21:42:27.4350646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:42:27.4350742Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:42:27.4350991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4351068Z return func(*args, **kwargs) 2025-08-14T21:42:27.4351320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:42:27.4351409Z attn_output = self.c_proj(attn_output) 2025-08-14T21:42:27.4351644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4351768Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4351772Z 2025-08-14T21:42:27.4351882Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4352152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4352240Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4352502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4352567Z outputs = block( 2025-08-14T21:42:27.4352794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4352911Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4353215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4353293Z return func(*args, **kwargs) 2025-08-14T21:42:27.4353565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4353678Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4353961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:42:27.4354048Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:42:27.4354285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4354429Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4354433Z 2025-08-14T21:42:27.4354546Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4354832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4354922Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4355188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4355262Z outputs = block( 2025-08-14T21:42:27.4355500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4355583Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4355852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4355925Z return func(*args, **kwargs) 2025-08-14T21:42:27.4356196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4356312Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4356577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:42:27.4356672Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:42:27.4356902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4357028Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4357032Z 2025-08-14T21:42:27.4357141Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4357412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4357508Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4357779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4357851Z outputs = block( 2025-08-14T21:42:27.4358103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4358187Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4358462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4358535Z return func(*args, **kwargs) 2025-08-14T21:42:27.4358807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4358929Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4359201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:42:27.4359293Z hidden_states = self.act(hidden_states) 2025-08-14T21:42:27.4359606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:42:27.4359803Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:42:27.4359807Z 2025-08-14T21:42:27.4359926Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:27.4360196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4360301Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4360555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4360621Z outputs = block( 2025-08-14T21:42:27.4360854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4360935Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4361189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4361270Z return func(*args, **kwargs) 2025-08-14T21:42:27.4361527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4361640Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4361895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:42:27.4361988Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:42:27.4362227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4362352Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4362356Z 2025-08-14T21:42:27.4362468Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:42:27.4362753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1183, in forward 2025-08-14T21:42:27.4362844Z transformer_outputs = self.transformer( 2025-08-14T21:42:27.4363113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:42:27.4363181Z outputs = block( 2025-08-14T21:42:27.4363415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:42:27.4363509Z return super().__call__(*args, **kwargs) 2025-08-14T21:42:27.4363766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:42:27.4363847Z return func(*args, **kwargs) 2025-08-14T21:42:27.4364106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:42:27.4364221Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:42:27.4364491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:42:27.4364583Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:42:27.4364812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:42:27.4364943Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:42:27.4364947Z 2025-08-14T21:42:27.4365034Z cudagraph partition due to non gpu ops 2025-08-14T21:42:27.4365127Z cudagraph partition due to non gpu ops 2025-08-14T21:42:36.0925168Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:42:36.0925793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss 2025-08-14T21:42:36.0926338Z loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs) 2025-08-14T21:42:36.0927494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy 2025-08-14T21:42:36.0928050Z loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction) 2025-08-14T21:42:36.0928342Z 2025-08-14T21:42:37.4328631Z Compilation time (from dynamo_timed): 16.608718404 2025-08-14T21:42:37.4448605Z pass 2025-08-14T21:42:37.4449283Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:42:37.4450164Z TIMING: _recursive_pre_grad_passes:0.03015 _recursive_joint_graph_passes:0.54372 inductor_compile:11.02343 backend_compile:13.55375 gc:0.00465 entire_frame_compile:16.60872 _recursive_post_grad_passes:0.06814 async_compile.wait:1.63838 code_gen:9.36417 total_wall_time:16.60872 2025-08-14T21:42:37.4451173Z STATS: call_* op count: 301 | FakeTensorMode.__torch_dispatch__:13708 | FakeTensor.__torch_dispatch__:2206 | ProxyTorchDispatchMode.__torch_dispatch__:2742 2025-08-14T21:42:37.4453814Z Dynamo produced 3 graphs covering 301 ops with 2 graph breaks (1 unique) 2025-08-14T21:42:43.2595715Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
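The trace just above ends in transformers' ForCausalLMLoss (loss_utils.py), which shifts the logits against the labels and reduces with cross-entropy. A minimal standalone sketch of that loss path follows; the batch/sequence/vocab sizes are made up and nothing here comes from the benchmark harness itself:

```python
# Minimal sketch of the causal-LM loss the trace above points at
# (ForCausalLMLoss -> fixed_cross_entropy -> F.cross_entropy).
# Shapes and vocab size are illustrative only.
import torch
import torch.nn.functional as F

batch, seq, vocab = 2, 8, 100
logits = torch.randn(batch, seq, vocab)
labels = torch.randint(0, vocab, (batch, seq))

# Predict token t+1 from position t: drop the last logit, drop the first label.
shift_logits = logits[:, :-1, :].contiguous()
shift_labels = labels[:, 1:].contiguous()

loss = F.cross_entropy(
    shift_logits.view(-1, vocab),
    shift_labels.view(-1),
    ignore_index=-100,  # padded label positions are skipped, as in transformers
)
print(loss.item())
```

On this CPU-only run every op, including this loss, is a "non gpu op", which is presumably why the loss site is reported as a cudagraph partition point.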
2025-08-14T21:42:43.2596767Z from pkg_resources import resource_filename 2025-08-14T21:42:43.8775105Z 2025-08-14T21:42:43.8793272Z loading model: 0it [00:00, ?it/s]If you want to use `ElectraForCausalLM` as a standalone, add `is_decoder=True.` 2025-08-14T21:42:43.8794068Z WARNING:transformers.models.electra.modeling_electra:If you want to use `ElectraForCausalLM` as a standalone, add `is_decoder=True.` 2025-08-14T21:42:44.3097544Z 2025-08-14T21:42:44.3099654Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:42:44.3100028Z cpu eval ElectraForCausalLM 2025-08-14T21:42:44.5022584Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:42:44.5997064Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:42:44.6896610Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:42:56.4308512Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4313322Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4317521Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4321919Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4322955Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4323251Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4323492Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4323723Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4323991Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4324217Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4324442Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4324664Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4324880Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4325106Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4325325Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4325547Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4325762Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4325982Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4326259Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4326505Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4326718Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4326960Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4327179Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4327921Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4328265Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4328498Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4329106Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4329337Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4329561Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4329783Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4329997Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4330222Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4330446Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4330661Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4330884Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4331104Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4331318Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4331540Z cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4331767Z 
cudagraph partition due to non gpu ops 2025-08-14T21:42:56.4362218Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:42:56.4362655Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:42:56.4363032Z return mod(**inputs) 2025-08-14T21:42:56.4363484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/electra/modeling_electra.py", line 1564, in forward 2025-08-14T21:42:56.4363939Z lm_loss = self.loss_function( 2025-08-14T21:42:56.4364366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss 2025-08-14T21:42:56.4364892Z loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs) 2025-08-14T21:42:56.4365449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy 2025-08-14T21:42:56.4366006Z loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction) 2025-08-14T21:42:56.4366300Z 2025-08-14T21:43:05.2661211Z Compilation time (from dynamo_timed): 19.230996377 2025-08-14T21:43:05.2705031Z pass 2025-08-14T21:43:05.2713499Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:05.2714415Z TIMING: _recursive_pre_grad_passes:0.04313 _recursive_joint_graph_passes:0.54767 _recursive_post_grad_passes:0.0999 async_compile.wait:0.86637 code_gen:8.06915 inductor_compile:10.03548 backend_compile:16.01525 gc:0.00016 entire_frame_compile:19.231 total_wall_time:19.231 2025-08-14T21:43:05.2715421Z STATS: call_* op count: 379 | FakeTensorMode.__torch_dispatch__:29655 | FakeTensor.__torch_dispatch__:3266 | ProxyTorchDispatchMode.__torch_dispatch__:8450 2025-08-14T21:43:05.2715985Z Dynamo produced 1 graphs covering 379 ops with 0 graph breaks (0 unique) 2025-08-14T21:43:11.1106458Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:43:11.1107714Z from pkg_resources import resource_filename 2025-08-14T21:43:11.7448071Z 2025-08-14T21:43:12.1252582Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:43:12.1252958Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:43:12.1253237Z cpu eval ElectraForQuestionAnswering 2025-08-14T21:43:12.2748980Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:12.3522061Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:12.4164760Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:24.1519192Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:24.1524463Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:24.1525090Z return mod(**inputs) 2025-08-14T21:43:24.1525997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/electra/modeling_electra.py", line 1330, in forward 2025-08-14T21:43:24.1526643Z logits = self.qa_outputs(sequence_output) 2025-08-14T21:43:24.1526886Z 2025-08-14T21:43:24.1527025Z cudagraph partition due to non gpu ops
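The ElectraForQuestionAnswering partition traces in this step point at the qa_outputs projection (above) and at the start/end span losses (in the records that follow). A small standalone sketch of that question-answering head; the hidden size, sequence length, and batch size are hypothetical:

```python
# Minimal sketch of the Electra QA head named in the nearby traces:
# logits = qa_outputs(sequence_output), then per-position start/end losses.
import torch
import torch.nn as nn

batch, seq, hidden = 2, 16, 256
sequence_output = torch.randn(batch, seq, hidden)

qa_outputs = nn.Linear(hidden, 2)           # one score each for span start and end
logits = qa_outputs(sequence_output)        # (batch, seq, 2)
start_logits, end_logits = logits.split(1, dim=-1)
start_logits = start_logits.squeeze(-1)     # (batch, seq)
end_logits = end_logits.squeeze(-1)

start_positions = torch.randint(0, seq, (batch,))
end_positions = torch.randint(0, seq, (batch,))

loss_fct = nn.CrossEntropyLoss()
start_loss = loss_fct(start_logits, start_positions)
end_loss = loss_fct(end_logits, end_positions)
total_loss = (start_loss + end_loss) / 2
print(total_loss.item())
```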
2025-08-14T21:43:24.1564096Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1564317Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1564531Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1564754Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1564973Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1565195Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1565407Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1565627Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1565843Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1566100Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1566347Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1566625Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1566843Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1567067Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1567305Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1567512Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1567727Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1567941Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1568159Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1568372Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1568593Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1568959Z cudagraph partition due to non gpu ops 2025-08-14T21:43:24.1569220Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:43:24.1569635Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:24.1570014Z return mod(**inputs) 2025-08-14T21:43:24.1570448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/electra/modeling_electra.py", line 1348, in forward 2025-08-14T21:43:24.1570936Z start_loss = loss_fct(start_logits, start_positions) 2025-08-14T21:43:24.1571113Z 2025-08-14T21:43:24.1571229Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:43:24.1571611Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:24.1571951Z return mod(**inputs) 2025-08-14T21:43:24.1572358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/electra/modeling_electra.py", line 1349, in forward 2025-08-14T21:43:24.1572819Z end_loss = loss_fct(end_logits, end_positions) 2025-08-14T21:43:24.1572982Z 2025-08-14T21:43:32.4514033Z Compilation time (from dynamo_timed): 18.81826375 2025-08-14T21:43:32.4517011Z pass 2025-08-14T21:43:32.4517426Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:32.4518287Z TIMING: _recursive_pre_grad_passes:0.0419 _recursive_joint_graph_passes:0.53655 _recursive_post_grad_passes:0.10309 async_compile.wait:0.67044 code_gen:7.68553 inductor_compile:9.65209 backend_compile:15.6562 gc:0.00022 entire_frame_compile:18.81826 total_wall_time:18.81826 2025-08-14T21:43:32.4522253Z STATS: call_* op count: 380 | FakeTensorMode.__torch_dispatch__:29449 | FakeTensor.__torch_dispatch__:3271 | ProxyTorchDispatchMode.__torch_dispatch__:8450 2025-08-14T21:43:32.4522831Z Dynamo produced 1 graphs covering 380 ops with 0 graph breaks (0 unique) 2025-08-14T21:43:38.2628203Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. 
See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:43:38.2629322Z from pkg_resources import resource_filename 2025-08-14T21:43:38.9049715Z 2025-08-14T21:43:40.4255008Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:43:40.4260675Z loading model: 0it [00:01, ?it/s] 2025-08-14T21:43:40.4264872Z cpu eval GPT2ForSequenceClassification 2025-08-14T21:43:41.4636546Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:41.6894495Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:41.9199063Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:43:51.7330510Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7331072Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7331421Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7331764Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7332456Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7333029Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7333268Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7333502Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7333754Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7333991Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7334222Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7334455Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7334726Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:43:51.7335156Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7335525Z return mod(**inputs) 2025-08-14T21:43:51.7335964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1509, in forward 2025-08-14T21:43:51.7336460Z last_non_pad_token = (token_indices * non_pad_mask).argmax(-1) 2025-08-14T21:43:51.7336674Z 2025-08-14T21:43:51.7338214Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7338649Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7339063Z return mod(**inputs) 2025-08-14T21:43:51.7339514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7339971Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7340412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7340836Z outputs = block( 2025-08-14T21:43:51.7341199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7341600Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7342031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7342455Z return func(*args, **kwargs) 2025-08-14T21:43:51.7342863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7343532Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7343955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7344362Z return func(*args, **kwargs) 2025-08-14T21:43:51.7344770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7345337Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7345843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7346294Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7346491Z 2025-08-14T21:43:51.7346609Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7347005Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7347364Z return mod(**inputs) 2025-08-14T21:43:51.7347759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7348194Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7348617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7349024Z outputs = block( 2025-08-14T21:43:51.7349368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7349764Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7350341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7350757Z return func(*args, **kwargs) 2025-08-14T21:43:51.7351161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7351595Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7352019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7352419Z return func(*args, **kwargs) 2025-08-14T21:43:51.7352815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7353348Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7353850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7354286Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7354501Z 2025-08-14T21:43:51.7354594Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7354829Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7355057Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7355272Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7355538Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7355923Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7356263Z return mod(**inputs) 2025-08-14T21:43:51.7356667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7357100Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7357531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7357934Z outputs = block( 2025-08-14T21:43:51.7358284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7358680Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7359090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7359503Z return func(*args, **kwargs) 2025-08-14T21:43:51.7359912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7360340Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7360755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7361160Z return func(*args, **kwargs) 2025-08-14T21:43:51.7361561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7362013Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7362497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7363024Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7363227Z 2025-08-14T21:43:51.7363351Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7363749Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7364096Z return mod(**inputs) 2025-08-14T21:43:51.7364489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7364918Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7365405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7365828Z outputs = block( 2025-08-14T21:43:51.7366183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7366579Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7366986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7367390Z return func(*args, **kwargs) 2025-08-14T21:43:51.7367785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7368216Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7368633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7369238Z return func(*args, **kwargs) 2025-08-14T21:43:51.7369647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7370078Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7370552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7371051Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7371228Z 2025-08-14T21:43:51.7371348Z cudagraph partition due to non gpu ops. 
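The previous two records end in the two attention steps around SDPA: the torch.nn.functional.scaled_dot_product_attention call (sdpa_attention.py line 81) and the transpose(1, 2).contiguous() reshape that follows it (line 91). A minimal sketch of both, with assumed head counts and sizes:

    import torch
    import torch.nn.functional as F

    batch, heads, seq, head_dim = 2, 12, 8, 64     # assumed GPT-2-small-like layout
    q = torch.randn(batch, heads, seq, head_dim)
    k = torch.randn(batch, heads, seq, head_dim)
    v = torch.randn(batch, heads, seq, head_dim)

    # sdpa_attention.py line 81: the fused attention call
    attn_output = F.scaled_dot_product_attention(q, k, v, is_causal=True)
    # sdpa_attention.py line 91: move heads next to head_dim and make the layout contiguous
    # before the view back to (batch, seq, heads * head_dim) that c_proj expects
    attn_output = attn_output.transpose(1, 2).contiguous()
    print(attn_output.shape)   # (batch, seq, heads, head_dim)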
Found from : 2025-08-14T21:43:51.7371732Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7372088Z return mod(**inputs) 2025-08-14T21:43:51.7372479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7372898Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7373329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7373728Z outputs = block( 2025-08-14T21:43:51.7374074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7374454Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7374862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7375261Z return func(*args, **kwargs) 2025-08-14T21:43:51.7375647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7376070Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7376485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7376890Z return func(*args, **kwargs) 2025-08-14T21:43:51.7377279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7377701Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7378085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7378509Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7378696Z 2025-08-14T21:43:51.7378808Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7379203Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7379562Z return mod(**inputs) 2025-08-14T21:43:51.7379945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7380371Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7380890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7381295Z outputs = block( 2025-08-14T21:43:51.7381634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7382024Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7382435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7382838Z return func(*args, **kwargs) 2025-08-14T21:43:51.7383246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7383675Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7384095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7384489Z return func(*args, **kwargs) 2025-08-14T21:43:51.7384886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7385304Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7385688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7386117Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7386308Z 2025-08-14T21:43:51.7386420Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7386825Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7387173Z return mod(**inputs) 2025-08-14T21:43:51.7387561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7387992Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7388415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7389240Z outputs = block( 2025-08-14T21:43:51.7389592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7389987Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7390398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7390806Z return func(*args, **kwargs) 2025-08-14T21:43:51.7391202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7391656Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7392088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7392511Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7392899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7393332Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7393517Z 2025-08-14T21:43:51.7393630Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7394033Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7394395Z return mod(**inputs) 2025-08-14T21:43:51.7394774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7395209Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7395636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7396088Z outputs = block( 2025-08-14T21:43:51.7396461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7396861Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7397278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7397685Z return func(*args, **kwargs) 2025-08-14T21:43:51.7398089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7398521Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7398948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7399357Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7399737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7400163Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7400342Z 2025-08-14T21:43:51.7400458Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7400850Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7401214Z return mod(**inputs) 2025-08-14T21:43:51.7401602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7402031Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7402457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7403161Z outputs = block( 2025-08-14T21:43:51.7403518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7403913Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7404331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7404738Z return func(*args, **kwargs) 2025-08-14T21:43:51.7405141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7405592Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7406033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.7406467Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.7406843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.7407348Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.7407613Z 2025-08-14T21:43:51.7407728Z cudagraph partition due to non gpu ops. 
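The record above ends in transformers' tanh-approximation GELU (activations.py line 47). The same expression is available as PyTorch's built-in gelu with approximate="tanh"; a small sketch comparing the two (shapes are arbitrary):

    import math
    import torch
    import torch.nn.functional as F

    x = torch.randn(4, 8)
    # The expression from activations.py line 47 (NewGELU-style tanh approximation)
    new_gelu = 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))
    # PyTorch's built-in tanh-approximate GELU computes the same formula
    builtin = F.gelu(x, approximate="tanh")
    print((new_gelu - builtin).abs().max())   # should be at or near zero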
Found from : 2025-08-14T21:43:51.7408125Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7408472Z return mod(**inputs) 2025-08-14T21:43:51.7408943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7409390Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7409811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7410279Z outputs = block( 2025-08-14T21:43:51.7410627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7411024Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7411428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7412017Z return func(*args, **kwargs) 2025-08-14T21:43:51.7412418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7412872Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7413302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7413736Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7414125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7414557Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7414749Z 2025-08-14T21:43:51.7414863Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7415251Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7415606Z return mod(**inputs) 2025-08-14T21:43:51.7415989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7416417Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7416834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7417230Z outputs = block( 2025-08-14T21:43:51.7417568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7417957Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7418364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7418757Z return func(*args, **kwargs) 2025-08-14T21:43:51.7419142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7419570Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7419990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7420394Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7420768Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7421182Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7421359Z 2025-08-14T21:43:51.7421467Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7421843Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7422182Z return mod(**inputs) 2025-08-14T21:43:51.7422556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7422969Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7423371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7423760Z outputs = block( 2025-08-14T21:43:51.7424089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7424465Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7424862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7425251Z return func(*args, **kwargs) 2025-08-14T21:43:51.7425629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7426058Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7426542Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7426971Z return func(*args, **kwargs) 2025-08-14T21:43:51.7427348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7427864Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7428349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7428757Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7428941Z 2025-08-14T21:43:51.7429072Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7429455Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7429791Z return mod(**inputs) 2025-08-14T21:43:51.7430165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7430580Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7430984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7431377Z outputs = block( 2025-08-14T21:43:51.7431705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7432083Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7432478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7432861Z return func(*args, **kwargs) 2025-08-14T21:43:51.7433249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7433669Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7434078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7434461Z return func(*args, **kwargs) 2025-08-14T21:43:51.7434843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7435354Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7435827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7436238Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7436421Z 2025-08-14T21:43:51.7436509Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7436738Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7436953Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7437173Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7437421Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7437808Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7438182Z return mod(**inputs) 2025-08-14T21:43:51.7438556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7438956Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7439357Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7439746Z outputs = block( 2025-08-14T21:43:51.7440083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7440457Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7440852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7441356Z return func(*args, **kwargs) 2025-08-14T21:43:51.7441723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7442124Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7442530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7442920Z return func(*args, **kwargs) 2025-08-14T21:43:51.7443297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7443722Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7444191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7444712Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7444914Z 2025-08-14T21:43:51.7445030Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7445421Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7445775Z return mod(**inputs) 2025-08-14T21:43:51.7446165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7446589Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7447008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7447419Z outputs = block( 2025-08-14T21:43:51.7447760Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7448160Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7448577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7449055Z return func(*args, **kwargs) 2025-08-14T21:43:51.7449449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7449878Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7450295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7450704Z return func(*args, **kwargs) 2025-08-14T21:43:51.7451094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7451537Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7452017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7452514Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7452705Z 2025-08-14T21:43:51.7452819Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7453209Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7453564Z return mod(**inputs) 2025-08-14T21:43:51.7453948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7454377Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7454797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7455207Z outputs = block( 2025-08-14T21:43:51.7455552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7455948Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7456442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7456839Z return func(*args, **kwargs) 2025-08-14T21:43:51.7457240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7457669Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7458080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7458481Z return func(*args, **kwargs) 2025-08-14T21:43:51.7458863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7459252Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7459602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7460009Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7460191Z 2025-08-14T21:43:51.7460294Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7460658Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7460976Z return mod(**inputs) 2025-08-14T21:43:51.7461334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7461735Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7462139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7462533Z outputs = block( 2025-08-14T21:43:51.7462874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7463258Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7463655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7464050Z return func(*args, **kwargs) 2025-08-14T21:43:51.7464438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7464849Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7465257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7465652Z return func(*args, **kwargs) 2025-08-14T21:43:51.7466039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7466445Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7466819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7467252Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7467432Z 2025-08-14T21:43:51.7467548Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7467924Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7468269Z return mod(**inputs) 2025-08-14T21:43:51.7468647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7469050Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7469437Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7469808Z outputs = block( 2025-08-14T21:43:51.7470139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7470545Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7471008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7471410Z return func(*args, **kwargs) 2025-08-14T21:43:51.7471797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7472245Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7472672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7473078Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7473445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7473875Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7474053Z 2025-08-14T21:43:51.7474172Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7474557Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7474892Z return mod(**inputs) 2025-08-14T21:43:51.7475270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7475685Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7476082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7476470Z outputs = block( 2025-08-14T21:43:51.7476808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7477198Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7477587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7477980Z return func(*args, **kwargs) 2025-08-14T21:43:51.7478372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7478803Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7479230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7479638Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7480013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7480422Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7480607Z 2025-08-14T21:43:51.7480716Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7481098Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7481440Z return mod(**inputs) 2025-08-14T21:43:51.7481815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7482235Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7482639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7483021Z outputs = block( 2025-08-14T21:43:51.7483357Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7483733Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7484126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7484510Z return func(*args, **kwargs) 2025-08-14T21:43:51.7484902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7485436Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7485920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.7486344Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.7486726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.7487220Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.7487470Z 2025-08-14T21:43:51.7487585Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7487983Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7488336Z return mod(**inputs) 2025-08-14T21:43:51.7488830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7489271Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7489688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7490089Z outputs = block( 2025-08-14T21:43:51.7490431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7490824Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7491221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7491623Z return func(*args, **kwargs) 2025-08-14T21:43:51.7492001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7492428Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7492854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7493260Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7493635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7494057Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7494236Z 2025-08-14T21:43:51.7494350Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7494721Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7495065Z return mod(**inputs) 2025-08-14T21:43:51.7495439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7495851Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7496249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7496648Z outputs = block( 2025-08-14T21:43:51.7496985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7497366Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7497760Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7498152Z return func(*args, **kwargs) 2025-08-14T21:43:51.7498541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7498963Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7499390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7499814Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7500292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7500708Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7500893Z 2025-08-14T21:43:51.7501002Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7501382Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7501717Z return mod(**inputs) 2025-08-14T21:43:51.7502100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7502535Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7503081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7503468Z outputs = block( 2025-08-14T21:43:51.7503810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7504200Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7504592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7504985Z return func(*args, **kwargs) 2025-08-14T21:43:51.7505384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7505819Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7506222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7506616Z return func(*args, **kwargs) 2025-08-14T21:43:51.7507014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7507548Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7508029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7508446Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7508634Z 2025-08-14T21:43:51.7508744Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7509094Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7509419Z return mod(**inputs) 2025-08-14T21:43:51.7509785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7510204Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7510604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7510996Z outputs = block( 2025-08-14T21:43:51.7511339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7511713Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7512106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7512493Z return func(*args, **kwargs) 2025-08-14T21:43:51.7512876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7513285Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7513689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7514080Z return func(*args, **kwargs) 2025-08-14T21:43:51.7514461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7515161Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7515650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7516065Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7516249Z 2025-08-14T21:43:51.7516338Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7516573Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7516800Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7517018Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7517259Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7517641Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7517984Z return mod(**inputs) 2025-08-14T21:43:51.7518356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7518782Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7519189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7519579Z outputs = block( 2025-08-14T21:43:51.7519912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7520293Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7520690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7521071Z return func(*args, **kwargs) 2025-08-14T21:43:51.7521455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7521870Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7522286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7522669Z return func(*args, **kwargs) 2025-08-14T21:43:51.7523052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7523475Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7523933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7524437Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7524637Z 2025-08-14T21:43:51.7524749Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7525133Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7525470Z return mod(**inputs) 2025-08-14T21:43:51.7525859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7526278Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7526696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7527091Z outputs = block( 2025-08-14T21:43:51.7527438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7527831Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7528230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7528628Z return func(*args, **kwargs) 2025-08-14T21:43:51.7529083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7529515Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7530035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7530445Z return func(*args, **kwargs) 2025-08-14T21:43:51.7530845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7531274Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7531715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7532175Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7532336Z 2025-08-14T21:43:51.7532447Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7532804Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7533152Z return mod(**inputs) 2025-08-14T21:43:51.7533538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7533958Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7534366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7534774Z outputs = block( 2025-08-14T21:43:51.7535126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7535529Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7535931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7536327Z return func(*args, **kwargs) 2025-08-14T21:43:51.7536695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7537106Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7537520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7537922Z return func(*args, **kwargs) 2025-08-14T21:43:51.7538311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7538735Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7539123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7539563Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7539750Z 2025-08-14T21:43:51.7539862Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7540260Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7540606Z return mod(**inputs) 2025-08-14T21:43:51.7540994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7541412Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7541821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7542215Z outputs = block( 2025-08-14T21:43:51.7542546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7542934Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7543352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7543765Z return func(*args, **kwargs) 2025-08-14T21:43:51.7544157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7544640Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7545125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7545522Z return func(*args, **kwargs) 2025-08-14T21:43:51.7545909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7546319Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7546699Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7547127Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7547320Z 2025-08-14T21:43:51.7547432Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7547824Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7548194Z return mod(**inputs) 2025-08-14T21:43:51.7548581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7549023Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7549429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7549812Z outputs = block( 2025-08-14T21:43:51.7550149Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7550549Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7550968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7551373Z return func(*args, **kwargs) 2025-08-14T21:43:51.7551769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7552220Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7552656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7553082Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7553470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7553905Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7554092Z 2025-08-14T21:43:51.7554206Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7554598Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7554953Z return mod(**inputs) 2025-08-14T21:43:51.7555340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7555762Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7556185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7556591Z outputs = block( 2025-08-14T21:43:51.7556930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7557331Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7557739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7558150Z return func(*args, **kwargs) 2025-08-14T21:43:51.7558541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7558986Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7559425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7560817Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7561209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7561634Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7561816Z 2025-08-14T21:43:51.7561937Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7562320Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7562684Z return mod(**inputs) 2025-08-14T21:43:51.7563084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7563521Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7563940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7564351Z outputs = block( 2025-08-14T21:43:51.7564707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7565092Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7565513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7565920Z return func(*args, **kwargs) 2025-08-14T21:43:51.7566322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7566761Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7567393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.7567827Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.7568201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.7568762Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.7569033Z 2025-08-14T21:43:51.7569149Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7569542Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7569890Z return mod(**inputs) 2025-08-14T21:43:51.7570283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7570715Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7571135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7571532Z outputs = block( 2025-08-14T21:43:51.7571880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7572283Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7572689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7573103Z return func(*args, **kwargs) 2025-08-14T21:43:51.7573502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7573945Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7574380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7574821Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7575205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7575617Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7575859Z 2025-08-14T21:43:51.7576007Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7576394Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7576739Z return mod(**inputs) 2025-08-14T21:43:51.7577114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7577532Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7577960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7578364Z outputs = block( 2025-08-14T21:43:51.7578706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7579103Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7579512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7579928Z return func(*args, **kwargs) 2025-08-14T21:43:51.7580315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7580746Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7581175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7581585Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7581977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7582399Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7582578Z 2025-08-14T21:43:51.7582696Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7583086Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7583437Z return mod(**inputs) 2025-08-14T21:43:51.7583839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7584273Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7584691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7585096Z outputs = block( 2025-08-14T21:43:51.7585443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7585827Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7586245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7586660Z return func(*args, **kwargs) 2025-08-14T21:43:51.7587062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward 2025-08-14T21:43:51.7587544Z hidden_states = residual + feed_forward_hidden_states 2025-08-14T21:43:51.7587717Z 2025-08-14T21:43:51.7587827Z cudagraph partition due to non gpu ops. 
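The MLP-side records (c_fc, the activation, c_proj, and the residual add that the last trace above ends in) all come from the same GPT-2 MLP block. A minimal sketch of that sequence, using plain nn.Linear in place of transformers' Conv1D and assumed GPT-2-small sizes:

    import torch
    import torch.nn.functional as F

    hidden, inner = 768, 4 * 768              # assumed sizes; inner = 4 * hidden in GPT-2
    c_fc = torch.nn.Linear(hidden, inner)
    c_proj = torch.nn.Linear(inner, hidden)

    residual = torch.randn(2, 8, hidden)
    h = c_fc(residual)                        # modeling_gpt2.py line 365: up-projection
    h = F.gelu(h, approximate="tanh")         # line 366: the activation
    h = c_proj(h)                             # line 367: down-projection
    h = residual + h                          # line 442: the residual add the last trace ends in
    print(h.shape)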
Found from : 2025-08-14T21:43:51.7588206Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7588543Z return mod(**inputs) 2025-08-14T21:43:51.7588948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7589389Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7589805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7590213Z outputs = block( 2025-08-14T21:43:51.7590565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7591050Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7591462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7591883Z return func(*args, **kwargs) 2025-08-14T21:43:51.7592278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7592702Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7593111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7593519Z return func(*args, **kwargs) 2025-08-14T21:43:51.7593915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7594443Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7594948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7595379Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7595564Z 2025-08-14T21:43:51.7595685Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7596067Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7596419Z return mod(**inputs) 2025-08-14T21:43:51.7596808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7597228Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7597644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7598044Z outputs = block( 2025-08-14T21:43:51.7598399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7598782Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7599189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7599640Z return func(*args, **kwargs) 2025-08-14T21:43:51.7600038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7600458Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7600874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7601284Z return func(*args, **kwargs) 2025-08-14T21:43:51.7601675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7602206Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7602925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7603365Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7603555Z 2025-08-14T21:43:51.7603645Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7603885Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7604116Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7604338Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7604592Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7604989Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7605342Z return mod(**inputs) 2025-08-14T21:43:51.7605731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7606300Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7606790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7607195Z outputs = block( 2025-08-14T21:43:51.7607557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7607956Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7608371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7608834Z return func(*args, **kwargs) 2025-08-14T21:43:51.7609240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7609675Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7610088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7610500Z return func(*args, **kwargs) 2025-08-14T21:43:51.7610900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7611337Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7611817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7612347Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7612558Z 2025-08-14T21:43:51.7612672Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7613074Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7613426Z return mod(**inputs) 2025-08-14T21:43:51.7613823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7614253Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7614658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7615063Z outputs = block( 2025-08-14T21:43:51.7615404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7615783Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7616175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7616577Z return func(*args, **kwargs) 2025-08-14T21:43:51.7616978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7617409Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7617834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7618239Z return func(*args, **kwargs) 2025-08-14T21:43:51.7618638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7619080Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7619570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7620059Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7620231Z 2025-08-14T21:43:51.7620350Z cudagraph partition due to non gpu ops. 
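The two traces above end inside transformers' SDPA integration: the call into torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py line 81) and the transpose(1, 2).contiguous() applied to its output (line 91). A small self-contained sketch of that call pattern, with hypothetical head sizes, purely for illustration:

    import torch
    import torch.nn.functional as F

    # Hypothetical shapes: (batch, heads, seq, head_dim).
    q = torch.randn(1, 12, 16, 64)
    k = torch.randn(1, 12, 16, 64)
    v = torch.randn(1, 12, 16, 64)

    # The call at sdpa_attention.py:81 in the trace.
    attn_output = F.scaled_dot_product_attention(
        q, k, v, attn_mask=None, dropout_p=0.0, is_causal=True
    )

    # The follow-up at sdpa_attention.py:91: move heads back next to head_dim
    # and make the tensor contiguous before it gets reshaped downstream.
    attn_output = attn_output.transpose(1, 2).contiguous()
    print(attn_output.shape)  # torch.Size([1, 16, 12, 64])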
Found from : 2025-08-14T21:43:51.7620729Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7621075Z return mod(**inputs) 2025-08-14T21:43:51.7621534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7621968Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7622389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7622803Z outputs = block( 2025-08-14T21:43:51.7623152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7623552Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7623987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7624395Z return func(*args, **kwargs) 2025-08-14T21:43:51.7624785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7625211Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7625633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7626031Z return func(*args, **kwargs) 2025-08-14T21:43:51.7626420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7626840Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7627231Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7627672Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7627861Z 2025-08-14T21:43:51.7627973Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7628368Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7628726Z return mod(**inputs) 2025-08-14T21:43:51.7629126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7629555Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7629972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7630415Z outputs = block( 2025-08-14T21:43:51.7630761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7631156Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7631568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7631971Z return func(*args, **kwargs) 2025-08-14T21:43:51.7632377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7632812Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7633241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7633637Z return func(*args, **kwargs) 2025-08-14T21:43:51.7634079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7634513Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7634903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7635335Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7635531Z 2025-08-14T21:43:51.7635644Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7636041Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7636391Z return mod(**inputs) 2025-08-14T21:43:51.7636873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7637309Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7637725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7638133Z outputs = block( 2025-08-14T21:43:51.7638484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7638877Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7639279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7639691Z return func(*args, **kwargs) 2025-08-14T21:43:51.7640087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7640535Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7640973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7641392Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7641776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7642215Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7642400Z 2025-08-14T21:43:51.7642512Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7642903Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7643275Z return mod(**inputs) 2025-08-14T21:43:51.7643657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7644085Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7644509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7644910Z outputs = block( 2025-08-14T21:43:51.7645249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7645640Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7646048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7646444Z return func(*args, **kwargs) 2025-08-14T21:43:51.7646842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7647286Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7647724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7648139Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7648528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7649050Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7649240Z 2025-08-14T21:43:51.7649362Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7649752Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7650110Z return mod(**inputs) 2025-08-14T21:43:51.7650509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7650936Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7651361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7651822Z outputs = block( 2025-08-14T21:43:51.7652209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7652599Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7653012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7653430Z return func(*args, **kwargs) 2025-08-14T21:43:51.7653822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7654277Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7654712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.7655127Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.7655491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.7655980Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.7656234Z 2025-08-14T21:43:51.7656344Z cudagraph partition due to non gpu ops. 
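The activation frame above is transformers' tanh approximation of GELU (activations.py line 47 in the trace). Reproduced below as a standalone function, next to torch's built-in tanh-approximate GELU for comparison; a sketch only, not the library source.

    import math
    import torch

    def gelu_tanh(x: torch.Tensor) -> torch.Tensor:
        # Exactly the expression shown in the stack trace.
        return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))

    x = torch.randn(4)
    print(gelu_tanh(x))
    print(torch.nn.functional.gelu(x, approximate="tanh"))  # should agree up to floating-point rounding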
Found from : 2025-08-14T21:43:51.7656731Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7657080Z return mod(**inputs) 2025-08-14T21:43:51.7657471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7657898Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7658312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7658717Z outputs = block( 2025-08-14T21:43:51.7659071Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7659453Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7659844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7660238Z return func(*args, **kwargs) 2025-08-14T21:43:51.7660623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7661052Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7661480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7661903Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7662281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7662700Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7662895Z 2025-08-14T21:43:51.7663007Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7663406Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7663762Z return mod(**inputs) 2025-08-14T21:43:51.7664147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7664578Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7665001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7665398Z outputs = block( 2025-08-14T21:43:51.7665752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7666149Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7666563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7667091Z return func(*args, **kwargs) 2025-08-14T21:43:51.7667496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7667947Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7668385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7668816Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7669216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7669657Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7669851Z 2025-08-14T21:43:51.7669967Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7670366Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7670731Z return mod(**inputs) 2025-08-14T21:43:51.7671122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7671549Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7671975Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7672380Z outputs = block( 2025-08-14T21:43:51.7672725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7673126Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7673543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7673951Z return func(*args, **kwargs) 2025-08-14T21:43:51.7674348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7674789Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7675214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7675613Z return func(*args, **kwargs) 2025-08-14T21:43:51.7676022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7676564Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7677066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7677494Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7677689Z 2025-08-14T21:43:51.7677809Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7678211Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7678574Z return mod(**inputs) 2025-08-14T21:43:51.7678961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7679395Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7679820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7680238Z outputs = block( 2025-08-14T21:43:51.7680591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7680988Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7681402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7681823Z return func(*args, **kwargs) 2025-08-14T21:43:51.7682292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7682722Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7683133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7683554Z return func(*args, **kwargs) 2025-08-14T21:43:51.7683950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7684488Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7684981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7685418Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7685609Z 2025-08-14T21:43:51.7685703Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7685947Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7686181Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7686407Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7686663Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7687051Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7687410Z return mod(**inputs) 2025-08-14T21:43:51.7687823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7688263Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7688687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7689176Z outputs = block( 2025-08-14T21:43:51.7689536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7689941Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7690362Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7690774Z return func(*args, **kwargs) 2025-08-14T21:43:51.7691185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7691619Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7692046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7692505Z return func(*args, **kwargs) 2025-08-14T21:43:51.7692906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7693354Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7693839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7694356Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7694562Z 2025-08-14T21:43:51.7694675Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7695069Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7695421Z return mod(**inputs) 2025-08-14T21:43:51.7695815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7696238Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7696665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7697078Z outputs = block( 2025-08-14T21:43:51.7697458Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7697918Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7698332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7698742Z return func(*args, **kwargs) 2025-08-14T21:43:51.7699130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7699559Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7699964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7700353Z return func(*args, **kwargs) 2025-08-14T21:43:51.7700739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7701164Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7701631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7702105Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7702282Z 2025-08-14T21:43:51.7702392Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7702985Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7703335Z return mod(**inputs) 2025-08-14T21:43:51.7703717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7704146Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7704567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7704964Z outputs = block( 2025-08-14T21:43:51.7705326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7705709Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7706107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7706499Z return func(*args, **kwargs) 2025-08-14T21:43:51.7706888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7707308Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7707707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7708109Z return func(*args, **kwargs) 2025-08-14T21:43:51.7708496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7708910Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7709279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7709701Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7709882Z 2025-08-14T21:43:51.7710001Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7710380Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7710716Z return mod(**inputs) 2025-08-14T21:43:51.7711092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7711508Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7711909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7712311Z outputs = block( 2025-08-14T21:43:51.7712884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7713281Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7713692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7714104Z return func(*args, **kwargs) 2025-08-14T21:43:51.7714496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7714907Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7715319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7715714Z return func(*args, **kwargs) 2025-08-14T21:43:51.7716105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7716515Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7716895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7717317Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7717495Z 2025-08-14T21:43:51.7717612Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7717990Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7718337Z return mod(**inputs) 2025-08-14T21:43:51.7718716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7719126Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7719536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7719930Z outputs = block( 2025-08-14T21:43:51.7720276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7720654Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7721068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7721484Z return func(*args, **kwargs) 2025-08-14T21:43:51.7721863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7722305Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7722746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7723170Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7723553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7723986Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7724172Z 2025-08-14T21:43:51.7724296Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7724691Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7725062Z return mod(**inputs) 2025-08-14T21:43:51.7725462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7725895Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7726308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7726723Z outputs = block( 2025-08-14T21:43:51.7727073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7727509Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7727974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7728381Z return func(*args, **kwargs) 2025-08-14T21:43:51.7728845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7729310Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7729759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.7730187Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.7730572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7731002Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7731196Z 2025-08-14T21:43:51.7731314Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7731724Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7732092Z return mod(**inputs) 2025-08-14T21:43:51.7732495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7732927Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7733351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7733799Z outputs = block( 2025-08-14T21:43:51.7734150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7734549Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7734968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7735377Z return func(*args, **kwargs) 2025-08-14T21:43:51.7735784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7736235Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7736683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.7737100Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.7737465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.7737926Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.7738154Z 2025-08-14T21:43:51.7738258Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7738621Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7738950Z return mod(**inputs) 2025-08-14T21:43:51.7739349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7739754Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7740159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7740560Z outputs = block( 2025-08-14T21:43:51.7740892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7741270Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7741667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7742063Z return func(*args, **kwargs) 2025-08-14T21:43:51.7742441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7742938Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7743345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7743434Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7743650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7743782Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7743786Z 2025-08-14T21:43:51.7743895Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7744112Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7744182Z return mod(**inputs) 2025-08-14T21:43:51.7744444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7744544Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7744804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7744878Z outputs = block( 2025-08-14T21:43:51.7745108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7745192Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7745451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7745526Z return func(*args, **kwargs) 2025-08-14T21:43:51.7745781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.7745896Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.7746155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:43:51.7746259Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:43:51.7746486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7746609Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7746613Z 2025-08-14T21:43:51.7746729Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:43:51.7746937Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7747016Z return mod(**inputs) 2025-08-14T21:43:51.7747281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7747370Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7747633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7747708Z outputs = block( 2025-08-14T21:43:51.7747939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7748030Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7748281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7748362Z return func(*args, **kwargs) 2025-08-14T21:43:51.7748618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward 2025-08-14T21:43:51.7748728Z hidden_states = residual + feed_forward_hidden_states 2025-08-14T21:43:51.7748732Z 2025-08-14T21:43:51.7748849Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7749058Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7749169Z return mod(**inputs) 2025-08-14T21:43:51.7749477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7749566Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7749830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7749896Z outputs = block( 2025-08-14T21:43:51.7750126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7750219Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7750472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7750554Z return func(*args, **kwargs) 2025-08-14T21:43:51.7750812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7750911Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7751170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7751243Z return func(*args, **kwargs) 2025-08-14T21:43:51.7751499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7751705Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7751935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7752070Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7752074Z 2025-08-14T21:43:51.7752186Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7752404Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7752490Z return mod(**inputs) 2025-08-14T21:43:51.7752759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7752860Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7753123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7753192Z outputs = block( 2025-08-14T21:43:51.7753445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7753528Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7753780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7753859Z return func(*args, **kwargs) 2025-08-14T21:43:51.7754119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7754221Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7754471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7754544Z return func(*args, **kwargs) 2025-08-14T21:43:51.7754818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.7755017Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.7755256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7755384Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7755387Z 2025-08-14T21:43:51.7755477Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7755610Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7755753Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7755838Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.7755957Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7756176Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7756248Z return mod(**inputs) 2025-08-14T21:43:51.7756527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7756619Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7756892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7756961Z outputs = block( 2025-08-14T21:43:51.7757200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7757297Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7757560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7757644Z return func(*args, **kwargs) 2025-08-14T21:43:51.7757909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7758002Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7758265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7758339Z return func(*args, **kwargs) 2025-08-14T21:43:51.7758603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7758716Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7759036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.7759190Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.7759194Z 2025-08-14T21:43:51.7759306Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7759525Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7759607Z return mod(**inputs) 2025-08-14T21:43:51.7759877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7759975Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7760243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7760312Z outputs = block( 2025-08-14T21:43:51.7760559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7760652Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7760915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7761000Z return func(*args, **kwargs) 2025-08-14T21:43:51.7761268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7761371Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7761632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7761706Z return func(*args, **kwargs) 2025-08-14T21:43:51.7761976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.7762082Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.7762476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.7762608Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.7762612Z 2025-08-14T21:43:51.7762723Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7762946Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7763018Z return mod(**inputs) 2025-08-14T21:43:51.7763287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7763386Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7763648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7763724Z outputs = block( 2025-08-14T21:43:51.7763963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7764053Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7764320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7764396Z return func(*args, **kwargs) 2025-08-14T21:43:51.7764659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7764761Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7765021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7765104Z return func(*args, **kwargs) 2025-08-14T21:43:51.7765369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7765461Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7765706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7765835Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7765838Z 2025-08-14T21:43:51.7765956Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.7766170Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.7766242Z return mod(**inputs) 2025-08-14T21:43:51.7766516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.7766606Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.7766871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.7766952Z outputs = block( 2025-08-14T21:43:51.7767191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.7767290Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.7767549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7767624Z return func(*args, **kwargs) 2025-08-14T21:43:51.7767894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.7767988Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.7768245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.7768328Z return func(*args, **kwargs) 2025-08-14T21:43:51.7768592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.7768782Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.7769118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.7769245Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.7769249Z 2025-08-14T21:43:51.7769369Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
    hidden_states = self.act(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops.
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
cudagraph partition due to non gpu ops.
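Every addmm trace above bottoms out in the same frame, transformers/pytorch_utils.py line 116, because GPT-2's c_attn, c_proj and c_fc layers are all Conv1D-style projections that implement a Linear with torch.addmm. The sketch below is illustrative only (hypothetical class name, simplified init), not the transformers source; it just shows the projection pattern that every partition message points at.

    import torch
    from torch import nn

    class Conv1DProjection(nn.Module):
        # Illustrative GPT-2-style projection: a Linear written as torch.addmm,
        # mirroring the call reported at pytorch_utils.py:116 in the traces above.
        def __init__(self, nf: int, nx: int):
            super().__init__()
            self.nf = nf
            self.weight = nn.Parameter(torch.empty(nx, nf))
            self.bias = nn.Parameter(torch.zeros(nf))
            nn.init.normal_(self.weight, std=0.02)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            size_out = x.size()[:-1] + (self.nf,)
            # This addmm is the op the cudagraph partitioner keeps flagging.
            x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
            return x.view(size_out)

    # Usage sketch: proj = torch.compile(Conv1DProjection(3072, 768))
    #               out = proj(torch.randn(2, 8, 768))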
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
    hidden_states = residual + feed_forward_hidden_states
cudagraph partition due to non gpu ops.
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
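The two traces above end in transformers' sdpa_attention_forward (lines 81 and 91). For readability, a minimal sketch of that call pattern follows; the tensor shapes and the is_causal flag are illustrative assumptions rather than values taken from this run.

import torch
import torch.nn.functional as F

batch, heads, seq, head_dim = 2, 12, 8, 64
query = torch.randn(batch, heads, seq, head_dim)
key = torch.randn(batch, heads, seq, head_dim)
value = torch.randn(batch, heads, seq, head_dim)

# line 81: fused attention kernel
attn_output = F.scaled_dot_product_attention(query, key, value, is_causal=True)

# line 91: back to (batch, seq, heads, head_dim) layout, materialized contiguously
attn_output = attn_output.transpose(1, 2).contiguous()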
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
    hidden_states = self.act(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
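The frame at activations.py line 47 is the tanh-approximation GELU whose expression appears verbatim above. A self-contained sketch of that activation, usable for quick local reproduction; the function name and test input below are illustrative assumptions.

import math
import torch

def new_gelu(input: torch.Tensor) -> torch.Tensor:
    # same expression as the traced line in transformers/activations.py
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

x = torch.randn(4)
print(new_gelu(x))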
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
    hidden_states = residual + feed_forward_hidden_states
Found from : 2025-08-14T21:43:51.8077786Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8077855Z return mod(**inputs) 2025-08-14T21:43:51.8078122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8078212Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8078481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8078556Z outputs = block( 2025-08-14T21:43:51.8078791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8078882Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8079140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8079214Z return func(*args, **kwargs) 2025-08-14T21:43:51.8079482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8079577Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8079834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8079957Z return func(*args, **kwargs) 2025-08-14T21:43:51.8080259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.8080469Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.8080702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8080826Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8080830Z 2025-08-14T21:43:51.8080949Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8081164Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8081242Z return mod(**inputs) 2025-08-14T21:43:51.8081515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8081608Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8081885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8081952Z outputs = block( 2025-08-14T21:43:51.8082190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8082285Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8082547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8082629Z return func(*args, **kwargs) 2025-08-14T21:43:51.8082894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8082989Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8083257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8083337Z return func(*args, **kwargs) 2025-08-14T21:43:51.8083601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:43:51.8083805Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:43:51.8084038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8084170Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8084174Z 2025-08-14T21:43:51.8084261Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.8084348Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.8084442Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.8084524Z cudagraph partition due to non gpu ops 2025-08-14T21:43:51.8084646Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8084866Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8084937Z return mod(**inputs) 2025-08-14T21:43:51.8085220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8085307Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8085579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8085654Z outputs = block( 2025-08-14T21:43:51.8085892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8085984Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8086254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8086365Z return func(*args, **kwargs) 2025-08-14T21:43:51.8086673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8086769Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8087032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8087114Z return func(*args, **kwargs) 2025-08-14T21:43:51.8087385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.8087498Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.8087811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:43:51.8087952Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:43:51.8087958Z 2025-08-14T21:43:51.8088078Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8088295Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8088374Z return mod(**inputs) 2025-08-14T21:43:51.8088644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8088841Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8089127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8089195Z outputs = block( 2025-08-14T21:43:51.8089431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8089527Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8089792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8089880Z return func(*args, **kwargs) 2025-08-14T21:43:51.8090155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8090250Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8090541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8090614Z return func(*args, **kwargs) 2025-08-14T21:43:51.8090892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:43:51.8091001Z attn_output, attn_weights = attention_interface( 2025-08-14T21:43:51.8091315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:43:51.8091440Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:43:51.8091447Z 2025-08-14T21:43:51.8091557Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8091767Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8091843Z return mod(**inputs) 2025-08-14T21:43:51.8092115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8092206Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8092483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8092548Z outputs = block( 2025-08-14T21:43:51.8092782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8092865Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8093136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8093337Z return func(*args, **kwargs) 2025-08-14T21:43:51.8093622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8093718Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8093989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8094061Z return func(*args, **kwargs) 2025-08-14T21:43:51.8094331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.8094420Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.8094660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8094786Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8094793Z 2025-08-14T21:43:51.8094908Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8095129Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8095200Z return mod(**inputs) 2025-08-14T21:43:51.8095469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8095568Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8095841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8095917Z outputs = block( 2025-08-14T21:43:51.8096154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8096241Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8096508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8096588Z return func(*args, **kwargs) 2025-08-14T21:43:51.8096854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:43:51.8096957Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:43:51.8097218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8097298Z return func(*args, **kwargs) 2025-08-14T21:43:51.8097561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:43:51.8097649Z attn_output = self.c_proj(attn_output) 2025-08-14T21:43:51.8097891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8098016Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8098023Z 2025-08-14T21:43:51.8098144Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8098359Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8098430Z return mod(**inputs) 2025-08-14T21:43:51.8098704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8098794Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8099058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8099133Z outputs = block( 2025-08-14T21:43:51.8099370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8099462Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8099720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8099869Z return func(*args, **kwargs) 2025-08-14T21:43:51.8100141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.8100252Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.8100513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.8100607Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.8100837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8100968Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8100972Z 2025-08-14T21:43:51.8101081Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8101293Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8101377Z return mod(**inputs) 2025-08-14T21:43:51.8101645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8101742Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8102005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8102072Z outputs = block( 2025-08-14T21:43:51.8102313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8102397Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8102850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8102970Z return func(*args, **kwargs) 2025-08-14T21:43:51.8103351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.8103474Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.8103738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:43:51.8103826Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:43:51.8104063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:43:51.8104188Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:43:51.8104192Z 2025-08-14T21:43:51.8104311Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:43:51.8104523Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:43:51.8104594Z return mod(**inputs) 2025-08-14T21:43:51.8104868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:43:51.8104963Z transformer_outputs = self.transformer( 2025-08-14T21:43:51.8105227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:43:51.8105303Z outputs = block( 2025-08-14T21:43:51.8105540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:43:51.8105633Z return super().__call__(*args, **kwargs) 2025-08-14T21:43:51.8105889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:43:51.8105963Z return func(*args, **kwargs) 2025-08-14T21:43:51.8106229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:43:51.8106338Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:43:51.8106769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:43:51.8106866Z hidden_states = self.act(hidden_states) 2025-08-14T21:43:51.8107092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:43:51.8107296Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:43:51.8107300Z 2025-08-14T21:43:51.8107412Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:43:51.8107626Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8107707Z return mod(**inputs)
2025-08-14T21:43:51.8107976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8108073Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8108340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8108407Z outputs = block(
2025-08-14T21:43:51.8108650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8108735Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8108991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8109075Z return func(*args, **kwargs)
2025-08-14T21:43:51.8109336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:43:51.8109452Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:43:51.8109712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:43:51.8109809Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:43:51.8110060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:43:51.8110181Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:43:51.8110185Z
2025-08-14T21:43:51.8110300Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8110505Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8110573Z return mod(**inputs)
2025-08-14T21:43:51.8110840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8110927Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8111182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8111257Z outputs = block(
2025-08-14T21:43:51.8111490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8111580Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8111834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8111907Z return func(*args, **kwargs)
2025-08-14T21:43:51.8112171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:43:51.8112276Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:43:51.8112529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:43:51.8112627Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:43:51.8112853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:43:51.8113050Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:43:51.8113054Z
2025-08-14T21:43:51.8113163Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:43:51.8113372Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8113447Z return mod(**inputs)
2025-08-14T21:43:51.8113707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8113801Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8114057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8114122Z outputs = block(
2025-08-14T21:43:51.8114360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8114447Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8114703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8114785Z return func(*args, **kwargs)
2025-08-14T21:43:51.8115041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
2025-08-14T21:43:51.8115161Z hidden_states = residual + feed_forward_hidden_states
2025-08-14T21:43:51.8115165Z
2025-08-14T21:43:51.8115272Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8115479Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8115556Z return mod(**inputs)
2025-08-14T21:43:51.8115818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8115910Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8116175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8116240Z outputs = block(
2025-08-14T21:43:51.8116481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8116565Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8116837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8116919Z return func(*args, **kwargs)
2025-08-14T21:43:51.8117187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:43:51.8117285Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:43:51.8117537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8117612Z return func(*args, **kwargs)
2025-08-14T21:43:51.8117885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:43:51.8118086Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:43:51.8118328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:43:51.8118454Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:43:51.8118458Z
2025-08-14T21:43:51.8118564Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8122487Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8122565Z return mod(**inputs)
2025-08-14T21:43:51.8122846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8122938Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8123198Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8123263Z outputs = block(
2025-08-14T21:43:51.8123499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8123583Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8123842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8123926Z return func(*args, **kwargs)
2025-08-14T21:43:51.8124196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:43:51.8124297Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:43:51.8124553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8124627Z return func(*args, **kwargs)
2025-08-14T21:43:51.8124897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:43:51.8125003Z attn_output, attn_weights = attention_interface(
2025-08-14T21:43:51.8125315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:43:51.8125464Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:43:51.8125469Z
2025-08-14T21:43:51.8125614Z cudagraph partition due to non gpu ops.
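This trace ends in torch.nn.functional.scaled_dot_product_attention, the SDPA path used by the transformers attention interface, followed in the next trace by the transpose/contiguous step. The sketch below is not from the log; the batch/head/sequence/head-dim sizes are assumptions chosen only to make it runnable.

import torch
import torch.nn.functional as F

batch, heads, seq, head_dim = 2, 12, 8, 64  # assumed, illustrative shapes

q = torch.randn(batch, heads, seq, head_dim)
k = torch.randn(batch, heads, seq, head_dim)
v = torch.randn(batch, heads, seq, head_dim)

# SDPA call as quoted in sdpa_attention.py, with causal masking.
attn_output = F.scaled_dot_product_attention(q, k, v, is_causal=True)

# The following frame in the log moves heads back next to head_dim.
attn_output = attn_output.transpose(1, 2).contiguous()
print(attn_output.shape)  # torch.Size([2, 8, 12, 64])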
Found from :
2025-08-14T21:43:51.8125882Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8125953Z return mod(**inputs)
2025-08-14T21:43:51.8126228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8126325Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8126593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8126668Z outputs = block(
2025-08-14T21:43:51.8126909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8126996Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8127266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8127342Z return func(*args, **kwargs)
2025-08-14T21:43:51.8127617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:43:51.8127722Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:43:51.8127986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8128066Z return func(*args, **kwargs)
2025-08-14T21:43:51.8128335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:43:51.8128442Z attn_output, attn_weights = attention_interface(
2025-08-14T21:43:51.8128841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:43:51.8128967Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:43:51.8128975Z
2025-08-14T21:43:51.8129098Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8129319Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8129390Z return mod(**inputs)
2025-08-14T21:43:51.8129670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8129760Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8130027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8130106Z outputs = block(
2025-08-14T21:43:51.8130342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8130436Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8130696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8130774Z return func(*args, **kwargs)
2025-08-14T21:43:51.8131049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:43:51.8131145Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:43:51.8131403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8131485Z return func(*args, **kwargs)
2025-08-14T21:43:51.8131748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:43:51.8131848Z attn_output = self.c_proj(attn_output)
2025-08-14T21:43:51.8132083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:43:51.8132211Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:43:51.8132255Z
2025-08-14T21:43:51.8132376Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8147636Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8147713Z return mod(**inputs)
2025-08-14T21:43:51.8147987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:43:51.8148082Z transformer_outputs = self.transformer(
2025-08-14T21:43:51.8148359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:43:51.8148427Z outputs = block(
2025-08-14T21:43:51.8148669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:43:51.8148751Z return super().__call__(*args, **kwargs)
2025-08-14T21:43:51.8149022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:43:51.8149104Z return func(*args, **kwargs)
2025-08-14T21:43:51.8149381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:43:51.8149494Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:43:51.8149772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:43:51.8149863Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:43:51.8150109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:43:51.8150235Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:43:51.8150242Z
2025-08-14T21:43:51.8150333Z cudagraph partition due to non gpu ops
2025-08-14T21:43:51.8150418Z cudagraph partition due to non gpu ops
2025-08-14T21:43:51.8150525Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:43:51.8150755Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8150823Z return mod(**inputs)
2025-08-14T21:43:51.8151097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1537, in forward
2025-08-14T21:43:51.8151258Z loss = loss_fct(pooled_logits.view(-1, self.num_labels), labels.view(-1))
2025-08-14T21:43:51.8151262Z
2025-08-14T21:43:51.8151433Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:43:51.8151665Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:43:51.8151774Z return mod(**inputs)
2025-08-14T21:43:51.8152073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1537, in forward
2025-08-14T21:43:51.8152235Z loss = loss_fct(pooled_logits.view(-1, self.num_labels), labels.view(-1))
2025-08-14T21:43:51.8152240Z
2025-08-14T21:44:08.8807426Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8807746Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8807982Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8808214Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8808448Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8808924Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8809167Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8809398Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8809626Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8809848Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8810106Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8810410Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.8810682Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:08.8811107Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8811482Z return mod(**inputs)
2025-08-14T21:44:08.8811929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1509, in forward
2025-08-14T21:44:08.8812421Z last_non_pad_token = (token_indices * non_pad_mask).argmax(-1)
2025-08-14T21:44:08.8812630Z
2025-08-14T21:44:08.8812750Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:44:08.8813157Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8813513Z return mod(**inputs) 2025-08-14T21:44:08.8813911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8814387Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8814826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8815249Z outputs = block( 2025-08-14T21:44:08.8815619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8816027Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8816453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8816918Z return func(*args, **kwargs) 2025-08-14T21:44:08.8817348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8817799Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8818236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8818652Z return func(*args, **kwargs) 2025-08-14T21:44:08.8819069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.8819620Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.8820119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8820599Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8820791Z 2025-08-14T21:44:08.8820916Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8821318Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8821658Z return mod(**inputs) 2025-08-14T21:44:08.8822579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8823017Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8823426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8823831Z outputs = block( 2025-08-14T21:44:08.8824188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8824586Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8824998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8825405Z return func(*args, **kwargs) 2025-08-14T21:44:08.8825804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8826242Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8826685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8827091Z return func(*args, **kwargs) 2025-08-14T21:44:08.8827485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.8828022Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.8828536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8828976Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8829167Z 2025-08-14T21:44:08.8829266Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8829501Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8829738Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8829971Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8830233Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8830639Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8830994Z return mod(**inputs) 2025-08-14T21:44:08.8831382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8831810Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8832228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8832640Z outputs = block( 2025-08-14T21:44:08.8833003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8833391Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8833813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8834241Z return func(*args, **kwargs) 2025-08-14T21:44:08.8834631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8835061Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8835481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8835883Z return func(*args, **kwargs) 2025-08-14T21:44:08.8836295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.8836732Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.8837211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:44:08.8837767Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:44:08.8838021Z 2025-08-14T21:44:08.8838137Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8838518Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8838873Z return mod(**inputs) 2025-08-14T21:44:08.8839263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8839697Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8840130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8840517Z outputs = block( 2025-08-14T21:44:08.8840858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8841241Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8841659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8842057Z return func(*args, **kwargs) 2025-08-14T21:44:08.8842457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8842888Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8843299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8843707Z return func(*args, **kwargs) 2025-08-14T21:44:08.8844100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.8844608Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.8845093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:44:08.8845601Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:44:08.8845781Z 2025-08-14T21:44:08.8845899Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8846293Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8846643Z return mod(**inputs) 2025-08-14T21:44:08.8847044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8847485Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8847897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8848303Z outputs = block( 2025-08-14T21:44:08.8848654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8849124Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8849540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8849955Z return func(*args, **kwargs) 2025-08-14T21:44:08.8850360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8850791Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8851215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8851623Z return func(*args, **kwargs) 2025-08-14T21:44:08.8852026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.8852445Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.8852845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8853422Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8853617Z 2025-08-14T21:44:08.8853739Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8854128Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8854500Z return mod(**inputs) 2025-08-14T21:44:08.8854896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8855320Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8855746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8856161Z outputs = block( 2025-08-14T21:44:08.8856511Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8856901Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8857317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8857726Z return func(*args, **kwargs) 2025-08-14T21:44:08.8858124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8858569Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8858988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8859391Z return func(*args, **kwargs) 2025-08-14T21:44:08.8859779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.8860200Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.8860588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8861024Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8861210Z 2025-08-14T21:44:08.8861322Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8861716Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8862092Z return mod(**inputs) 2025-08-14T21:44:08.8862475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8862912Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8863330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8863729Z outputs = block( 2025-08-14T21:44:08.8864070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8864466Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8864878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8865274Z return func(*args, **kwargs) 2025-08-14T21:44:08.8865672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.8866128Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.8866575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.8866991Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.8867378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8867813Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8868025Z 2025-08-14T21:44:08.8868167Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8868620Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8868987Z return mod(**inputs) 2025-08-14T21:44:08.8869382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8869805Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8870224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8870631Z outputs = block( 2025-08-14T21:44:08.8870981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8871367Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8871780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8872189Z return func(*args, **kwargs) 2025-08-14T21:44:08.8872583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.8873030Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.8873472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.8873897Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.8874279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8874712Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8874899Z 2025-08-14T21:44:08.8875020Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8875413Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8875766Z return mod(**inputs) 2025-08-14T21:44:08.8876164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8876594Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8877006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8877418Z outputs = block( 2025-08-14T21:44:08.8877765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8878156Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8878558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8878961Z return func(*args, **kwargs) 2025-08-14T21:44:08.8879359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.8879802Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.8880244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:44:08.8880673Z hidden_states = self.act(hidden_states) 2025-08-14T21:44:08.8881053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:08.8881559Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:08.8881816Z 2025-08-14T21:44:08.8881930Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8882323Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8882686Z return mod(**inputs) 2025-08-14T21:44:08.8883058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8883556Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8883967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8884355Z outputs = block( 2025-08-14T21:44:08.8884706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8885116Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8885515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8885899Z return func(*args, **kwargs) 2025-08-14T21:44:08.8886324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.8886774Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.8887220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.8887656Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.8888064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8888503Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8888770Z 2025-08-14T21:44:08.8888891Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8889295Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8889661Z return mod(**inputs) 2025-08-14T21:44:08.8890056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8890490Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8890911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8891333Z outputs = block( 2025-08-14T21:44:08.8891671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8892058Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8892459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8892852Z return func(*args, **kwargs) 2025-08-14T21:44:08.8893234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.8893666Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.8894096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.8894515Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.8894895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8895314Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8895496Z 2025-08-14T21:44:08.8895614Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8895990Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8896333Z return mod(**inputs) 2025-08-14T21:44:08.8896714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8897134Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8897534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8897928Z outputs = block( 2025-08-14T21:44:08.8898300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8898728Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8899128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8899520Z return func(*args, **kwargs) 2025-08-14T21:44:08.8899907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8900317Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8900722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8901117Z return func(*args, **kwargs) 2025-08-14T21:44:08.8901502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.8901994Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.8902476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8903159Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8903345Z 2025-08-14T21:44:08.8903458Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8903845Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8904196Z return mod(**inputs) 2025-08-14T21:44:08.8904559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8904949Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8905337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8905713Z outputs = block( 2025-08-14T21:44:08.8906051Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8906436Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8906843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8907234Z return func(*args, **kwargs) 2025-08-14T21:44:08.8907612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8908028Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8908434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8908823Z return func(*args, **kwargs) 2025-08-14T21:44:08.8909202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.8909728Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.8910210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.8910618Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.8910805Z 2025-08-14T21:44:08.8910892Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8911122Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8911348Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8911563Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.8911812Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.8912198Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.8912535Z return mod(**inputs) 2025-08-14T21:44:08.8912919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.8913504Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.8913919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.8914306Z outputs = block( 2025-08-14T21:44:08.8914646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.8915028Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.8915419Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8915817Z return func(*args, **kwargs) 2025-08-14T21:44:08.8916206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.8916621Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.8917032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.8917426Z return func(*args, **kwargs) 2025-08-14T21:44:08.8917817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.8918239Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.8918709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:44:08.8919213Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:44:08.8919406Z 2025-08-14T21:44:08.8919927Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:44:08.8920302Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8920649Z return mod(**inputs)
2025-08-14T21:44:08.8921039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.8921459Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.8921861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.8922253Z outputs = block(
2025-08-14T21:44:08.8922594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.8922965Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.8923363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8923754Z return func(*args, **kwargs)
2025-08-14T21:44:08.8924145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.8924560Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.8924971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8925365Z return func(*args, **kwargs)
2025-08-14T21:44:08.8925775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.8926216Z attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.8926697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:44:08.8927198Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:44:08.8927372Z
2025-08-14T21:44:08.8927486Z cudagraph partition due to non gpu ops.
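The two sdpa_attention.py frames above (lines 81 and 91) point at the scaled_dot_product_attention call and the transpose/contiguous that follows it. A simplified sketch of that pair of operations is shown below; the function name, signature, and tensor shapes are assumptions for illustration, not the transformers implementation itself.

```python
# Simplified sketch of the two sdpa_attention.py call sites in the traces above:
# the F.scaled_dot_product_attention call and the transpose/contiguous after it.
import torch
import torch.nn.functional as F

def sdpa_attention_sketch(query, key, value, attn_mask=None, is_causal=True):
    # corresponds to sdpa_attention.py:81 in the trace
    attn_output = F.scaled_dot_product_attention(
        query, key, value, attn_mask=attn_mask, is_causal=is_causal
    )
    # corresponds to sdpa_attention.py:91: back to (batch, seq, heads, head_dim)
    attn_output = attn_output.transpose(1, 2).contiguous()
    return attn_output

q = k = v = torch.randn(2, 12, 8, 64)  # (batch, heads, seq, head_dim)
print(sdpa_attention_sketch(q, k, v).shape)  # torch.Size([2, 8, 12, 64])
```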
Found from :
2025-08-14T21:44:08.8927880Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8928281Z return mod(**inputs)
2025-08-14T21:44:08.8928829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.8929267Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.8929701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.8930184Z outputs = block(
2025-08-14T21:44:08.8930528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.8930917Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.8931324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8931726Z return func(*args, **kwargs)
2025-08-14T21:44:08.8932114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.8932547Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.8932962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8933357Z return func(*args, **kwargs)
2025-08-14T21:44:08.8933742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.8934155Z attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.8934539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.8934957Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.8935149Z
2025-08-14T21:44:08.8935260Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.8942852Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8944113Z return mod(**inputs)
2025-08-14T21:44:08.8944516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.8944930Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.8945338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.8945737Z outputs = block(
2025-08-14T21:44:08.8946079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.8946464Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.8946859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8947238Z return func(*args, **kwargs)
2025-08-14T21:44:08.8947609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.8948024Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.8948437Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.8948831Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.8949191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.8949580Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.8949758Z
2025-08-14T21:44:08.8949862Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.8956794Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8957143Z return mod(**inputs)
2025-08-14T21:44:08.8957527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.8957941Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.8958352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.8958800Z outputs = block(
2025-08-14T21:44:08.8959179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.8959565Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.8959965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8960358Z return func(*args, **kwargs)
2025-08-14T21:44:08.8960739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.8961173Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.8961611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.8962017Z hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.8962390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.8962880Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.8963126Z
2025-08-14T21:44:08.8963243Z cudagraph partition due to non gpu ops.
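The activations.py:47 frame above is the tanh-approximation GELU used by the MLP. The expression in the sketch below is copied verbatim from that frame; wrapping it in a standalone function is only for illustration.

```python
# Tanh-approximation GELU from activations.py:47 in the trace above.
# The expression is verbatim from the log; the wrapper function is assumed.
import math
import torch

def gelu_tanh(input: torch.Tensor) -> torch.Tensor:
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

x = torch.linspace(-3, 3, 7)
print(gelu_tanh(x))
```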
Found from :
2025-08-14T21:44:08.8963620Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.8963967Z return mod(**inputs)
2025-08-14T21:44:08.8964347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.8964768Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.8965170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.8965558Z outputs = block(
2025-08-14T21:44:08.8965899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.8966277Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.8966679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.8967077Z return func(*args, **kwargs)
2025-08-14T21:44:08.8967475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.8967910Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.8968350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.8968872Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.8969272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.8969697Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.8969900Z
2025-08-14T21:44:08.8970017Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9056690Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9057024Z return mod(**inputs)
2025-08-14T21:44:08.9057405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9057820Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9058226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9058619Z outputs = block(
2025-08-14T21:44:08.9058957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9059336Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9059730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9060120Z return func(*args, **kwargs)
2025-08-14T21:44:08.9060504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
2025-08-14T21:44:08.9060933Z hidden_states = residual + feed_forward_hidden_states
2025-08-14T21:44:08.9061107Z
2025-08-14T21:44:08.9061214Z cudagraph partition due to non gpu ops.
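Every trace in this block starts at the benchmark's forward_pass (huggingface.py:532, `return mod(**inputs)`), with the model wrapped by torch.compile so Inductor can decide where to partition the graph. A hedged sketch of that call shape is shown below; the "gpt2" model name, the plain torch.compile call, and the toy inputs are assumptions for illustration, not read from this job's harness or its inductor configuration.

```python
# Hedged sketch of the forward_pass call shape seen at the top of every trace
# above (huggingface.py:532, `return mod(**inputs)`). The model name, device,
# and compile options are assumptions, not this job's actual configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()
compiled = torch.compile(model)  # the benchmark suite supplies its own backend/options

inputs = tok("hello world", return_tensors="pt")

def forward_pass(mod, inputs):
    # mirrors the frame at huggingface.py:532 in the traces
    return mod(**inputs)

with torch.no_grad():
    out = forward_pass(compiled, dict(inputs))
print(out.logits.shape)
```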
Found from : 2025-08-14T21:44:08.9149226Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9149568Z return mod(**inputs) 2025-08-14T21:44:08.9149950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9150359Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9150859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9151257Z outputs = block( 2025-08-14T21:44:08.9151597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9151991Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9152397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9152797Z return func(*args, **kwargs) 2025-08-14T21:44:08.9153183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9153609Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9154024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9154430Z return func(*args, **kwargs) 2025-08-14T21:44:08.9154820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.9155344Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.9155837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9156255Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9156462Z 2025-08-14T21:44:08.9156556Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9156801Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9157038Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9157266Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9157529Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
    hidden_states = self.act(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
    hidden_states = residual + feed_forward_hidden_states

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
    attn_output = self.c_proj(attn_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
    hidden_states = self.c_fc(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
    hidden_states = self.act(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
    return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
    feed_forward_hidden_states = self.mlp(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
    hidden_states = self.c_proj(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
    outputs = block(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
    attn_output, self_attn_weights = self.attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
    query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
    x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:44:08.9260672Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9260742Z return mod(**inputs) 2025-08-14T21:44:08.9261014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9261101Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9261364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9261443Z outputs = block( 2025-08-14T21:44:08.9261672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9261754Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9261990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9262057Z return func(*args, **kwargs) 2025-08-14T21:44:08.9262306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9262390Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9262627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9262701Z return func(*args, **kwargs) 2025-08-14T21:44:08.9262999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9263122Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9263409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:44:08.9263538Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:44:08.9263542Z 2025-08-14T21:44:08.9263653Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9263850Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9263926Z return mod(**inputs) 2025-08-14T21:44:08.9264177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9264257Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9264508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9264576Z outputs = block( 2025-08-14T21:44:08.9264806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9264898Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9265151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9265232Z return func(*args, **kwargs) 2025-08-14T21:44:08.9265489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9265581Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9265841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9265913Z return func(*args, **kwargs) 2025-08-14T21:44:08.9266185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9266287Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9266587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:44:08.9266704Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:44:08.9266707Z 2025-08-14T21:44:08.9266809Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9267007Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9267082Z return mod(**inputs) 2025-08-14T21:44:08.9267331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9267424Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9267672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9267735Z outputs = block( 2025-08-14T21:44:08.9267960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9268040Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9268287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9268362Z return func(*args, **kwargs) 2025-08-14T21:44:08.9268599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9268691Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9268927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9269028Z return func(*args, **kwargs) 2025-08-14T21:44:08.9269309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9269392Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9269626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9269749Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9269753Z 2025-08-14T21:44:08.9269862Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9270077Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9270145Z return mod(**inputs) 2025-08-14T21:44:08.9270406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9270502Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9270764Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9270837Z outputs = block( 2025-08-14T21:44:08.9271067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9271150Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9271421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9271488Z return func(*args, **kwargs) 2025-08-14T21:44:08.9271729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9271822Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9272061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9272138Z return func(*args, **kwargs) 2025-08-14T21:44:08.9272383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9272463Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9272695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9272817Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9272820Z 2025-08-14T21:44:08.9272934Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9273140Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9273216Z return mod(**inputs)
2025-08-14T21:44:08.9273465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9273545Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9273794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9273864Z outputs = block(
2025-08-14T21:44:08.9274082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9274167Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9274404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9274471Z return func(*args, **kwargs)
2025-08-14T21:44:08.9274722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9274825Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9275082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9275208Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9275468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9275600Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9275604Z 
2025-08-14T21:44:08.9275712Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9275922Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9276001Z return mod(**inputs)
2025-08-14T21:44:08.9276267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9276373Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9276626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9276693Z outputs = block(
2025-08-14T21:44:08.9276936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9277019Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9277269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9277349Z return func(*args, **kwargs)
2025-08-14T21:44:08.9277603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9277728Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9277969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9278049Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9278268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9278387Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9278390Z 
2025-08-14T21:44:08.9278499Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9278692Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9278757Z return mod(**inputs)
2025-08-14T21:44:08.9279008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9279090Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9279333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9279402Z outputs = block(
2025-08-14T21:44:08.9279630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9279721Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9279974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9280045Z return func(*args, **kwargs)
2025-08-14T21:44:08.9280307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9280414Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9280667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.9280760Z hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.9280980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.9281175Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.9281218Z 
2025-08-14T21:44:08.9281327Z cudagraph partition due to non gpu ops. 
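Editor's note: the activations.py line 47 frame above is transformers' tanh approximation of GELU. A small sketch (not from the log) checking that the expression in that frame matches torch's built-in tanh-approximate GELU:

    import math
    import torch
    import torch.nn.functional as F

    x = torch.randn(4, 8)
    # the exact expression shown in the stack trace
    new_gelu = 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))
    # should agree with the built-in approximation to float32 precision
    assert torch.allclose(new_gelu, F.gelu(x, approximate="tanh"), atol=1e-6)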
Found from : 
2025-08-14T21:44:08.9281570Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9281648Z return mod(**inputs)
2025-08-14T21:44:08.9281912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9282003Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9282263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9282329Z outputs = block(
2025-08-14T21:44:08.9282566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9282649Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9282902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9282987Z return func(*args, **kwargs)
2025-08-14T21:44:08.9283246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9283362Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9283619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9283709Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9283944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9284065Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9284069Z 
2025-08-14T21:44:08.9284184Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9284397Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9284470Z return mod(**inputs)
2025-08-14T21:44:08.9284747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9284833Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9285094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9285169Z outputs = block(
2025-08-14T21:44:08.9285399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9285489Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9285754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9285829Z return func(*args, **kwargs)
2025-08-14T21:44:08.9286104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9286220Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9286505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9286598Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9286841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9286973Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9286976Z 
2025-08-14T21:44:08.9287089Z cudagraph partition due to non gpu ops. Found from : 
2025-08-14T21:44:08.9287320Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9287396Z return mod(**inputs)
2025-08-14T21:44:08.9287666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9287840Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9288112Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9288180Z outputs = block(
2025-08-14T21:44:08.9288423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9288507Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9288852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9288940Z return func(*args, **kwargs)
2025-08-14T21:44:08.9289211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
2025-08-14T21:44:08.9289334Z hidden_states = residual + feed_forward_hidden_states
2025-08-14T21:44:08.9289342Z 
2025-08-14T21:44:08.9289454Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9289670Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9289751Z return mod(**inputs)
2025-08-14T21:44:08.9290019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9290119Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9290392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9290460Z outputs = block(
2025-08-14T21:44:08.9290698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9290782Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9291033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9291120Z return func(*args, **kwargs)
2025-08-14T21:44:08.9291380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9291482Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9291732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9291803Z return func(*args, **kwargs)
2025-08-14T21:44:08.9292066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:44:08.9292259Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:44:08.9292491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9292613Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9292619Z 
2025-08-14T21:44:08.9292729Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9292949Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9293021Z return mod(**inputs)
2025-08-14T21:44:08.9293289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9293387Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9293657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9293732Z outputs = block(
2025-08-14T21:44:08.9293961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9294045Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9294334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9294460Z return func(*args, **kwargs)
2025-08-14T21:44:08.9294719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9294821Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9295072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9295152Z return func(*args, **kwargs)
2025-08-14T21:44:08.9295409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:44:08.9295598Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:44:08.9295835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9295961Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9295967Z 
2025-08-14T21:44:08.9296058Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9296142Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9296223Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9296309Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9296415Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9296623Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9296700Z return mod(**inputs)
2025-08-14T21:44:08.9296962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9297054Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9297314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9297382Z outputs = block(
2025-08-14T21:44:08.9297626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9297710Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9297964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9298044Z return func(*args, **kwargs)
2025-08-14T21:44:08.9298302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9298401Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9298653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9298726Z return func(*args, **kwargs)
2025-08-14T21:44:08.9298989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9299095Z attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9299401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:44:08.9299550Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:44:08.9299553Z 
2025-08-14T21:44:08.9299659Z cudagraph partition due to non gpu ops. 
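Editor's note: the sdpa_attention.py frames point at torch's scaled_dot_product_attention followed by a transpose/contiguous reshape. A minimal sketch of that call pattern, with shapes assumed here ((batch, heads, seq, head_dim)), not taken from the log:

    import torch
    import torch.nn.functional as F

    q = torch.randn(2, 12, 16, 64)   # (batch, num_heads, seq_len, head_dim)
    k = torch.randn(2, 12, 16, 64)
    v = torch.randn(2, 12, 16, 64)

    # the line 81 frame: fused attention kernel
    attn_output = F.scaled_dot_product_attention(q, k, v, is_causal=True)
    # the line 91 frame: move heads back next to head_dim, (batch, seq, heads, head_dim)
    attn_output = attn_output.transpose(1, 2).contiguous()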
Found from : 
2025-08-14T21:44:08.9299873Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9299942Z return mod(**inputs)
2025-08-14T21:44:08.9300202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9300295Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9300551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9300694Z outputs = block(
2025-08-14T21:44:08.9300925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9301008Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9301266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9301337Z return func(*args, **kwargs)
2025-08-14T21:44:08.9301590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9301690Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9301939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9302019Z return func(*args, **kwargs)
2025-08-14T21:44:08.9302278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9302383Z attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9302880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:44:08.9303007Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:44:08.9303011Z 
2025-08-14T21:44:08.9303126Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9303334Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9303404Z return mod(**inputs)
2025-08-14T21:44:08.9303672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9303760Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9304021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9304097Z outputs = block(
2025-08-14T21:44:08.9304327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9304418Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9304671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9304743Z return func(*args, **kwargs)
2025-08-14T21:44:08.9305004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9305095Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9305349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9305431Z return func(*args, **kwargs)
2025-08-14T21:44:08.9305700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.9305789Z attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.9306003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9306118Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9306121Z 
2025-08-14T21:44:08.9306229Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9306426Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9306496Z return mod(**inputs)
2025-08-14T21:44:08.9306743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9306824Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9307187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9307277Z outputs = block(
2025-08-14T21:44:08.9307501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9307594Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9307849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9307928Z return func(*args, **kwargs)
2025-08-14T21:44:08.9308187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9308279Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9308540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9308614Z return func(*args, **kwargs)
2025-08-14T21:44:08.9308873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.9308968Z attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.9309197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9309327Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9309330Z 
2025-08-14T21:44:08.9309446Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9309644Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9309718Z return mod(**inputs)
2025-08-14T21:44:08.9309963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9310054Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9310301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9310362Z outputs = block(
2025-08-14T21:44:08.9310598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9310680Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9310933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9311013Z return func(*args, **kwargs)
2025-08-14T21:44:08.9311269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9311387Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9311644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9311733Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9311969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9312099Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9312103Z 
2025-08-14T21:44:08.9312216Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9312424Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9312492Z return mod(**inputs)
2025-08-14T21:44:08.9312767Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9312855Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9313120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9313195Z outputs = block(
2025-08-14T21:44:08.9313503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9313598Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9313856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9313930Z return func(*args, **kwargs)
2025-08-14T21:44:08.9314203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9314317Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9314581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9314673Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9314903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9315046Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9315052Z 
2025-08-14T21:44:08.9315152Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9315347Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9315421Z return mod(**inputs)
2025-08-14T21:44:08.9315672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9315763Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9316005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9316066Z outputs = block(
2025-08-14T21:44:08.9316291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9316372Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9316629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9316707Z return func(*args, **kwargs)
2025-08-14T21:44:08.9316963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9317076Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9317330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.9317413Z hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.9317643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.9317831Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.9317835Z 
2025-08-14T21:44:08.9317949Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9318159Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9318228Z return mod(**inputs)
2025-08-14T21:44:08.9318497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9318582Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9318849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9318916Z outputs = block(
2025-08-14T21:44:08.9319131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9319216Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9319457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9319563Z return func(*args, **kwargs)
2025-08-14T21:44:08.9319861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9319970Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9320227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9320323Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9320546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9320674Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9320677Z 
2025-08-14T21:44:08.9320783Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9320988Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9321067Z return mod(**inputs)
2025-08-14T21:44:08.9321330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9321423Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9321679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9321745Z outputs = block(
2025-08-14T21:44:08.9321980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9322063Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9322313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9322392Z return func(*args, **kwargs)
2025-08-14T21:44:08.9322648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9322770Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9323029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9323118Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9323352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9323474Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9323478Z 
2025-08-14T21:44:08.9323590Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9323799Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9323870Z return mod(**inputs)
2025-08-14T21:44:08.9324150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9324243Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9324507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9324584Z outputs = block(
2025-08-14T21:44:08.9324821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9324914Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9325183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9325258Z return func(*args, **kwargs)
2025-08-14T21:44:08.9325534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9325628Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9325893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9326052Z return func(*args, **kwargs)
2025-08-14T21:44:08.9326336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:44:08.9326545Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:44:08.9326787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9326911Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9326915Z 
2025-08-14T21:44:08.9327031Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9327255Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9327331Z return mod(**inputs)
2025-08-14T21:44:08.9327605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9327699Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9327984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9328048Z outputs = block(
2025-08-14T21:44:08.9328285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9328378Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9328644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9328791Z return func(*args, **kwargs)
2025-08-14T21:44:08.9329074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9329167Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9329460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9329535Z return func(*args, **kwargs)
2025-08-14T21:44:08.9329819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:44:08.9330008Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:44:08.9330239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9330372Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9330375Z 
2025-08-14T21:44:08.9330462Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9330547Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9330639Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9330720Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9330841Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9331053Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9331125Z return mod(**inputs)
2025-08-14T21:44:08.9331398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9331486Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9331744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9331818Z outputs = block(
2025-08-14T21:44:08.9332050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9332140Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9332394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9332507Z return func(*args, **kwargs)
2025-08-14T21:44:08.9332804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9332897Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9333149Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9333227Z return func(*args, **kwargs)
2025-08-14T21:44:08.9333481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9333589Z attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9333892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:44:08.9334032Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:44:08.9334039Z 
2025-08-14T21:44:08.9334157Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9334369Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9334442Z return mod(**inputs)
2025-08-14T21:44:08.9334705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9334790Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9335054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9335117Z outputs = block(
2025-08-14T21:44:08.9335348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9335438Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9335687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9335771Z return func(*args, **kwargs)
2025-08-14T21:44:08.9336028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9336119Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9336377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9336448Z return func(*args, **kwargs)
2025-08-14T21:44:08.9336709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9336811Z attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9337115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:44:08.9337244Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:44:08.9337248Z 
2025-08-14T21:44:08.9337360Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9337569Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9337644Z return mod(**inputs)
2025-08-14T21:44:08.9337902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9337995Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9338251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9338315Z outputs = block(
2025-08-14T21:44:08.9338554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9338638Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9339327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9339411Z return func(*args, **kwargs)
2025-08-14T21:44:08.9339672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9339773Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9340028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9340104Z return func(*args, **kwargs)
2025-08-14T21:44:08.9340371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.9340455Z attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.9340680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9340805Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9340808Z 
2025-08-14T21:44:08.9340916Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9341125Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9341194Z return mod(**inputs)
2025-08-14T21:44:08.9341445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9341540Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9341783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9341857Z outputs = block(
2025-08-14T21:44:08.9342091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9342178Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9342450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9342535Z return func(*args, **kwargs)
2025-08-14T21:44:08.9342779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9342878Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9343119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9343195Z return func(*args, **kwargs)
2025-08-14T21:44:08.9343439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.9343522Z attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.9343749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9343877Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9343884Z 
2025-08-14T21:44:08.9344001Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9344210Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9344281Z return mod(**inputs)
2025-08-14T21:44:08.9344551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9344643Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9344902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9344979Z outputs = block(
2025-08-14T21:44:08.9345212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9345305Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9345631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9345705Z return func(*args, **kwargs)
2025-08-14T21:44:08.9345970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9346081Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9346342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9346424Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9346649Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9346776Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9346779Z 
2025-08-14T21:44:08.9346886Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9347099Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9347178Z return mod(**inputs)
2025-08-14T21:44:08.9347439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9347544Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9347781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9347842Z outputs = block(
2025-08-14T21:44:08.9348065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9348144Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9348383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9348460Z return func(*args, **kwargs)
2025-08-14T21:44:08.9348704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9348814Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9349054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9349134Z hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9349354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9349472Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9349476Z 
2025-08-14T21:44:08.9349589Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9349797Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9349866Z return mod(**inputs)
2025-08-14T21:44:08.9350138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9350223Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9350474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9350547Z outputs = block(
2025-08-14T21:44:08.9350776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9350866Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9351116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9351199Z return func(*args, **kwargs)
2025-08-14T21:44:08.9351448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9351617Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9351897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.9351986Z hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.9352195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.9352385Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.9352389Z 
2025-08-14T21:44:08.9352494Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9352702Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9352778Z return mod(**inputs)
2025-08-14T21:44:08.9353037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9353132Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9353391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9353456Z outputs = block(
2025-08-14T21:44:08.9353692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9353775Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9354028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9354106Z return func(*args, **kwargs)
2025-08-14T21:44:08.9354361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9354476Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9354733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9354829Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9355063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9355186Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9355189Z 
2025-08-14T21:44:08.9355303Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9355510Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9355578Z return mod(**inputs)
2025-08-14T21:44:08.9355845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9355933Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9356188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9356264Z outputs = block(
2025-08-14T21:44:08.9356498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9356590Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9356844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9356916Z return func(*args, **kwargs)
2025-08-14T21:44:08.9357179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9357284Z feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9357546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9357636Z hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9357879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9358081Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9358085Z 
2025-08-14T21:44:08.9358193Z cudagraph partition due to non gpu ops. Found from : 
2025-08-14T21:44:08.9358399Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9358475Z return mod(**inputs)
2025-08-14T21:44:08.9358734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9358830Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9359085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9359152Z outputs = block(
2025-08-14T21:44:08.9359393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9359480Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9359735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9359815Z return func(*args, **kwargs)
2025-08-14T21:44:08.9360071Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward
2025-08-14T21:44:08.9360192Z hidden_states = residual + feed_forward_hidden_states
2025-08-14T21:44:08.9360195Z 
2025-08-14T21:44:08.9360303Z cudagraph partition due to non gpu ops. 
Found from : 
2025-08-14T21:44:08.9360511Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9360587Z return mod(**inputs)
2025-08-14T21:44:08.9360848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9360951Z transformer_outputs = self.transformer(
2025-08-14T21:44:08.9361209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9361277Z outputs = block(
2025-08-14T21:44:08.9361514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9361597Z return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9361849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9361927Z return func(*args, **kwargs)
2025-08-14T21:44:08.9362183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9362284Z attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9362535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9362609Z return func(*args, **kwargs)
2025-08-14T21:44:08.9362875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward
2025-08-14T21:44:08.9363071Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2)
2025-08-14T21:44:08.9363306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9363428Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9363431Z 
2025-08-14T21:44:08.9363537Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:44:08.9367435Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9367513Z     return mod(**inputs)
2025-08-14T21:44:08.9367787Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9367884Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9368160Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9368229Z     outputs = block(
2025-08-14T21:44:08.9368473Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9368558Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9368908Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9369003Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9369279Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9369381Z     attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9369650Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9369724Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9370039Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9370148Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9370465Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:44:08.9370619Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:44:08.9370663Z 
2025-08-14T21:44:08.9370777Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9371039Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9371113Z     return mod(**inputs)
2025-08-14T21:44:08.9371383Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9371483Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9371755Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9371830Z     outputs = block(
2025-08-14T21:44:08.9372067Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9372153Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9372426Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9372508Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9372790Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9372891Z     attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9373157Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9373239Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9373512Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward
2025-08-14T21:44:08.9373620Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:44:08.9373942Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:44:08.9374065Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:44:08.9374072Z 
2025-08-14T21:44:08.9374194Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9374410Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9374480Z     return mod(**inputs)
2025-08-14T21:44:08.9374768Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9374857Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9375119Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9375194Z     outputs = block(
2025-08-14T21:44:08.9375431Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9375527Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9375785Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9375864Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9376136Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward
2025-08-14T21:44:08.9376230Z     attn_output, self_attn_weights = self.attn(
2025-08-14T21:44:08.9376493Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9376574Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9376840Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward
2025-08-14T21:44:08.9376936Z     attn_output = self.c_proj(attn_output)
2025-08-14T21:44:08.9377168Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9377296Z     x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9377344Z 
2025-08-14T21:44:08.9377500Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9380949Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9381026Z     return mod(**inputs)
2025-08-14T21:44:08.9381290Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9381383Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9381638Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9381703Z     outputs = block(
2025-08-14T21:44:08.9381943Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9382027Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9382292Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9382371Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9382636Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9398302Z     feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9398813Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward
2025-08-14T21:44:08.9398925Z     hidden_states = self.c_fc(hidden_states)
2025-08-14T21:44:08.9399175Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9399321Z     x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9399328Z 
2025-08-14T21:44:08.9399451Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9403209Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9403282Z     return mod(**inputs)
2025-08-14T21:44:08.9403564Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9403660Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9403918Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9403996Z     outputs = block(
2025-08-14T21:44:08.9404230Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9404324Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9404579Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9404655Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9404921Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9405035Z     feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9405305Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.9405403Z     hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.9405634Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.9405841Z     return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.9405845Z 
2025-08-14T21:44:08.9405963Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:44:08.9406181Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9406262Z return mod(**inputs) 2025-08-14T21:44:08.9406534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9406736Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9407116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9407188Z outputs = block( 2025-08-14T21:44:08.9407438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9407524Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9407798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9407882Z return func(*args, **kwargs) 2025-08-14T21:44:08.9408155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9408272Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9408538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.9408640Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.9409143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9409279Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9409284Z 2025-08-14T21:44:08.9409404Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9409620Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9409693Z return mod(**inputs) 2025-08-14T21:44:08.9409972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9410063Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9410338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9410423Z outputs = block( 2025-08-14T21:44:08.9410664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9410755Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9411016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9411092Z return func(*args, **kwargs) 2025-08-14T21:44:08.9411367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9411479Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9411743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.9411848Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.9412085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9412230Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9412234Z 2025-08-14T21:44:08.9412349Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9412564Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9412646Z return mod(**inputs) 2025-08-14T21:44:08.9412917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9413017Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9413289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9413358Z outputs = block( 2025-08-14T21:44:08.9413601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9413730Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9414022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9414107Z return func(*args, **kwargs) 2025-08-14T21:44:08.9414381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9414486Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9414746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9414819Z return func(*args, **kwargs) 2025-08-14T21:44:08.9415091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.9415295Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.9415543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9415667Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9415671Z 2025-08-14T21:44:08.9415786Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9416005Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9416075Z return mod(**inputs) 2025-08-14T21:44:08.9416342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9416448Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9416691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9416762Z outputs = block( 2025-08-14T21:44:08.9416979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9417063Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9417316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9417387Z return func(*args, **kwargs) 2025-08-14T21:44:08.9417641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9417745Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9417993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9418073Z return func(*args, **kwargs) 2025-08-14T21:44:08.9418328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.9418518Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.9418759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9418880Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9418883Z 2025-08-14T21:44:08.9418980Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9419066Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9419154Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9419237Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9419339Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9419535Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9419611Z return mod(**inputs) 2025-08-14T21:44:08.9419859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9419985Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9420257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9420322Z outputs = block( 2025-08-14T21:44:08.9420552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9420633Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9420871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9420953Z return func(*args, **kwargs) 2025-08-14T21:44:08.9421210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9421312Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9421565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9421650Z return func(*args, **kwargs) 2025-08-14T21:44:08.9421904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9422004Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9422310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:44:08.9422459Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:44:08.9422463Z 2025-08-14T21:44:08.9422570Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9422787Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9422855Z return mod(**inputs) 2025-08-14T21:44:08.9423122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9423219Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9423486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9423556Z outputs = block( 2025-08-14T21:44:08.9423773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9423853Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9424099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9424169Z return func(*args, **kwargs) 2025-08-14T21:44:08.9424412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9424509Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9424750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9424832Z return func(*args, **kwargs) 2025-08-14T21:44:08.9425077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9425175Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9425488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:44:08.9425607Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:44:08.9425611Z 2025-08-14T21:44:08.9425728Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9425935Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9426005Z return mod(**inputs) 2025-08-14T21:44:08.9426273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9426431Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9426692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9426766Z outputs = block( 2025-08-14T21:44:08.9426999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9427089Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9427351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9427421Z return func(*args, **kwargs) 2025-08-14T21:44:08.9427678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9427765Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9428013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9428091Z return func(*args, **kwargs) 2025-08-14T21:44:08.9428338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9428432Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9428662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9428788Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9428791Z 2025-08-14T21:44:08.9428910Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9429123Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9429199Z return mod(**inputs) 2025-08-14T21:44:08.9429467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9429561Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9429834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9429900Z outputs = block( 2025-08-14T21:44:08.9430135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9430226Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9430484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9430559Z return func(*args, **kwargs) 2025-08-14T21:44:08.9430807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9430893Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9431150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9431217Z return func(*args, **kwargs) 2025-08-14T21:44:08.9431464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9431553Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9431770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9431895Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9431898Z 2025-08-14T21:44:08.9432001Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9432203Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9432275Z return mod(**inputs) 2025-08-14T21:44:08.9432527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9432690Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9432935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9432998Z outputs = block( 2025-08-14T21:44:08.9433226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9433306Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9433556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9433636Z return func(*args, **kwargs) 2025-08-14T21:44:08.9433893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9434009Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9434273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.9434359Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.9434596Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9434716Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9434719Z 2025-08-14T21:44:08.9434834Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9435043Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9435110Z return mod(**inputs) 2025-08-14T21:44:08.9435381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9435468Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9435731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9435808Z outputs = block( 2025-08-14T21:44:08.9436040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9436131Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9436382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9436453Z return func(*args, **kwargs) 2025-08-14T21:44:08.9436716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9436825Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9437084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.9437178Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.9437410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9437537Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9437540Z 2025-08-14T21:44:08.9437647Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9437855Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9437930Z return mod(**inputs) 2025-08-14T21:44:08.9438190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9438283Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9438539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9438604Z outputs = block( 2025-08-14T21:44:08.9438914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9438996Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9439247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9439324Z return func(*args, **kwargs) 2025-08-14T21:44:08.9439577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9439692Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9439947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward 2025-08-14T21:44:08.9440030Z hidden_states = self.act(hidden_states) 2025-08-14T21:44:08.9440258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:08.9440455Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:08.9440460Z 2025-08-14T21:44:08.9440574Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9440785Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9440853Z return mod(**inputs) 2025-08-14T21:44:08.9441120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9441206Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9441462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9441535Z outputs = block( 2025-08-14T21:44:08.9441761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9441857Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9442109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9442181Z return func(*args, **kwargs) 2025-08-14T21:44:08.9442445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9442553Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9442808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.9442908Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.9443133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9443262Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9443266Z 2025-08-14T21:44:08.9443382Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9443597Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9443678Z return mod(**inputs) 2025-08-14T21:44:08.9443946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9444040Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9444301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9444369Z outputs = block( 2025-08-14T21:44:08.9444609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9444694Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9444953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9445075Z return func(*args, **kwargs) 2025-08-14T21:44:08.9445372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9445492Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9445752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward 2025-08-14T21:44:08.9445845Z hidden_states = self.c_proj(hidden_states) 2025-08-14T21:44:08.9446086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9446209Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9446212Z 2025-08-14T21:44:08.9446328Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:44:08.9446541Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9446615Z return mod(**inputs) 2025-08-14T21:44:08.9446892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9446981Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9447241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9447316Z outputs = block( 2025-08-14T21:44:08.9447547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9447638Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9447895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9447969Z return func(*args, **kwargs) 2025-08-14T21:44:08.9448238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 442, in forward 2025-08-14T21:44:08.9448362Z hidden_states = residual + feed_forward_hidden_states 2025-08-14T21:44:08.9448367Z 2025-08-14T21:44:08.9448488Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9448780Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9448860Z return mod(**inputs) 2025-08-14T21:44:08.9449136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9449225Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9449491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9449571Z outputs = block( 2025-08-14T21:44:08.9449805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9449901Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9450170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9450245Z return func(*args, **kwargs) 2025-08-14T21:44:08.9450520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9450616Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9450874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9450958Z return func(*args, **kwargs) 2025-08-14T21:44:08.9451220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.9451429Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.9451689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9451866Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9451871Z 2025-08-14T21:44:08.9451995Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9452216Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9452297Z return mod(**inputs) 2025-08-14T21:44:08.9452565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9452656Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9452926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9452993Z outputs = block( 2025-08-14T21:44:08.9453228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9453326Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9453588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9453671Z return func(*args, **kwargs) 2025-08-14T21:44:08.9453932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9454025Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9454292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9454367Z return func(*args, **kwargs) 2025-08-14T21:44:08.9454627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 294, in forward 2025-08-14T21:44:08.9454830Z query_states, key_states, value_states = self.c_attn(hidden_states).split(self.split_size, dim=2) 2025-08-14T21:44:08.9455067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9455197Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9455201Z 2025-08-14T21:44:08.9455289Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9455376Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9455470Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9455551Z cudagraph partition due to non gpu ops 2025-08-14T21:44:08.9455668Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9455890Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9455959Z return mod(**inputs) 2025-08-14T21:44:08.9456234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9456325Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9456592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9456670Z outputs = block( 2025-08-14T21:44:08.9456905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9457000Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9457259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9457334Z return func(*args, **kwargs) 2025-08-14T21:44:08.9457604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9457698Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9457962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9458094Z return func(*args, **kwargs) 2025-08-14T21:44:08.9458449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9458563Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9458875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:44:08.9459016Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:44:08.9459020Z 2025-08-14T21:44:08.9459138Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9459352Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9459431Z return mod(**inputs) 2025-08-14T21:44:08.9459701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9459792Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9460067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9460135Z outputs = block( 2025-08-14T21:44:08.9460370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9460466Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9460723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9460804Z return func(*args, **kwargs) 2025-08-14T21:44:08.9461068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9461159Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9461435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9461518Z return func(*args, **kwargs) 2025-08-14T21:44:08.9461782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 336, in forward 2025-08-14T21:44:08.9461895Z attn_output, attn_weights = attention_interface( 2025-08-14T21:44:08.9462205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:44:08.9462331Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:44:08.9462336Z 2025-08-14T21:44:08.9462445Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9462661Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9462741Z return mod(**inputs) 2025-08-14T21:44:08.9463007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9463108Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9463373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9463441Z outputs = block( 2025-08-14T21:44:08.9463685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9463769Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9464025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9464106Z return func(*args, **kwargs) 2025-08-14T21:44:08.9464373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9464470Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9464717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9464856Z return func(*args, **kwargs) 2025-08-14T21:44:08.9465119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9465204Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9465428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9465559Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9465563Z 2025-08-14T21:44:08.9465670Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9465883Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9465949Z return mod(**inputs) 2025-08-14T21:44:08.9466207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9466305Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9466559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9466633Z outputs = block( 2025-08-14T21:44:08.9466861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9466944Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9467199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9467269Z return func(*args, **kwargs) 2025-08-14T21:44:08.9467522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 404, in forward 2025-08-14T21:44:08.9467619Z attn_output, self_attn_weights = self.attn( 2025-08-14T21:44:08.9467872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9467952Z return func(*args, **kwargs) 2025-08-14T21:44:08.9468205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 349, in forward 2025-08-14T21:44:08.9468290Z attn_output = self.c_proj(attn_output) 2025-08-14T21:44:08.9468520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9468641Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9468644Z 2025-08-14T21:44:08.9468751Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9468964Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9469031Z return mod(**inputs) 2025-08-14T21:44:08.9469297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9469388Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9469649Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9469726Z outputs = block( 2025-08-14T21:44:08.9469959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9470053Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9470311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9470383Z return func(*args, **kwargs) 2025-08-14T21:44:08.9470654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9470775Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9471102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.9471196Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.9471424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9471552Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9471555Z 2025-08-14T21:44:08.9471662Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:08.9471869Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:08.9471948Z return mod(**inputs) 2025-08-14T21:44:08.9472210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward 2025-08-14T21:44:08.9472297Z transformer_outputs = self.transformer( 2025-08-14T21:44:08.9472562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward 2025-08-14T21:44:08.9472636Z outputs = block( 2025-08-14T21:44:08.9472873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:08.9472955Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:08.9473206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:44:08.9473286Z return func(*args, **kwargs) 2025-08-14T21:44:08.9473541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward 2025-08-14T21:44:08.9473657Z feed_forward_hidden_states = self.mlp(hidden_states) 2025-08-14T21:44:08.9473914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 365, in forward 2025-08-14T21:44:08.9473997Z hidden_states = self.c_fc(hidden_states) 2025-08-14T21:44:08.9474239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward 2025-08-14T21:44:08.9474358Z x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight) 2025-08-14T21:44:08.9474361Z 2025-08-14T21:44:08.9474468Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:44:08.9474685Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9474754Z     return mod(**inputs)
2025-08-14T21:44:08.9475028Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9475115Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9475378Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9475452Z     outputs = block(
2025-08-14T21:44:08.9475689Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9475785Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9476044Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9476116Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9476385Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9476493Z     feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9476756Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 366, in forward
2025-08-14T21:44:08.9476849Z     hidden_states = self.act(hidden_states)
2025-08-14T21:44:08.9477076Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:08.9477300Z     return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:08.9477352Z 
2025-08-14T21:44:08.9477464Z cudagraph partition due to non gpu ops.
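The activations.py frame above is the tanh approximation of GELU. A small sketch, assuming a reasonably recent PyTorch, checking that formula against the built-in approximate="tanh" path:

# Illustrative sketch: the tanh-approximate GELU computed at activations.py line 47,
# compared with PyTorch's own approximation.
import math
import torch
import torch.nn.functional as F

def gelu_tanh(x):
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))

x = torch.randn(4, 16)
print(torch.allclose(gelu_tanh(x), F.gelu(x, approximate="tanh"), atol=1e-6))  # expected: True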
Found from :
2025-08-14T21:44:08.9477677Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9477756Z     return mod(**inputs)
2025-08-14T21:44:08.9478025Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9478130Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9478385Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9478450Z     outputs = block(
2025-08-14T21:44:08.9478687Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9478771Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9479026Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9479108Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9479362Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9479476Z     feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9479732Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9479824Z     hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9480061Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9480183Z     x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9480187Z 
2025-08-14T21:44:08.9480295Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9480523Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9480593Z     return mod(**inputs)
2025-08-14T21:44:08.9480868Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1480, in forward
2025-08-14T21:44:08.9480956Z     transformer_outputs = self.transformer(
2025-08-14T21:44:08.9481219Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 917, in forward
2025-08-14T21:44:08.9481294Z     outputs = block(
2025-08-14T21:44:08.9481528Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:08.9481619Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:08.9481878Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:08.9481956Z     return func(*args, **kwargs)
2025-08-14T21:44:08.9482229Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 440, in forward
2025-08-14T21:44:08.9482338Z     feed_forward_hidden_states = self.mlp(hidden_states)
2025-08-14T21:44:08.9482598Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 367, in forward
2025-08-14T21:44:08.9482698Z     hidden_states = self.c_proj(hidden_states)
2025-08-14T21:44:08.9482931Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 116, in forward
2025-08-14T21:44:08.9483060Z     x = torch.addmm(self.bias, x.view(-1, x.size(-1)), self.weight)
2025-08-14T21:44:08.9483064Z 
2025-08-14T21:44:08.9483151Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9483236Z cudagraph partition due to non gpu ops
2025-08-14T21:44:08.9483353Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:08.9483599Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9483719Z     return mod(**inputs)
2025-08-14T21:44:08.9483995Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1537, in forward
2025-08-14T21:44:08.9484154Z     loss = loss_fct(pooled_logits.view(-1, self.num_labels), labels.view(-1))
2025-08-14T21:44:08.9484158Z 
2025-08-14T21:44:08.9484273Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:08.9484486Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:08.9484555Z     return mod(**inputs)
2025-08-14T21:44:08.9484829Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1537, in forward
2025-08-14T21:44:08.9484981Z     loss = loss_fct(pooled_logits.view(-1, self.num_labels), labels.view(-1))
2025-08-14T21:44:08.9484986Z 
2025-08-14T21:44:11.2366836Z Compilation time (from dynamo_timed): 27.963237731
2025-08-14T21:44:11.2367452Z pass
2025-08-14T21:44:11.2368548Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:11.2369731Z TIMING: _recursive_pre_grad_passes:0.09451 _recursive_joint_graph_passes:0.86963 _recursive_post_grad_passes:0.17749 async_compile.wait:0.91982 code_gen:12.10816 inductor_compile:14.98184 backend_compile:23.55148 gc:0.00072 entire_frame_compile:27.96324 total_wall_time:27.96324
2025-08-14T21:44:11.2370868Z STATS: call_* op count: 1142 | FakeTensorMode.__torch_dispatch__:45825 | FakeTensor.__torch_dispatch__:8203 | ProxyTorchDispatchMode.__torch_dispatch__:10247
2025-08-14T21:44:11.2371424Z Dynamo produced 2 graphs covering 1142 ops with 0 graph breaks (0 unique)
2025-08-14T21:44:17.3045594Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:44:17.3046587Z   from pkg_resources import resource_filename
2025-08-14T21:44:17.9386865Z 
2025-08-14T21:44:19.0652475Z loading model: 0it [00:00, ?it/s]WARNING:common:Model GoogleFnet supports float32 only
2025-08-14T21:44:19.2185530Z 
2025-08-14T21:44:19.2186341Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:44:19.2186867Z WARNING:common:Model GoogleFnet supports float32 only
2025-08-14T21:44:19.2206614Z cpu eval GoogleFnet
2025-08-14T21:44:19.7013868Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:19.7022403Z WARNING:common:Model GoogleFnet supports float32 only
2025-08-14T21:44:19.8686252Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:20.0389242Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:26.9013400Z cudagraph partition due to non gpu ops.
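For context on the repeated "cudagraph partition due to non gpu ops" messages: CUDA graphs can only capture GPU kernels, so Inductor partitions or skips cudagraph capture around ops that do not run on the GPU, and on a CPU-only benchmark shard like this one that applies to essentially every op. A minimal sketch of the setting under which that code path is exercised; the toy model below is hypothetical, not part of the benchmark suite:

# Minimal sketch, under the stated assumption: "reduce-overhead" asks torch.compile to use
# CUDA graphs where possible, but CUDA graphs only capture GPU kernels, so regions made of
# CPU ops cannot take that path and get partitioned around instead.
import torch

class TinyMLP(torch.nn.Module):  # hypothetical toy model
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(16, 16)

    def forward(self, x):
        return torch.nn.functional.gelu(self.fc(x), approximate="tanh")

model = TinyMLP()
compiled = torch.compile(model, mode="reduce-overhead")
out = compiled(torch.randn(8, 16))  # CPU tensors: correctness unaffected, cudagraphs inapplicable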
Found from :
2025-08-14T21:44:26.9014025Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9014421Z     return mod(**inputs)
2025-08-14T21:44:26.9014843Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9015272Z     outputs = self.fnet(
2025-08-14T21:44:26.9015657Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward
2025-08-14T21:44:26.9016080Z     encoder_outputs = self.encoder(
2025-08-14T21:44:26.9016517Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward
2025-08-14T21:44:26.9016950Z     layer_outputs = layer_module(hidden_states)
2025-08-14T21:44:26.9017712Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:26.9018367Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:26.9018790Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward
2025-08-14T21:44:26.9019250Z     self_fourier_outputs = self.fourier(hidden_states)
2025-08-14T21:44:26.9019697Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward
2025-08-14T21:44:26.9020144Z     self_outputs = self.self(hidden_states)
2025-08-14T21:44:26.9020562Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward
2025-08-14T21:44:26.9021017Z     outputs = self.fourier_transform(hidden_states).real
2025-08-14T21:44:26.9021187Z 
2025-08-14T21:44:26.9021303Z cudagraph partition due to non gpu ops.
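The fourier_transform(hidden_states).real frame above is FNet's FFT-based token mixing. A sketch of the idea, assuming the usual FNet choice of transforming over the sequence and hidden dimensions and keeping only the real part; treat the dim choice as an assumption rather than the exact code path:

# Illustrative sketch of FNet-style Fourier mixing, as pointed at by the traceback above.
import torch

def fourier_mix(hidden_states):
    # hidden_states: (batch, seq_len, hidden); 2-D FFT over seq and hidden, real part kept
    return torch.fft.fftn(hidden_states, dim=(1, 2)).real

x = torch.randn(2, 128, 768)
mixed = fourier_mix(x)   # same shape, real-valued
print(mixed.shape)       # torch.Size([2, 128, 768])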
Found from :
2025-08-14T21:44:26.9092559Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9092917Z     return mod(**inputs)
2025-08-14T21:44:26.9093303Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9093768Z     outputs = self.fnet(
2025-08-14T21:44:26.9094159Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward
2025-08-14T21:44:26.9094604Z     encoder_outputs = self.encoder(
2025-08-14T21:44:26.9095010Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward
2025-08-14T21:44:26.9095442Z     layer_outputs = layer_module(hidden_states)
2025-08-14T21:44:26.9095920Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:26.9096294Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:26.9096714Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward
2025-08-14T21:44:26.9097151Z     self_fourier_outputs = self.fourier(hidden_states)
2025-08-14T21:44:26.9097569Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward
2025-08-14T21:44:26.9097996Z     self_outputs = self.self(hidden_states)
2025-08-14T21:44:26.9098420Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward
2025-08-14T21:44:26.9098871Z     outputs = self.fourier_transform(hidden_states).real
2025-08-14T21:44:26.9099031Z 
2025-08-14T21:44:26.9099127Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9099379Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:26.9099759Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9100097Z     return mod(**inputs)
2025-08-14T21:44:26.9100481Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9100885Z     outputs = self.fnet(
2025-08-14T21:44:26.9101306Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 512, in forward
2025-08-14T21:44:26.9101732Z     embedding_output = self.embeddings(
2025-08-14T21:44:26.9102127Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 142, in forward
2025-08-14T21:44:26.9102541Z     embeddings = self.projection(embeddings)
2025-08-14T21:44:26.9102994Z 
2025-08-14T21:44:26.9103094Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9103361Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:44:26.9131370Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9131719Z     return mod(**inputs)
2025-08-14T21:44:26.9132089Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9132500Z     outputs = self.fnet(
2025-08-14T21:44:26.9132958Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward
2025-08-14T21:44:26.9133391Z     encoder_outputs = self.encoder(
2025-08-14T21:44:26.9133800Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward
2025-08-14T21:44:26.9134254Z     layer_outputs = layer_module(hidden_states)
2025-08-14T21:44:26.9134652Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:26.9135027Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:26.9135442Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward
2025-08-14T21:44:26.9135877Z     layer_output = apply_chunking_to_forward(
2025-08-14T21:44:26.9136304Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward
2025-08-14T21:44:26.9136733Z     return forward_fn(*input_tensors)
2025-08-14T21:44:26.9137182Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk
2025-08-14T21:44:26.9137693Z     intermediate_output = self.intermediate(fourier_output)
2025-08-14T21:44:26.9138140Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward
2025-08-14T21:44:26.9138576Z     hidden_states = self.intermediate_act_fn(hidden_states)
2025-08-14T21:44:26.9138980Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:26.9139459Z     return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:26.9139701Z 
2025-08-14T21:44:26.9139795Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9140017Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9140271Z cudagraph partition due to non gpu ops.
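The traceback above also passes through transformers' apply_chunking_to_forward, which slices its inputs along one dimension, applies the feed-forward function to each slice, and concatenates the results. A small sketch with made-up sizes; the helper is the one named in the traceback, everything else is illustrative:

# Illustrative sketch of chunked feed-forward as seen in the traceback above.
# Chunk size, dimension, and shapes are assumptions for the example only.
import torch
from transformers.pytorch_utils import apply_chunking_to_forward

linear = torch.nn.Linear(64, 64)

def feed_forward_chunk(hidden_states):
    return torch.nn.functional.gelu(linear(hidden_states), approximate="tanh")

hidden = torch.randn(2, 128, 64)                   # (batch, seq, hidden)
full = feed_forward_chunk(hidden)                  # unchunked reference
chunked = apply_chunking_to_forward(feed_forward_chunk, 32, 1, hidden)  # seq dim in slices of 32
print(torch.allclose(full, chunked, atol=1e-6))    # expected: True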
Found from : 2025-08-14T21:44:26.9177365Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9177722Z return mod(**inputs) 2025-08-14T21:44:26.9178113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9178531Z outputs = self.fnet( 2025-08-14T21:44:26.9178925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9179353Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9179762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9180202Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9180604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9180998Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9181417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9181867Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9182315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9182729Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9183157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9183605Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9183770Z 2025-08-14T21:44:26.9183890Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9184285Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9184638Z return mod(**inputs) 2025-08-14T21:44:26.9185022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9185437Z outputs = self.fnet( 2025-08-14T21:44:26.9185810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9186372Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9186780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9187199Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9187583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9187963Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9188369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9188795Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9189216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9189623Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9190024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9190454Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9190623Z 2025-08-14T21:44:26.9190733Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9191114Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9191447Z return mod(**inputs) 2025-08-14T21:44:26.9191816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9192219Z outputs = self.fnet( 2025-08-14T21:44:26.9192595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9193010Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9193421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9193847Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9194239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9194628Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9195027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9195449Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9195861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9196271Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9196675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9197098Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9197274Z 2025-08-14T21:44:26.9197387Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9197763Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9198107Z return mod(**inputs) 2025-08-14T21:44:26.9198471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9198872Z outputs = self.fnet( 2025-08-14T21:44:26.9199252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9199670Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9200066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9200489Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9200961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9201335Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9201739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9202176Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9202743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9203173Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9203594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9204039Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9204205Z 2025-08-14T21:44:26.9204303Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9204563Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:44:26.9204958Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9205315Z     return mod(**inputs)
2025-08-14T21:44:26.9205692Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9206113Z     outputs = self.fnet(
2025-08-14T21:44:26.9206496Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward
2025-08-14T21:44:26.9206922Z     encoder_outputs = self.encoder(
2025-08-14T21:44:26.9207319Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward
2025-08-14T21:44:26.9207756Z     layer_outputs = layer_module(hidden_states)
2025-08-14T21:44:26.9208152Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:26.9208546Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:26.9209022Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward
2025-08-14T21:44:26.9209467Z     layer_output = apply_chunking_to_forward(
2025-08-14T21:44:26.9209906Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward
2025-08-14T21:44:26.9210331Z     return forward_fn(*input_tensors)
2025-08-14T21:44:26.9210773Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk
2025-08-14T21:44:26.9211270Z     intermediate_output = self.intermediate(fourier_output)
2025-08-14T21:44:26.9211718Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward
2025-08-14T21:44:26.9212178Z     hidden_states = self.intermediate_act_fn(hidden_states)
2025-08-14T21:44:26.9212600Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:26.9213087Z     return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:26.9213333Z
2025-08-14T21:44:26.9213424Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9213657Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9213916Z cudagraph partition due to non gpu ops.
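The second kind of trace ends in the tanh approximation of GELU in transformers/activations.py (line 47); the formula is printed verbatim in the log. For reference, a self-contained version of that expression is sketched below and checked against PyTorch's built-in tanh-approximate GELU; wrapping it in a plain function rather than the transformers activation class is my choice for the example.

    import math
    import torch

    def tanh_gelu(input: torch.Tensor) -> torch.Tensor:
        # Same expression as the activations.py frame in the trace above.
        return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))

    if __name__ == "__main__":
        x = torch.linspace(-3.0, 3.0, steps=7)
        # Should match torch's built-in tanh-approximate GELU to float precision.
        print(torch.allclose(tanh_gelu(x), torch.nn.functional.gelu(x, approximate="tanh"), atol=1e-6))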
Found from : 2025-08-14T21:44:26.9214306Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9214653Z return mod(**inputs) 2025-08-14T21:44:26.9215041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9215448Z outputs = self.fnet( 2025-08-14T21:44:26.9215822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9216406Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9216830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9217325Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9217713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9218107Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9218526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9218971Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9219416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9219842Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9220313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9220764Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9220942Z 2025-08-14T21:44:26.9221056Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9221448Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9221798Z return mod(**inputs) 2025-08-14T21:44:26.9222170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9222578Z outputs = self.fnet( 2025-08-14T21:44:26.9222960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9223374Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9223794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9224228Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9224624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9225007Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9225432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9225868Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9226279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9226745Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9227157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9227600Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9227765Z 2025-08-14T21:44:26.9227878Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9228269Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9228617Z return mod(**inputs) 2025-08-14T21:44:26.9228998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9229406Z outputs = self.fnet( 2025-08-14T21:44:26.9229780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9230188Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9230574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9231030Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9231492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9231871Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9232262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9232688Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9233105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9233518Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9233922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9234353Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9234514Z 2025-08-14T21:44:26.9234634Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9235012Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9235359Z return mod(**inputs) 2025-08-14T21:44:26.9235735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9236132Z outputs = self.fnet( 2025-08-14T21:44:26.9236493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9236902Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9237309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9237728Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9238125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9238522Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9238942Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9239363Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9239782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9240190Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9240583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9241010Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9241179Z 2025-08-14T21:44:26.9241265Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9241519Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9241892Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9242243Z return mod(**inputs) 2025-08-14T21:44:26.9242615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9243011Z outputs = self.fnet( 2025-08-14T21:44:26.9243384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9243804Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9244199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9244609Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9245000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9245392Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9245866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward 2025-08-14T21:44:26.9246320Z layer_output = apply_chunking_to_forward( 2025-08-14T21:44:26.9246759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward 2025-08-14T21:44:26.9247190Z return forward_fn(*input_tensors) 2025-08-14T21:44:26.9247628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk 2025-08-14T21:44:26.9248118Z intermediate_output = self.intermediate(fourier_output) 2025-08-14T21:44:26.9248577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward 2025-08-14T21:44:26.9249135Z hidden_states = self.intermediate_act_fn(hidden_states) 2025-08-14T21:44:26.9249550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:26.9250049Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:26.9250309Z 2025-08-14T21:44:26.9250399Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9250637Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9250890Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9251284Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9251638Z return mod(**inputs) 2025-08-14T21:44:26.9252024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9252448Z outputs = self.fnet( 2025-08-14T21:44:26.9252834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9253307Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9253716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9254145Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9254548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9254935Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9255352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9255804Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9256247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9256671Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9257091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9257546Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9257717Z 2025-08-14T21:44:26.9257838Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9258221Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9258576Z return mod(**inputs) 2025-08-14T21:44:26.9258961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9259367Z outputs = self.fnet( 2025-08-14T21:44:26.9259750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9260164Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9260570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9261026Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9261483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9261878Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9262285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9262727Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9263161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9263582Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9263989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9264432Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9264653Z 2025-08-14T21:44:26.9264821Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9265217Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9265563Z return mod(**inputs) 2025-08-14T21:44:26.9265946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9266353Z outputs = self.fnet( 2025-08-14T21:44:26.9266728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9267141Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9267545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9267974Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9268358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9268751Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9269164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9269595Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9270023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9270448Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9270859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9271303Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9271476Z 2025-08-14T21:44:26.9271587Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9271975Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9272332Z return mod(**inputs) 2025-08-14T21:44:26.9272711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9273118Z outputs = self.fnet( 2025-08-14T21:44:26.9273500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9273917Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9274321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9274759Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9275166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9275539Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9275944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9276458Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9276880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9277281Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9277682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9278123Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9278300Z 2025-08-14T21:44:26.9278385Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9278639Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9279016Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9279358Z return mod(**inputs) 2025-08-14T21:44:26.9279726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9280126Z outputs = self.fnet( 2025-08-14T21:44:26.9280495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9280893Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9281287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9281702Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9282083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9282453Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9282853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward 2025-08-14T21:44:26.9283266Z layer_output = apply_chunking_to_forward( 2025-08-14T21:44:26.9283686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward 2025-08-14T21:44:26.9284101Z return forward_fn(*input_tensors) 2025-08-14T21:44:26.9284528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk 2025-08-14T21:44:26.9284999Z intermediate_output = self.intermediate(fourier_output) 2025-08-14T21:44:26.9285430Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward 2025-08-14T21:44:26.9285872Z hidden_states = self.intermediate_act_fn(hidden_states) 2025-08-14T21:44:26.9286269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:26.9286741Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:26.9286989Z 2025-08-14T21:44:26.9287078Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9287312Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9287570Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9287951Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9288304Z return mod(**inputs) 2025-08-14T21:44:26.9288766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9289192Z outputs = self.fnet( 2025-08-14T21:44:26.9289578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9289999Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9290416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9290878Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9291423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9291822Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9292240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9292682Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9293118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9293553Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9293972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9294421Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9294601Z 2025-08-14T21:44:26.9294719Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9295118Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9295468Z return mod(**inputs) 2025-08-14T21:44:26.9295854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9296268Z outputs = self.fnet( 2025-08-14T21:44:26.9296648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9297069Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9297473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9297913Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9298304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9298701Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9299119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9299559Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9299983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9300408Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9300825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9301270Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9301437Z 2025-08-14T21:44:26.9301550Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9301945Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9302301Z return mod(**inputs) 2025-08-14T21:44:26.9302872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9303296Z outputs = self.fnet( 2025-08-14T21:44:26.9303683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9304107Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9304512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9304950Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9305355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9305755Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9306156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9306738Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9307159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9307563Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9307970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9308399Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9308562Z 2025-08-14T21:44:26.9308679Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9309053Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9309396Z return mod(**inputs) 2025-08-14T21:44:26.9309769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9310161Z outputs = self.fnet( 2025-08-14T21:44:26.9310534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9310936Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9311328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9311734Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9312125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9312518Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9312933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9313361Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9313788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9314198Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9314595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9315019Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9315189Z 2025-08-14T21:44:26.9315273Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9315526Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9315895Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9316245Z return mod(**inputs) 2025-08-14T21:44:26.9316625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9317023Z outputs = self.fnet( 2025-08-14T21:44:26.9317417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9317818Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9318212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9318616Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9319001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9319386Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9319790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward 2025-08-14T21:44:26.9320216Z layer_output = apply_chunking_to_forward( 2025-08-14T21:44:26.9320651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward 2025-08-14T21:44:26.9321121Z return forward_fn(*input_tensors) 2025-08-14T21:44:26.9321589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk 2025-08-14T21:44:26.9322087Z intermediate_output = self.intermediate(fourier_output) 2025-08-14T21:44:26.9322526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward 2025-08-14T21:44:26.9322964Z hidden_states = self.intermediate_act_fn(hidden_states) 2025-08-14T21:44:26.9323364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:26.9323852Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:26.9324101Z 2025-08-14T21:44:26.9324198Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9324434Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9324690Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9325091Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9325434Z return mod(**inputs) 2025-08-14T21:44:26.9325814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9326224Z outputs = self.fnet( 2025-08-14T21:44:26.9326605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9327011Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9327418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9327849Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9328243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9328629Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9329121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9329564Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9329997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9330421Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9330837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9331294Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9331465Z 2025-08-14T21:44:26.9331580Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9331974Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9332330Z return mod(**inputs) 2025-08-14T21:44:26.9332721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9333122Z outputs = self.fnet( 2025-08-14T21:44:26.9333505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9333919Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9334317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9334764Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9335159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9335548Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9335953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9336478Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9336918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9337350Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9337763Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9338209Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9338377Z 2025-08-14T21:44:26.9338500Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9338886Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9339242Z return mod(**inputs) 2025-08-14T21:44:26.9339631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9340043Z outputs = self.fnet( 2025-08-14T21:44:26.9340421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9340840Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9341232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9341653Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9342052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9342430Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9342837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9343256Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9343684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9344095Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9344501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9344923Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9345093Z 2025-08-14T21:44:26.9345203Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9345582Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9345917Z return mod(**inputs) 2025-08-14T21:44:26.9346291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9346689Z outputs = self.fnet( 2025-08-14T21:44:26.9347061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9347463Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9347857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9348273Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9348661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9349038Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9349445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9349875Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9350292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9350705Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9351213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9351645Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9351806Z 2025-08-14T21:44:26.9351893Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9352147Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9352522Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9352855Z return mod(**inputs) 2025-08-14T21:44:26.9355961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9356411Z outputs = self.fnet( 2025-08-14T21:44:26.9356792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9357211Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9357628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9358056Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9358447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9358841Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9359256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward 2025-08-14T21:44:26.9359689Z layer_output = apply_chunking_to_forward( 2025-08-14T21:44:26.9360179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward 2025-08-14T21:44:26.9360611Z return forward_fn(*input_tensors) 2025-08-14T21:44:26.9361048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk 2025-08-14T21:44:26.9361545Z intermediate_output = self.intermediate(fourier_output) 2025-08-14T21:44:26.9362008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward 2025-08-14T21:44:26.9362470Z hidden_states = self.intermediate_act_fn(hidden_states) 2025-08-14T21:44:26.9362877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:26.9363370Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:26.9363617Z 2025-08-14T21:44:26.9363715Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9363943Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9364202Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9364605Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9364975Z return mod(**inputs) 2025-08-14T21:44:26.9365359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9365764Z outputs = self.fnet( 2025-08-14T21:44:26.9366150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9366567Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9366981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9367412Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9367806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9368192Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9368613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9369243Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9369694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9370117Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9370550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9371003Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9371174Z 2025-08-14T21:44:26.9371286Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9371750Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9372107Z return mod(**inputs) 2025-08-14T21:44:26.9372492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9372897Z outputs = self.fnet( 2025-08-14T21:44:26.9373279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9373693Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9374090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9374530Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9374928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9375320Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9375729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9376173Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9376612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9377041Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9377443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9377878Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9378042Z 2025-08-14T21:44:26.9378163Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9378550Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9378908Z return mod(**inputs) 2025-08-14T21:44:26.9379298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9379707Z outputs = self.fnet( 2025-08-14T21:44:26.9380083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9380507Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9380919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9381352Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9381755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9382150Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9382563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9383001Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9383436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9383882Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9384338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9384781Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9384957Z 2025-08-14T21:44:26.9385070Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9385466Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9385815Z return mod(**inputs) 2025-08-14T21:44:26.9386201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9386613Z outputs = self.fnet( 2025-08-14T21:44:26.9387040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9387445Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9387854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9388280Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9388668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9389059Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9389470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9389910Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9390336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9390755Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9391173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9391617Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9391786Z 2025-08-14T21:44:26.9391872Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9392131Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9392521Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9392865Z return mod(**inputs) 2025-08-14T21:44:26.9393249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9393664Z outputs = self.fnet( 2025-08-14T21:44:26.9394046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9394450Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9394851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9395278Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9395669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9396062Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9396475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward 2025-08-14T21:44:26.9396910Z layer_output = apply_chunking_to_forward( 2025-08-14T21:44:26.9397339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward 2025-08-14T21:44:26.9397768Z return forward_fn(*input_tensors) 2025-08-14T21:44:26.9398213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk 2025-08-14T21:44:26.9398702Z intermediate_output = self.intermediate(fourier_output) 2025-08-14T21:44:26.9399210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward 2025-08-14T21:44:26.9399666Z hidden_states = self.intermediate_act_fn(hidden_states) 2025-08-14T21:44:26.9400077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward 2025-08-14T21:44:26.9400557Z return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0)))) 2025-08-14T21:44:26.9400817Z 2025-08-14T21:44:26.9400907Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9401143Z cudagraph partition due to non gpu ops 2025-08-14T21:44:26.9401405Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9401831Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9402192Z return mod(**inputs) 2025-08-14T21:44:26.9402581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9403248Z outputs = self.fnet( 2025-08-14T21:44:26.9403639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9404059Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9404475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9404905Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9405306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9405711Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9406135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9406579Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9407030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9407461Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9407882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9408338Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9408515Z 2025-08-14T21:44:26.9408630Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:44:26.9409092Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:44:26.9409442Z return mod(**inputs) 2025-08-14T21:44:26.9409832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward 2025-08-14T21:44:26.9410239Z outputs = self.fnet( 2025-08-14T21:44:26.9410620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward 2025-08-14T21:44:26.9411051Z encoder_outputs = self.encoder( 2025-08-14T21:44:26.9411467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward 2025-08-14T21:44:26.9411907Z layer_outputs = layer_module(hidden_states) 2025-08-14T21:44:26.9412317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:44:26.9412700Z return super().__call__(*args, **kwargs) 2025-08-14T21:44:26.9413103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 249, in forward 2025-08-14T21:44:26.9413538Z self_fourier_outputs = self.fourier(hidden_states) 2025-08-14T21:44:26.9413959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 202, in forward 2025-08-14T21:44:26.9414445Z self_outputs = self.self(hidden_states) 2025-08-14T21:44:26.9414915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 181, in forward 2025-08-14T21:44:26.9415354Z outputs = self.fourier_transform(hidden_states).real 2025-08-14T21:44:26.9415516Z 2025-08-14T21:44:26.9415626Z cudagraph partition due to non gpu ops. 
2025-08-14T21:44:26.9429135Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:26.9429530Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9429879Z     return mod(**inputs)
2025-08-14T21:44:26.9430258Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 671, in forward
2025-08-14T21:44:26.9430652Z     outputs = self.fnet(
2025-08-14T21:44:26.9430910Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 518, in forward
2025-08-14T21:44:26.9430986Z     encoder_outputs = self.encoder(
2025-08-14T21:44:26.9431267Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 280, in forward
2025-08-14T21:44:26.9431360Z     layer_outputs = layer_module(hidden_states)
2025-08-14T21:44:26.9431590Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:44:26.9431689Z     return super().__call__(*args, **kwargs)
2025-08-14T21:44:26.9431943Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 252, in forward
2025-08-14T21:44:26.9432042Z     layer_output = apply_chunking_to_forward(
2025-08-14T21:44:26.9432317Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/pytorch_utils.py", line 251, in apply_chunking_to_forward
2025-08-14T21:44:26.9432401Z     return forward_fn(*input_tensors)
2025-08-14T21:44:26.9432696Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 261, in feed_forward_chunk
2025-08-14T21:44:26.9432819Z     intermediate_output = self.intermediate(fourier_output)
2025-08-14T21:44:26.9433083Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 220, in forward
2025-08-14T21:44:26.9433200Z     hidden_states = self.intermediate_act_fn(hidden_states)
2025-08-14T21:44:26.9433426Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/activations.py", line 47, in forward
2025-08-14T21:44:26.9433621Z     return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
2025-08-14T21:44:26.9433709Z cudagraph partition due to non gpu ops
2025-08-14T21:44:26.9433792Z cudagraph partition due to non gpu ops
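The last frame above is the tanh approximation of GELU used by the transformers activation classes: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3))). A quick sanity check that the expression from the traceback matches PyTorch's built-in approximate GELU (an illustrative sketch; the tensor contents are arbitrary):

import math
import torch
import torch.nn.functional as F

x = torch.randn(1024)

# The exact expression from activations.py, line 47 in the traceback above.
manual = 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * torch.pow(x, 3.0))))

# PyTorch's tanh-approximated GELU computes the same formula.
builtin = F.gelu(x, approximate="tanh")
print(torch.allclose(manual, builtin, atol=1e-6))  # expected: True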
2025-08-14T21:44:26.9480978Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:26.9481189Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:26.9481259Z     return mod(**inputs)
2025-08-14T21:44:26.9481524Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/fnet/modeling_fnet.py", line 686, in forward
2025-08-14T21:44:26.9481724Z     masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:44:36.0038425Z Compilation time (from dynamo_timed): 14.580806089
2025-08-14T21:44:36.0101405Z pass
2025-08-14T21:44:36.0104975Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:36.0105912Z TIMING: _recursive_pre_grad_passes:0.02592 _recursive_joint_graph_passes:0.222 _recursive_post_grad_passes:0.08031 async_compile.wait:0.7924 code_gen:8.66589 inductor_compile:10.15959 backend_compile:12.70155 gc:0.00159 entire_frame_compile:14.58081 total_wall_time:14.58081
2025-08-14T21:44:36.0106926Z STATS: call_* op count: 232 | FakeTensorMode.__torch_dispatch__:14364 | FakeTensor.__torch_dispatch__:3342 | ProxyTorchDispatchMode.__torch_dispatch__:2923
2025-08-14T21:44:36.0107470Z Dynamo produced 1 graphs covering 232 ops with 0 graph breaks (0 unique)
2025-08-14T21:44:41.7225622Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:44:41.7226827Z   from pkg_resources import resource_filename
2025-08-14T21:44:43.8929708Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:44:43.8930051Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:44:43.8930341Z cpu eval LayoutLMForMaskedLM
2025-08-14T21:44:44.4738709Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:44.6205740Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:44:44.7658561Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
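The FNet traceback at the top of this block ends in the usual masked-LM loss: the (batch, seq_len, vocab_size) logits and the label ids are flattened so a token-level cross-entropy can be taken in one call. A minimal sketch with made-up sizes (in the Hugging Face models loss_fct is a CrossEntropyLoss; everything else here is illustrative):

import torch

batch, seq_len, vocab_size = 2, 8, 32000      # hypothetical sizes
prediction_scores = torch.randn(batch, seq_len, vocab_size)
labels = torch.randint(0, vocab_size, (batch, seq_len))

# CrossEntropyLoss ignores label -100 by default, which the models rely on
# for positions that should not contribute to the masked-LM loss.
loss_fct = torch.nn.CrossEntropyLoss()
masked_lm_loss = loss_fct(
    prediction_scores.view(-1, vocab_size),   # (batch * seq_len, vocab_size)
    labels.view(-1),                          # (batch * seq_len,)
)
print(masked_lm_loss.item())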
2025-08-14T21:44:56.8508984Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:44:56.8509373Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:44:56.8509717Z     return mod(**inputs)
2025-08-14T21:44:56.8510119Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:56.8510511Z     return func(*args, **kwargs)
2025-08-14T21:44:56.8510874Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-08-14T21:44:56.8511242Z     return func(*args, **kwargs)
2025-08-14T21:44:56.8511571Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:44:56.8511932Z     output = func(self, *args, **kwargs)
2025-08-14T21:44:56.8512350Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 776, in forward
2025-08-14T21:44:56.8512762Z     masked_lm_loss = loss_fct(
2025-08-14T21:45:05.8467985Z Compilation time (from dynamo_timed): 19.71984193
2025-08-14T21:45:05.8501069Z pass
2025-08-14T21:45:05.8504961Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:45:05.8505845Z TIMING: _recursive_pre_grad_passes:0.04578 _recursive_joint_graph_passes:0.5296 _recursive_post_grad_passes:0.44558 async_compile.wait:0.69054 code_gen:8.25155 inductor_compile:10.24244 backend_compile:16.196 gc:0.00024 entire_frame_compile:19.71984 total_wall_time:19.71984
2025-08-14T21:45:05.8506960Z STATS: call_* op count: 434 | FakeTensorMode.__torch_dispatch__:30268 | FakeTensor.__torch_dispatch__:3356 | ProxyTorchDispatchMode.__torch_dispatch__:8633
2025-08-14T21:45:05.8507522Z Dynamo produced 1 graphs covering 434 ops with 0 graph breaks (0 unique)
2025-08-14T21:45:11.9391274Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:45:11.9392348Z   from pkg_resources import resource_filename
2025-08-14T21:45:14.0476467Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:45:14.0476876Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:45:14.0477173Z cpu eval LayoutLMForSequenceClassification
2025-08-14T21:45:14.6868705Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:45:14.8367981Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:45:14.9786848Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
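The TIMING: lines above are flat lists of phase:seconds pairs emitted by the benchmark harness. A small helper to turn such a line into a dict for comparing runs, written against the literal log format above rather than any official parser:

def parse_timing(line: str) -> dict:
    """Turn a 'TIMING: phase:seconds ...' log line into {phase: seconds}."""
    _, _, payload = line.partition("TIMING:")
    return {k: float(v) for k, v in (item.rsplit(":", 1) for item in payload.split())}

timing = parse_timing(
    "TIMING: _recursive_pre_grad_passes:0.04578 inductor_compile:10.24244 "
    "backend_compile:16.196 entire_frame_compile:19.71984 total_wall_time:19.71984"
)
print(timing["inductor_compile"], timing["total_wall_time"])  # 10.24244 19.71984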
2025-08-14T21:45:27.4040820Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:45:27.4044126Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:45:27.4044753Z     return mod(**inputs)
2025-08-14T21:45:27.4045303Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:45:27.4045746Z     output = func(self, *args, **kwargs)
2025-08-14T21:45:27.4046384Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 891, in forward
2025-08-14T21:45:27.4047034Z     logits = self.classifier(pooled_output)
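The long runs of "cudagraph partition due to non gpu ops" collapsed in this section appear to come from Inductor's CUDA-graph partitioning: regions whose ops do not run on the GPU are split out of the graph that would be captured, and on a CPU-only job like this one that applies to every region, so the message repeats for each compiled graph. A minimal way to exercise that code path (an assumption-laden sketch: the compile mode and the exact logging behavior are inferred, not taken from this log):

import torch

class TinyMLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(16, 16)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = TinyMLP()                                        # stays on CPU, like the cpu_* configs in this job
compiled = torch.compile(model, mode="reduce-overhead")  # mode that requests CUDA graphs from Inductor
x = torch.randn(4, 16)                                   # CPU tensors, i.e. "non gpu ops" for cudagraphs
for _ in range(3):                                       # a few calls so the compiled path actually runs
    compiled(x)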
2025-08-14T21:45:27.4089108Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 875, in forward
    outputs = self.layoutlm(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 654, in forward
    pooled_output = self.pooler(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 430, in forward
    pooled_output = self.dense(first_token_tensor)

2025-08-14T21:45:27.4096211Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 875, in forward
    outputs = self.layoutlm(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 654, in forward
    pooled_output = self.pooler(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 431, in forward
    pooled_output = self.activation(pooled_output)

2025-08-14T21:45:27.4110427Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 891, in forward
    logits = self.classifier(pooled_output)

2025-08-14T21:45:27.4115702Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 911, in forward
    loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
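Note: the tracebacks above all terminate in ordinary CPU tensor ops (the LayoutLM pooler Linear, its Tanh activation, the classifier Linear, and the loss), which is consistent with this being a cpu_inductor_amp_freezing_huggingface shard: with no CUDA tensors in the compiled graph, Inductor's cudagraph partitioning treats each such region as non-GPU and logs the message above. A minimal sketch of the same kind of CPU-only compile, assuming only that torch is installed; PoolerClassifier is a hypothetical stand-in for the pooler/classifier tail named in the traces, not the benchmark harness itself, and it is not expected to reproduce these exact log lines.

import torch
import torch.nn as nn

class PoolerClassifier(nn.Module):
    # Hypothetical stand-in for the pooler -> activation -> classifier tail
    # seen in the LayoutLM tracebacks above.
    def __init__(self, hidden_size=768, num_labels=2):
        super().__init__()
        self.dense = nn.Linear(hidden_size, hidden_size)
        self.activation = nn.Tanh()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, sequence_output):
        first_token_tensor = sequence_output[:, 0]
        pooled_output = self.activation(self.dense(first_token_tensor))
        return self.classifier(pooled_output)

model = PoolerClassifier().eval()
compiled = torch.compile(model)  # CPU-only run, as in this job's cpu_inductor_* configs
with torch.no_grad():
    logits = compiled(torch.randn(4, 16, 768))
print(logits.shape)  # torch.Size([4, 2])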
2025-08-14T21:45:48.7488496Z Compilation time (from dynamo_timed): 32.436183414
2025-08-14T21:45:48.7489196Z pass
2025-08-14T21:45:48.7489534Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:45:48.7490402Z TIMING: _recursive_pre_grad_passes:0.08841 _recursive_joint_graph_passes:0.96806 _recursive_post_grad_passes:0.52369 async_compile.wait:0.74435 code_gen:9.94436 inductor_compile:12.96787 backend_compile:24.99322 gc:0.0035 entire_frame_compile:32.43618 total_wall_time:32.43618
2025-08-14T21:45:48.7491499Z STATS: call_* op count: 864 | FakeTensorMode.__torch_dispatch__:58988 | FakeTensor.__torch_dispatch__:6419 | ProxyTorchDispatchMode.__torch_dispatch__:16959
2025-08-14T21:45:48.7492077Z Dynamo produced 2 graphs covering 864 ops with 0 graph breaks (0 unique)
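The TIMING line is a flat list of name:seconds pairs whose buckets overlap rather than add up (entire_frame_compile equals total_wall_time here, and backend_compile appears to subsume inductor_compile, which in turn covers code_gen and async_compile.wait). A small parsing sketch, assuming only the Python standard library; the string is copied verbatim from the TIMING line above.

timing_line = (
    "TIMING: _recursive_pre_grad_passes:0.08841 _recursive_joint_graph_passes:0.96806 "
    "_recursive_post_grad_passes:0.52369 async_compile.wait:0.74435 code_gen:9.94436 "
    "inductor_compile:12.96787 backend_compile:24.99322 gc:0.0035 "
    "entire_frame_compile:32.43618 total_wall_time:32.43618"
)

# Parse "name:seconds" fields, skipping the leading "TIMING:" token.
timings = dict(
    (name, float(seconds))
    for name, seconds in (field.rsplit(":", 1) for field in timing_line.split()[1:])
)

# Print the buckets from largest to smallest.
for name, seconds in sorted(timings.items(), key=lambda kv: -kv[1]):
    print(f"{name:35s} {seconds:9.3f}s")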
2025-08-14T21:45:54.9536816Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:45:54.9542054Z   from pkg_resources import resource_filename
2025-08-14T21:45:55.7209599Z 
2025-08-14T21:46:03.2587167Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:46:03.2587876Z loading model: 0it [00:07, ?it/s]
2025-08-14T21:46:03.2588177Z cpu eval M2M100ForConditionalGeneration
2025-08-14T21:46:04.1904174Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:46:04.6778423Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:46:05.1405561Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:46:30.7409478Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 844, in forward
    embed_pos = self.embed_positions(input_ids, inputs_embeds)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 148, in forward
    position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length).to(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 80, in create_position_ids_from_input_ids
    mask = input_ids.ne(padding_idx).int()
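The first M2M100 partitions point at create_position_ids_from_input_ids, whose quoted source lines are plain integer tensor ops. Reconstructed below as a standalone function so the computation can be inspected in isolation; only the mask and cumsum lines are taken from the tracebacks, while the final return line and the example padding_idx are assumptions for illustration.

import torch

def create_position_ids_from_input_ids(input_ids, padding_idx, past_key_values_length=0):
    # The two lines quoted in the tracebacks above and below.
    mask = input_ids.ne(padding_idx).int()
    incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
    # Assumed final step: shift by padding_idx so padded positions stay at padding_idx.
    return incremental_indices.long() + padding_idx

input_ids = torch.tensor([[5, 7, 9, 1, 1]])  # hypothetical batch; 1 is the padding id
print(create_position_ids_from_input_ids(input_ids, padding_idx=1))
# tensor([[2, 3, 4, 1, 1]])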
2025-08-14T21:46:30.7419580Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 844, in forward
    embed_pos = self.embed_positions(input_ids, inputs_embeds)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 148, in forward
    position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length).to(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 81, in create_position_ids_from_input_ids
    incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask

2025-08-14T21:46:30.7436504Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 378, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:46:30.7457201Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 378, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
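The remaining partitions in this model land in sdpa_attention_forward, on the scaled_dot_product_attention call (line 81) and the transpose(1, 2).contiguous() that follows it (line 91). A minimal sketch of that attention path, assuming only torch; the function below mirrors the two quoted lines but is not the transformers implementation itself, and the batch/head/sequence sizes are made up for illustration.

import torch
import torch.nn.functional as F

def sdpa_attention_forward(query, key, value, attention_mask=None):
    # Mirrors the two traceback lines above: SDPA, then transpose + contiguous
    # to go from (batch, heads, seq, head_dim) back to (batch, seq, heads, head_dim).
    attn_output = F.scaled_dot_product_attention(query, key, value, attn_mask=attention_mask)
    attn_output = attn_output.transpose(1, 2).contiguous()
    return attn_output

q = k = v = torch.randn(2, 8, 16, 64)  # (batch, heads, seq, head_dim), hypothetical sizes
out = sdpa_attention_forward(q, k, v)
print(out.shape)  # torch.Size([2, 16, 8, 64])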
Found from : 2025-08-14T21:46:30.7485667Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7486033Z return mod(**inputs) 2025-08-14T21:46:30.7486493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7486987Z outputs = self.model( 2025-08-14T21:46:30.7487427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward 2025-08-14T21:46:30.7487943Z encoder_outputs = self.encoder( 2025-08-14T21:46:30.7488433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward 2025-08-14T21:46:30.7488999Z layer_outputs = encoder_layer( 2025-08-14T21:46:30.7489536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7489925Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7490451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 393, in forward 2025-08-14T21:46:30.7490978Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.7491130Z 2025-08-14T21:46:30.7491216Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7491488Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7491762Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7491985Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7492204Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7492508Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7492730Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7492951Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7493237Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7493472Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7493718Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7494110Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7494463Z return mod(**inputs) 2025-08-14T21:46:30.7494853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7495269Z outputs = self.model( 2025-08-14T21:46:30.7495668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward 2025-08-14T21:46:30.7496081Z encoder_outputs = self.encoder( 2025-08-14T21:46:30.7496484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward 2025-08-14T21:46:30.7496906Z layer_outputs = encoder_layer( 2025-08-14T21:46:30.7497284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7497678Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7498171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 378, in forward 2025-08-14T21:46:30.7498661Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:46:30.7499130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7499569Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7500052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7500601Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7500815Z 2025-08-14T21:46:30.7500953Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7501331Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7501690Z return mod(**inputs) 2025-08-14T21:46:30.7502092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7502517Z outputs = self.model( 2025-08-14T21:46:30.7503187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward 2025-08-14T21:46:30.7503805Z encoder_outputs = self.encoder( 2025-08-14T21:46:30.7504222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward 2025-08-14T21:46:30.7504731Z layer_outputs = encoder_layer( 2025-08-14T21:46:30.7505123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7505618Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7506045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 378, in forward 2025-08-14T21:46:30.7506562Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:46:30.7506998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7507546Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7508013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.7508489Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.7508671Z 2025-08-14T21:46:30.7508760Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7508997Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7509217Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7509444Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7509669Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7509884Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7510106Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7510325Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7510577Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7510841Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7511069Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7511294Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7511575Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7511840Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7512079Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7512288Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7512579Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:46:30.7530035Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:46:30.7530385Z     return mod(**inputs)
2025-08-14T21:46:30.7530796Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
2025-08-14T21:46:30.7531232Z     outputs = self.model(
2025-08-14T21:46:30.7531635Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1261, in forward
2025-08-14T21:46:30.7532070Z     encoder_outputs = self.encoder(
2025-08-14T21:46:30.7532561Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 878, in forward
2025-08-14T21:46:30.7533045Z     layer_outputs = encoder_layer(
2025-08-14T21:46:30.7533532Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:46:30.7533942Z     return super().__call__(*args, **kwargs)
2025-08-14T21:46:30.7534366Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 393, in forward
2025-08-14T21:46:30.7534792Z     hidden_states = residual + hidden_states
2025-08-14T21:46:30.7534939Z 
2025-08-14T21:46:30.7535024Z cudagraph partition due to non gpu ops [repeated 11 times]
The same three M2M-100 encoder call sites shown above (the torch.nn.functional.scaled_dot_product_attention call, the attn_output.transpose(1, 2).contiguous() that follows it, and the residual + hidden_states add) are reported several more times between 2025-08-14T21:46:30.7538025Z and 2025-08-14T21:46:30.7701031Z, each report followed by another run of "cudagraph partition due to non gpu ops" messages.
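Taken together, these traces show where inductor's cudagraph partitioning splits the compiled graph inside each M2M-100 encoder layer: at the torch.nn.functional.scaled_dot_product_attention call, at the attn_output.transpose(1, 2).contiguous() right after it, and at the residual + hidden_states add, each time "due to non gpu ops". For orientation only, a minimal sketch of that call pattern (illustrative shapes and a generic torch.compile invocation; this is not the benchmark or transformers code) looks roughly like this:

    import torch
    import torch.nn.functional as F

    def encoder_layer_like(hidden_states, q, k, v):
        residual = hidden_states
        # corresponds to sdpa_attention.py:81 in the traces above
        attn_output = F.scaled_dot_product_attention(q, k, v)
        # corresponds to sdpa_attention.py:91 in the traces above
        attn_output = attn_output.transpose(1, 2).contiguous()
        # corresponds to modeling_m2m_100.py:393 in the traces above
        return residual + attn_output.reshape(hidden_states.shape)

    compiled = torch.compile(encoder_layer_like, mode="reduce-overhead")

    B, H, S, D = 2, 4, 16, 8
    hidden_states = torch.randn(B, S, H * D)
    q = k = v = torch.randn(B, H, S, D)
    out = compiled(hidden_states, q, k, v)

Running this sketch does not by itself reproduce the messages above; they come from the cudagraph partitioning step, which, as the wording suggests, splits the graph around ops it does not treat as GPU ops instead of capturing everything in one CUDA graph.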
Found from :
2025-08-14T21:46:30.7701440Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:46:30.7701828Z     return mod(**inputs)
2025-08-14T21:46:30.7702210Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
2025-08-14T21:46:30.7702817Z     outputs = self.model(
2025-08-14T21:46:30.7703225Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
2025-08-14T21:46:30.7703657Z     decoder_outputs = self.decoder(
2025-08-14T21:46:30.7704085Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
2025-08-14T21:46:30.7704507Z     layer_outputs = decoder_layer(
2025-08-14T21:46:30.7704884Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:46:30.7705268Z     return super().__call__(*args, **kwargs)
2025-08-14T21:46:30.7705694Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward
2025-08-14T21:46:30.7706144Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:46:30.7706599Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
2025-08-14T21:46:30.7707044Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:46:30.7707521Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:46:30.7708039Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:46:30.7708237Z 
2025-08-14T21:46:30.7708353Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:46:30.7708745Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:46:30.7709093Z     return mod(**inputs)
2025-08-14T21:46:30.7709491Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
2025-08-14T21:46:30.7709907Z     outputs = self.model(
2025-08-14T21:46:30.7710320Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
2025-08-14T21:46:30.7710736Z     decoder_outputs = self.decoder(
2025-08-14T21:46:30.7711135Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
2025-08-14T21:46:30.7711545Z     layer_outputs = decoder_layer(
2025-08-14T21:46:30.7711909Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:46:30.7712394Z     return super().__call__(*args, **kwargs)
2025-08-14T21:46:30.7712846Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward
2025-08-14T21:46:30.7713290Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:46:30.7713737Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
2025-08-14T21:46:30.7714179Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:46:30.7714637Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:46:30.7715141Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:46:30.7715317Z 
2025-08-14T21:46:30.7715414Z cudagraph partition due to non gpu ops [repeated 13 times]
Found from :
2025-08-14T21:46:30.7718488Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:46:30.7718845Z     return mod(**inputs)
2025-08-14T21:46:30.7719239Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
2025-08-14T21:46:30.7719657Z     outputs = self.model(
2025-08-14T21:46:30.7720057Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
2025-08-14T21:46:30.7720475Z     decoder_outputs = self.decoder(
2025-08-14T21:46:30.7720889Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
2025-08-14T21:46:30.7721320Z     layer_outputs = decoder_layer(
2025-08-14T21:46:30.7721702Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:46:30.7722088Z     return super().__call__(*args, **kwargs)
2025-08-14T21:46:30.7722513Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward
2025-08-14T21:46:30.7722973Z     hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:46:30.7723424Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
2025-08-14T21:46:30.7723870Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:46:30.7724351Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:46:30.7724867Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:46:30.7725064Z 
2025-08-14T21:46:30.7725176Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:46:30.7725576Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:46:30.7725934Z     return mod(**inputs)
2025-08-14T21:46:30.7726335Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
2025-08-14T21:46:30.7726742Z     outputs = self.model(
2025-08-14T21:46:30.7727186Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
2025-08-14T21:46:30.7727627Z     decoder_outputs = self.decoder(
2025-08-14T21:46:30.7728034Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
2025-08-14T21:46:30.7728453Z     layer_outputs = decoder_layer(
2025-08-14T21:46:30.7728893Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:46:30.7729293Z     return super().__call__(*args, **kwargs)
2025-08-14T21:46:30.7729736Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward
2025-08-14T21:46:30.7730198Z     hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:46:30.7730650Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
2025-08-14T21:46:30.7731096Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:46:30.7731575Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:46:30.7732069Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:46:30.7732242Z 
2025-08-14T21:46:30.7732338Z cudagraph partition due to non gpu ops [repeated 17 times]
2025-08-14T21:46:30.7736313Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7743285Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
cudagraph partition due to non gpu ops (message repeated 3 times).

2025-08-14T21:46:30.7750855Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 482, in forward
    hidden_states = residual + hidden_states
cudagraph partition due to non gpu ops (message repeated 11 times).

2025-08-14T21:46:30.7758218Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7765353Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 17 times).

2025-08-14T21:46:30.7776199Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.self_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7783191Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.self_attn
cudagraph partition due to non gpu ops (message repeated 13 times).

2025-08-14T21:46:30.7792946Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7799996Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 3 times).

2025-08-14T21:46:30.7805861Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 499, in forward
    hidden_states = residual + hidden_states
cudagraph partition due to non gpu ops (message repeated 15 times).

2025-08-14T21:46:30.7809442Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.self_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7812503Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.self_attn
cudagraph partition due to non gpu ops (message repeated 13 times).

2025-08-14T21:46:30.7817526Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7820525Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 7 times).

2025-08-14T21:46:30.7824020Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 508, in forward
    hidden_states = residual + hidden_states
cudagraph partition due to non gpu ops (message repeated 11 times).
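The residual-add stacks (modeling_m2m_100.py lines 482, 499 and 508) correspond to the three skip connections of a decoder layer: after self-attention, after cross-attention, and after the feed-forward block. A rough structural sketch of that layout; the module names, dimensions, and pre-norm placement are illustrative assumptions, not the transformers code:

import torch
import torch.nn as nn

class DecoderLayerSketch(nn.Module):
    # Shows where the three residual additions sit in a decoder layer.
    def __init__(self, d_model=1024, n_heads=16, d_ff=4096):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.encoder_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.self_attn_norm = nn.LayerNorm(d_model)
        self.encoder_attn_norm = nn.LayerNorm(d_model)
        self.ffn_norm = nn.LayerNorm(d_model)
        self.ffn = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))

    def forward(self, hidden_states, encoder_out):
        residual = hidden_states
        h = self.self_attn_norm(hidden_states)
        hidden_states, _ = self.self_attn(h, h, h)
        hidden_states = residual + hidden_states        # cf. the line-482 frames

        residual = hidden_states
        h = self.encoder_attn_norm(hidden_states)
        hidden_states, _ = self.encoder_attn(h, encoder_out, encoder_out)
        hidden_states = residual + hidden_states        # cf. the line-499 frames

        residual = hidden_states
        hidden_states = self.ffn(self.ffn_norm(hidden_states))
        return residual + hidden_states                 # cf. the line-508 frames

layer = DecoderLayerSketch()
x = torch.randn(1, 10, 1024)
enc = torch.randn(1, 20, 1024)
print(layer(x, enc).shape)  # torch.Size([1, 10, 1024])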
2025-08-14T21:46:30.7827020Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.self_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7830124Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.self_attn
cudagraph partition due to non gpu ops (message repeated 13 times).

2025-08-14T21:46:30.7834189Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7837346Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 17 times).

2025-08-14T21:46:30.7841776Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.self_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7844830Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.self_attn
cudagraph partition due to non gpu ops (message repeated 3 times).

2025-08-14T21:46:30.7849840Z Found from : same stack as above, ending at modeling_m2m_100.py:482 (hidden_states = residual + hidden_states)
cudagraph partition due to non gpu ops (message repeated 11 times).

2025-08-14T21:46:30.7857347Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7864464Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 17 times).

2025-08-14T21:46:30.7874908Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.self_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7882093Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.self_attn
cudagraph partition due to non gpu ops (message repeated 13 times).

2025-08-14T21:46:30.7892052Z Found from : same stack as above, ending at sdpa_attention.py:81 (scaled_dot_product_attention) via self.encoder_attn
cudagraph partition due to non gpu ops.

2025-08-14T21:46:30.7899185Z Found from : same stack as above, ending at sdpa_attention.py:91 (attn_output.transpose(1, 2).contiguous()) via self.encoder_attn
cudagraph partition due to non gpu ops (message repeated 3 times).
Found from : 2025-08-14T21:46:30.7906870Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7907211Z return mod(**inputs) 2025-08-14T21:46:30.7907591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7908011Z outputs = self.model( 2025-08-14T21:46:30.7908412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7908825Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7909225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7909779Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7910167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7910551Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7910969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 499, in forward 2025-08-14T21:46:30.7911391Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.7911537Z 2025-08-14T21:46:30.7911629Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7911850Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7912105Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7912329Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7912539Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7912763Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7912992Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7913212Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7913442Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7913663Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7913883Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7914116Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7914329Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7914553Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7914815Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7915192Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7915540Z return mod(**inputs) 2025-08-14T21:46:30.7915926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7916325Z outputs = self.model( 2025-08-14T21:46:30.7916702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7917112Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7917521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7917937Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7918312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7918703Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7919126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.7919564Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.7920018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7920450Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7920914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7921406Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7921606Z 2025-08-14T21:46:30.7921717Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7922095Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7922437Z return mod(**inputs) 2025-08-14T21:46:30.7922814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7923217Z outputs = self.model( 2025-08-14T21:46:30.7923598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7924059Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7924493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7924917Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7925304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7925699Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7926133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.7926595Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.7927099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7927559Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7928048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.7928544Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.7928718Z 2025-08-14T21:46:30.7928870Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7929128Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7929362Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7929580Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7929808Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7930034Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7930254Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7930479Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7930705Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7930930Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7931146Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7931371Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7931625Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7932010Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7932366Z return mod(**inputs) 2025-08-14T21:46:30.7932772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7933200Z outputs = self.model( 2025-08-14T21:46:30.7933601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7934021Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7934438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7934860Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7935244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7935643Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7936068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.7936525Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.7936985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7937482Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7937967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7938488Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7938691Z 2025-08-14T21:46:30.7938798Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7939267Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7939605Z return mod(**inputs) 2025-08-14T21:46:30.7939987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7940395Z outputs = self.model( 2025-08-14T21:46:30.7940783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7941181Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7941579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7941967Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7942315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7942675Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7943080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.7943529Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.7943961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7944411Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7944903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.7945390Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.7945561Z 2025-08-14T21:46:30.7945645Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7945876Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7946098Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7946315Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7946538Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7946763Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7946998Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7947360Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7947705Z return mod(**inputs) 2025-08-14T21:46:30.7948090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7948561Z outputs = self.model( 2025-08-14T21:46:30.7948933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7949327Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7949709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7950091Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7950442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7950823Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7951231Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 508, in forward 2025-08-14T21:46:30.7951646Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.7951795Z 2025-08-14T21:46:30.7951881Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7952102Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7952313Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7952530Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7952748Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7952955Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7953223Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7953459Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7953668Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7953887Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7954131Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7954512Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7954850Z return mod(**inputs) 2025-08-14T21:46:30.7955241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7955648Z outputs = self.model( 2025-08-14T21:46:30.7956052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7956460Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7956858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7957275Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7957646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7958039Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7958461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.7958925Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.7959359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7959794Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7960261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7960755Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7960957Z 2025-08-14T21:46:30.7961069Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7961451Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7961794Z return mod(**inputs) 2025-08-14T21:46:30.7962174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7962581Z outputs = self.model( 2025-08-14T21:46:30.7962966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7963388Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7963799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7964219Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7964600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7964988Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7965410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.7965870Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.7966314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7966762Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7967244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.7967738Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.7967945Z 2025-08-14T21:46:30.7968055Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7968306Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7968541Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7968838Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7969074Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7969302Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7969529Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7969746Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7969975Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7970202Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7970427Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7970671Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7970926Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7971312Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7971657Z return mod(**inputs) 2025-08-14T21:46:30.7972046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7972468Z outputs = self.model( 2025-08-14T21:46:30.7972858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7973263Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7973665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7974082Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7974451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7974837Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7975251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.7975693Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.7976144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7976555Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7976993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7977488Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7977692Z 2025-08-14T21:46:30.7977804Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7978192Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7978539Z return mod(**inputs) 2025-08-14T21:46:30.7978931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7979355Z outputs = self.model( 2025-08-14T21:46:30.7979799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7980199Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7980585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7980973Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7981340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7981727Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7982143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.7982607Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.7983110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7983532Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7984006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.7984491Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.7984663Z 2025-08-14T21:46:30.7984748Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7984976Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7985201Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7985446Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7985664Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7985885Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7986103Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7986438Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7986658Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7986876Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7987088Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7987303Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7987517Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7987735Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7987943Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7988158Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.7988408Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7988791Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7989148Z return mod(**inputs) 2025-08-14T21:46:30.7989546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7989947Z outputs = self.model( 2025-08-14T21:46:30.7990332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7990737Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7991138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7991551Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7991919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7992302Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7992710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.7993146Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.7993583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.7994014Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.7994472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.7994973Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.7995173Z 2025-08-14T21:46:30.7995282Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.7995666Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.7995999Z return mod(**inputs) 2025-08-14T21:46:30.7996384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.7996791Z outputs = self.model( 2025-08-14T21:46:30.7997217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.7997652Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.7998058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.7998470Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.7998863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.7999249Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.7999698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.8000148Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.8000582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8001037Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8001522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8002011Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8002191Z 2025-08-14T21:46:30.8002278Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8002510Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8003024Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8003412Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8003776Z return mod(**inputs) 2025-08-14T21:46:30.8004179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8004596Z outputs = self.model( 2025-08-14T21:46:30.8004999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8005424Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8005838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8006265Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8006646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8007041Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8007464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 482, in forward 2025-08-14T21:46:30.8007881Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.8008038Z 2025-08-14T21:46:30.8008124Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8008352Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8008572Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8009102Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8009340Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8009558Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8009784Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8010006Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8010229Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8010448Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8010705Z cudagraph partition due to non gpu ops. 
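Several of the surrounding blocks (modeling_m2m_100.py lines 482, 499 and 508) attribute the partition to the plain residual add inside the decoder layer rather than to attention. A similarly hypothetical sketch of that pattern, using only stock torch.compile:

    import torch

    class ToyDecoderResidual(torch.nn.Module):
        def __init__(self, dim: int = 64):
            super().__init__()
            self.fc = torch.nn.Linear(dim, dim)

        def forward(self, hidden_states):
            residual = hidden_states
            hidden_states = self.fc(hidden_states)
            # The partition reason points at this kind of eager residual add.
            return residual + hidden_states

    compiled = torch.compile(ToyDecoderResidual())
    print(compiled(torch.randn(2, 16, 64)).shape)  # torch.Size([2, 16, 64])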
Found from : 2025-08-14T21:46:30.8011104Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8011451Z return mod(**inputs) 2025-08-14T21:46:30.8011850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8012275Z outputs = self.model( 2025-08-14T21:46:30.8012807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8013237Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8013663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8014103Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8014486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8014899Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8015372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8015843Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8016309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8016756Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8017240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.8017755Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.8017955Z 2025-08-14T21:46:30.8018070Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8018472Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8018825Z return mod(**inputs) 2025-08-14T21:46:30.8019235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8019663Z outputs = self.model( 2025-08-14T21:46:30.8020069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8020502Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8020917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8021339Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8021721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8022120Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8022543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8023012Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8023464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8023910Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8024398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8024889Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8025062Z 2025-08-14T21:46:30.8025155Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8025380Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8025606Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8025832Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8026048Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8026270Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8026490Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8026708Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8026931Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8027154Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8027375Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8027634Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8027874Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8028100Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8028317Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8028538Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8028652Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8028884Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8028959Z return mod(**inputs) 2025-08-14T21:46:30.8029299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8029387Z outputs = self.model( 2025-08-14T21:46:30.8029661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8029756Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8030034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8030114Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8030370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8030456Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8030731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.8030837Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.8031103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8031211Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8031511Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.8031656Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.8031670Z 2025-08-14T21:46:30.8031784Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8032000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8032081Z return mod(**inputs) 2025-08-14T21:46:30.8032353Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8032426Z outputs = self.model( 2025-08-14T21:46:30.8032707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8032787Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8033063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8033147Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8033385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8033490Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8033751Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.8033856Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.8034127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8034231Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8034548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8034707Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8034711Z 2025-08-14T21:46:30.8034813Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8034908Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8034990Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035073Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035160Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035240Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035331Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035411Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035491Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035608Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035693Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035773Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8035893Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8036115Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8036189Z return mod(**inputs) 2025-08-14T21:46:30.8036472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8036547Z outputs = self.model( 2025-08-14T21:46:30.8036833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8036912Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8037174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8037260Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8037489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8037582Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8037848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8037961Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8038230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8038329Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8038628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.8038769Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.8038774Z 2025-08-14T21:46:30.8038884Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8039101Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8039177Z return mod(**inputs) 2025-08-14T21:46:30.8039441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8039524Z outputs = self.model( 2025-08-14T21:46:30.8039789Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8039875Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8040145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8040223Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8040469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8040556Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8040825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8041001Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8041274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8041382Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8041694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8041816Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8041820Z 2025-08-14T21:46:30.8041911Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8042009Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8042129Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8042339Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8042414Z return mod(**inputs) 2025-08-14T21:46:30.8042700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8042774Z outputs = self.model( 2025-08-14T21:46:30.8043047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8043136Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8043408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8043493Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8043734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8043822Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8044102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 499, in forward 2025-08-14T21:46:30.8044198Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.8044202Z 2025-08-14T21:46:30.8044288Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044378Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044460Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044551Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044633Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044713Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044805Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044887Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8044970Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045060Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045140Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045221Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045312Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045395Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8045516Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8045733Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8045805Z return mod(**inputs) 2025-08-14T21:46:30.8046086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8046160Z outputs = self.model( 2025-08-14T21:46:30.8046433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8046531Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8046805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8046889Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8047194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8047283Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8047563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.8047669Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.8047948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8048050Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8048376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.8048524Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.8048528Z 2025-08-14T21:46:30.8048642Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8048948Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8049037Z return mod(**inputs) 2025-08-14T21:46:30.8049310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8049393Z outputs = self.model( 2025-08-14T21:46:30.8049665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8049746Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8050031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8050108Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8050354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8050446Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8050719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 473, in forward 2025-08-14T21:46:30.8050833Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:46:30.8051104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8051207Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8051531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8051648Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8051652Z 2025-08-14T21:46:30.8051748Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8051835Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8051921Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052018Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052102Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052184Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052274Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052357Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052446Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052526Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052606Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052693Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8052804Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8053021Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8053102Z return mod(**inputs) 2025-08-14T21:46:30.8053375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8053497Z outputs = self.model( 2025-08-14T21:46:30.8053810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8053890Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8054177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8054255Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8054502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8054595Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8054897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8055026Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8055320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8055423Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8055748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:46:30.8055890Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:46:30.8055895Z 2025-08-14T21:46:30.8056005Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8056233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8056306Z return mod(**inputs) 2025-08-14T21:46:30.8056590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8056664Z outputs = self.model( 2025-08-14T21:46:30.8056958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8057045Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8057327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8057412Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8057654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8057740Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8058027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 490, in forward 2025-08-14T21:46:30.8058143Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:46:30.8058421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 319, in forward 2025-08-14T21:46:30.8058537Z attn_output, attn_weights = attention_interface( 2025-08-14T21:46:30.8058855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:46:30.8058974Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:46:30.8058979Z 2025-08-14T21:46:30.8059065Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059147Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059237Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059318Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059400Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059491Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8059603Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8059836Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8059948Z return mod(**inputs) 2025-08-14T21:46:30.8060251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1404, in forward 2025-08-14T21:46:30.8060331Z outputs = self.model( 2025-08-14T21:46:30.8060604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1279, in forward 2025-08-14T21:46:30.8060682Z decoder_outputs = self.decoder( 2025-08-14T21:46:30.8060966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1127, in forward 2025-08-14T21:46:30.8061042Z layer_outputs = decoder_layer( 2025-08-14T21:46:30.8061306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:46:30.8061390Z return super().__call__(*args, **kwargs) 2025-08-14T21:46:30.8061667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 508, in forward 2025-08-14T21:46:30.8061766Z hidden_states = residual + hidden_states 2025-08-14T21:46:30.8061769Z 2025-08-14T21:46:30.8061849Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8061935Z cudagraph partition due to non gpu ops 2025-08-14T21:46:30.8062043Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:46:30.8062289Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:46:30.8062367Z return mod(**inputs) 2025-08-14T21:46:30.8062652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1429, in forward 2025-08-14T21:46:30.8062836Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1)) 2025-08-14T21:46:30.8062841Z 
2025-08-14T21:46:44.5859306Z Compilation time (from dynamo_timed): 37.862084225 2025-08-14T21:46:44.5907624Z pass 2025-08-14T21:46:44.5908096Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:46:44.5908978Z TIMING: _recursive_pre_grad_passes:0.09998 _recursive_joint_graph_passes:0.93483 _recursive_post_grad_passes:0.17215 async_compile.wait:0.93731 code_gen:13.56001 inductor_compile:16.81221 backend_compile:31.28613 gc:0.0005 entire_frame_compile:37.86208 total_wall_time:37.86208 2025-08-14T21:46:44.5911075Z STATS: call_* op count: 1016 | FakeTensorMode.__torch_dispatch__:68441 | FakeTensor.__torch_dispatch__:8067 | ProxyTorchDispatchMode.__torch_dispatch__:19125 2025-08-14T21:46:44.5911752Z Dynamo produced 1 graphs covering 1016 ops with 0 graph breaks (0 unique) 
2025-08-14T21:46:51.2702927Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
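The "Compilation time (from dynamo_timed)" line and the TIMING/STATS breakdown above come from dynamo's internal timers for this model. Outside the benchmark harness, a comparable per-phase summary can usually be obtained from torch._dynamo.utils.compile_times(); a minimal sketch, assuming a recent PyTorch build where that helper is available:

    import torch
    from torch._dynamo.utils import compile_times

    @torch.compile
    def f(x):
        return torch.relu(x) + 1

    f(torch.randn(8))
    # compile_times() returns a table of per-phase compile wall times, roughly
    # comparable to the TIMING/entire_frame_compile numbers logged above.
    print(compile_times())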
2025-08-14T21:46:51.2703962Z from pkg_resources import resource_filename 2025-08-14T21:46:51.9135010Z 2025-08-14T21:46:54.8141472Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:46:54.8142372Z loading model: 0it [00:02, ?it/s] 2025-08-14T21:46:54.8142667Z cpu eval MBartForCausalLM 2025-08-14T21:46:56.6458532Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:46:57.1200437Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:46:57.6028450Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:47:09.0235416Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0241142Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0244633Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0244939Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0246748Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0247861Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0248167Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0248401Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0248631Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0249128Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0249362Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0249580Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0249886Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0250193Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0250482Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0250803Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0251104Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0251356Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0251610Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0251880Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0252136Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0252391Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0252687Z cudagraph partition due to non gpu ops. 
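The "Compilation time (from dynamo_timed)" and TIMING lines above come from the harness's internal instrumentation. A rough, hedged approximation of the total wall-time figure is simply timing the first call of a torch.compile'd module, as sketched below; the toy model and sizes are made up for illustration, and this does not reproduce dynamo_timed's per-phase breakdown.

```python
# Hedged sketch: approximate the total compile-plus-first-run wall time by
# timing the first call of a torch.compile'd module on CPU. The toy model is
# an illustrative stand-in, not the HuggingFace benchmark model.
import time
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(64, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 64),
).eval()

compiled = torch.compile(model)   # default backend (inductor)
example = torch.randn(8, 64)

start = time.perf_counter()
with torch.no_grad():
    compiled(example)             # first call triggers compilation
print(f"first-call (compile + run) wall time: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
with torch.no_grad():
    compiled(example)             # later calls reuse the compiled artifact
print(f"steady-state call wall time: {time.perf_counter() - start:.3f}s")
```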
Found from : 2025-08-14T21:47:09.0253177Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0253596Z return mod(**inputs) 2025-08-14T21:47:09.0254141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0254649Z outputs = self.model.decoder( 2025-08-14T21:47:09.0255138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0255604Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0256018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0256451Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0256928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0257416Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0257907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0258388Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0258937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0259526Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0259783Z 2025-08-14T21:47:09.0259903Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0260336Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0260707Z return mod(**inputs) 2025-08-14T21:47:09.0261128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0261573Z outputs = self.model.decoder( 2025-08-14T21:47:09.0261991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0262433Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0262875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0263356Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0263777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0264243Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0264701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0265244Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0265731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0266235Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0266416Z 2025-08-14T21:47:09.0266519Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0266756Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0266992Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0267225Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0267521Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0267740Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0267965Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0268200Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0268433Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0268749Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0268973Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0269281Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0269583Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0269877Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0270096Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0270417Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0270738Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0271207Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0271614Z return mod(**inputs) 2025-08-14T21:47:09.0272203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0272756Z outputs = self.model.decoder( 2025-08-14T21:47:09.0273188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0273778Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0274225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0274768Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0275298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0275753Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0276369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0276982Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0277642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0278334Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0278580Z 2025-08-14T21:47:09.0278740Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0279282Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0279641Z return mod(**inputs) 2025-08-14T21:47:09.0280033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0280588Z outputs = self.model.decoder( 2025-08-14T21:47:09.0281203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0281635Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0282008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0282467Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0282915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0283360Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0283807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0284387Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0284870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0285388Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0285572Z 2025-08-14T21:47:09.0285660Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0285892Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0286110Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0286336Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0286564Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0286786Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0287034Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0287427Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0287783Z return mod(**inputs) 2025-08-14T21:47:09.0288176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0288603Z outputs = self.model.decoder( 2025-08-14T21:47:09.0289124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0289554Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0289930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0290336Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0290766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:47:09.0291193Z hidden_states = residual + hidden_states 2025-08-14T21:47:09.0291351Z 2025-08-14T21:47:09.0291438Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0291670Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0291898Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0292117Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0292344Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0292569Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0292784Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0293003Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0293226Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0293445Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0293704Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0294102Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0294454Z return mod(**inputs) 2025-08-14T21:47:09.0294844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0295268Z outputs = self.model.decoder( 2025-08-14T21:47:09.0295682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0296097Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0296481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0296879Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0297305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0297820Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0298259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0298690Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0299145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0299650Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0299849Z 2025-08-14T21:47:09.0299964Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0300394Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0300735Z return mod(**inputs) 2025-08-14T21:47:09.0301115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0301538Z outputs = self.model.decoder( 2025-08-14T21:47:09.0301937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0302337Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0302940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0303340Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0303747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0304186Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0304620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0305056Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0305514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0306000Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0306178Z 2025-08-14T21:47:09.0306265Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0306492Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0306708Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0306930Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0307149Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0307361Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0307580Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0307796Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0308005Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0308222Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0308444Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0308659Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0308875Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0309090Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0309304Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0309513Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0309759Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0310137Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0310481Z return mod(**inputs) 2025-08-14T21:47:09.0310886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0311358Z outputs = self.model.decoder( 2025-08-14T21:47:09.0311761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0312297Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0312696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0313093Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0313498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0313964Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0314406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0314952Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0315452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0315965Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0316167Z 2025-08-14T21:47:09.0316289Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0316678Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0317068Z return mod(**inputs) 2025-08-14T21:47:09.0317474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0317910Z outputs = self.model.decoder( 2025-08-14T21:47:09.0318311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0318833Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0319204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0319587Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0320000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0320555Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0320996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0321436Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0321904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0322400Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0322574Z 2025-08-14T21:47:09.0322670Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0322900Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0323134Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0323362Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0323582Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0323813Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0324073Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0324467Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0324817Z return mod(**inputs) 2025-08-14T21:47:09.0325214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0325638Z outputs = self.model.decoder( 2025-08-14T21:47:09.0326040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0326455Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0326832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0327231Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0327692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:47:09.0328157Z hidden_states = residual + hidden_states 2025-08-14T21:47:09.0328306Z 2025-08-14T21:47:09.0328399Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0328618Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0328924Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0329170Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0329393Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0329609Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0329835Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0330061Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0330308Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0330543Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0330804Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0331193Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0331555Z return mod(**inputs) 2025-08-14T21:47:09.0331953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0332375Z outputs = self.model.decoder( 2025-08-14T21:47:09.0332783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0333204Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0333572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0333947Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0334356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0334785Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0335226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0335670Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0336220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0336742Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0336940Z 2025-08-14T21:47:09.0337070Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0337442Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0337792Z return mod(**inputs) 2025-08-14T21:47:09.0338177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0338596Z outputs = self.model.decoder( 2025-08-14T21:47:09.0339013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0339430Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0339808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0340201Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0340625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0341068Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0341508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0341941Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0342424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0342986Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0343162Z 2025-08-14T21:47:09.0343249Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0343481Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0343731Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0343957Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0344173Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0344397Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0344622Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0344836Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0345085Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0345317Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0345532Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0345757Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0345985Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0346200Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0346427Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0346650Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0346906Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0347289Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0347643Z return mod(**inputs) 2025-08-14T21:47:09.0348056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0348474Z outputs = self.model.decoder( 2025-08-14T21:47:09.0348891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0349313Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0349687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0350080Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0350503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0350948Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0351385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0351828Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0352317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0352837Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0353036Z 2025-08-14T21:47:09.0353151Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0353547Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0353907Z return mod(**inputs) 2025-08-14T21:47:09.0354304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0354720Z outputs = self.model.decoder( 2025-08-14T21:47:09.0355133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0355555Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0355926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0356334Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0356765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0357207Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0357708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0358156Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0358634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0359192Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0359371Z 2025-08-14T21:47:09.0359460Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0359691Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0359920Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0360167Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0360400Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0360630Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0360881Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0361284Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0361641Z return mod(**inputs) 2025-08-14T21:47:09.0362044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0362464Z outputs = self.model.decoder( 2025-08-14T21:47:09.0362879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0363299Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0363672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0364068Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0364494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:47:09.0364931Z hidden_states = residual + hidden_states 2025-08-14T21:47:09.0365079Z 2025-08-14T21:47:09.0365169Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0365401Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0365626Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0365842Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0366062Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0366283Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0366505Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0366721Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0366940Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0367162Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0367409Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0367798Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0368153Z return mod(**inputs) 2025-08-14T21:47:09.0368546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0369086Z outputs = self.model.decoder( 2025-08-14T21:47:09.0369533Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0369965Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0370347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0370759Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0371202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0371660Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0372116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0372687Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0373174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0373700Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0373905Z 2025-08-14T21:47:09.0374016Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0374402Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0374755Z return mod(**inputs) 2025-08-14T21:47:09.0375153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0375573Z outputs = self.model.decoder( 2025-08-14T21:47:09.0375979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0376390Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0376771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0377169Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0377594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0378037Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0378484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0378922Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0379387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0379864Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0380043Z 2025-08-14T21:47:09.0380130Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0380357Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0380573Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0380794Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0381015Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0381235Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0381451Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0381671Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0381891Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0382102Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0382319Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0382536Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0382754Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0382979Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0383202Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0383425Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0383682Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0384077Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0384453Z return mod(**inputs) 2025-08-14T21:47:09.0384872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0385300Z outputs = self.model.decoder( 2025-08-14T21:47:09.0385703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0386121Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0386493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0386887Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0387384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0387832Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0388284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0388753Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0389239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0389758Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0389976Z 2025-08-14T21:47:09.0390090Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0390486Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0390843Z return mod(**inputs) 2025-08-14T21:47:09.0391246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0391672Z outputs = self.model.decoder( 2025-08-14T21:47:09.0392086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0392496Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0392877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0393274Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0393691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0394139Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0394645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0395097Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0395572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0396070Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0396253Z 2025-08-14T21:47:09.0396343Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0396577Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0396796Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0397019Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0397243Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0397458Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0397712Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0398106Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0398456Z return mod(**inputs) 2025-08-14T21:47:09.0398864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0399287Z outputs = self.model.decoder( 2025-08-14T21:47:09.0399697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0400106Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0400482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0400873Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0401286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:47:09.0401712Z hidden_states = residual + hidden_states 2025-08-14T21:47:09.0401867Z 2025-08-14T21:47:09.0401951Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0402269Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0402507Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0402972Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0403205Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0403423Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0403650Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0403888Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0404104Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0404330Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0404591Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0405061Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0405433Z return mod(**inputs) 2025-08-14T21:47:09.0405834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0406268Z outputs = self.model.decoder( 2025-08-14T21:47:09.0406679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0407104Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0407490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0407893Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0408312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0408852Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0409498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0409964Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0410442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0410983Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0411183Z 2025-08-14T21:47:09.0411302Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0411684Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0412037Z return mod(**inputs) 2025-08-14T21:47:09.0412432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0412855Z outputs = self.model.decoder( 2025-08-14T21:47:09.0413259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0413647Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0413992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0414359Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0414736Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0415143Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0415547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0415942Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0416382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0416833Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0416993Z 2025-08-14T21:47:09.0417079Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0417348Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0417590Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0417829Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0418034Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0418254Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0418472Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0418680Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0418900Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0419119Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0419321Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0419520Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0419743Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0419953Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0420153Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0420359Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0420595Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0420954Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0421283Z return mod(**inputs) 2025-08-14T21:47:09.0421650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0422038Z outputs = self.model.decoder( 2025-08-14T21:47:09.0422411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0422815Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0423186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0423560Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0423971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0424410Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0424841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0425268Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0425705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0426176Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0426355Z 2025-08-14T21:47:09.0426464Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0426820Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0427142Z return mod(**inputs) 2025-08-14T21:47:09.0427507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0427892Z outputs = self.model.decoder( 2025-08-14T21:47:09.0428270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0428654Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0429019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0429391Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0429803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0430246Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0430673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0431105Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0431630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0432117Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0432287Z 2025-08-14T21:47:09.0432373Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0432601Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0432827Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0433052Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0433270Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0433494Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0433745Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0434157Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0434516Z return mod(**inputs) 2025-08-14T21:47:09.0434917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0435326Z outputs = self.model.decoder( 2025-08-14T21:47:09.0435737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0436141Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0436507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0436878Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0437288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:47:09.0437705Z hidden_states = residual + hidden_states 2025-08-14T21:47:09.0437849Z 2025-08-14T21:47:09.0437940Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0438152Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0438370Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0438591Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0438802Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0439021Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0439236Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0439444Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0439660Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0439876Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0440110Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0440495Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0440842Z return mod(**inputs) 2025-08-14T21:47:09.0441236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0441637Z outputs = self.model.decoder( 2025-08-14T21:47:09.0442050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0442471Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0442834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0443221Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0443630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0444064Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0444487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0444917Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0445383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0445945Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0446158Z 2025-08-14T21:47:09.0446269Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0446650Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0446993Z return mod(**inputs) 2025-08-14T21:47:09.0447369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0447778Z outputs = self.model.decoder( 2025-08-14T21:47:09.0448187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0448613Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0449074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0449465Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0449891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0450327Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0450745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0451158Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0451599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0452051Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0452221Z 2025-08-14T21:47:09.0452305Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0452528Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0452745Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0452958Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0453176Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0453393Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0453602Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0453820Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0454037Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0454247Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0454463Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0454681Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0454890Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0455107Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0455329Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0455544Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0455784Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:47:09.0456163Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0456508Z return mod(**inputs) 2025-08-14T21:47:09.0456891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0457316Z outputs = self.model.decoder( 2025-08-14T21:47:09.0457732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0458139Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0458504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0458888Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0459300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0459736Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0460226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0460686Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0461425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:47:09.0462170Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:47:09.0462378Z 2025-08-14T21:47:09.0462489Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:47:09.0462870Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:47:09.0463228Z return mod(**inputs) 2025-08-14T21:47:09.0463641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward 2025-08-14T21:47:09.0464075Z outputs = self.model.decoder( 2025-08-14T21:47:09.0464487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:47:09.0464905Z layer_outputs = decoder_layer( 2025-08-14T21:47:09.0465282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:47:09.0465677Z return super().__call__(*args, **kwargs) 2025-08-14T21:47:09.0466096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:47:09.0466546Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:47:09.0466986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:47:09.0467438Z attn_output, attn_weights = attention_interface( 2025-08-14T21:47:09.0467912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:47:09.0468396Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:47:09.0468577Z 2025-08-14T21:47:09.0468666Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0468900Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0469119Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0469343Z cudagraph partition due to non gpu ops 2025-08-14T21:47:09.0469565Z cudagraph partition due to non gpu ops 
2025-08-14T21:47:09.0469783Z cudagraph partition due to non gpu ops
2025-08-14T21:47:09.0470038Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:47:09.0470428Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:47:09.0470780Z     return mod(**inputs)
2025-08-14T21:47:09.0471168Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1864, in forward
2025-08-14T21:47:09.0471589Z     outputs = self.model.decoder(
2025-08-14T21:47:09.0472007Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward
2025-08-14T21:47:09.0472416Z     layer_outputs = decoder_layer(
2025-08-14T21:47:09.0472792Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:47:09.0473187Z     return super().__call__(*args, **kwargs)
2025-08-14T21:47:09.0473605Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward
2025-08-14T21:47:09.0474022Z     hidden_states = residual + hidden_states
2025-08-14T21:47:09.0474177Z 
2025-08-14T21:47:09.0474263Z cudagraph partition due to non gpu ops
2025-08-14T21:47:09.0474493Z cudagraph partition due to non gpu ops
2025-08-14T21:47:09.0474743Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:47:09.0475133Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:47:09.0475545Z     return mod(**inputs)
2025-08-14T21:47:09.0475951Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1886, in forward
2025-08-14T21:47:09.0476447Z     loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:47:09.0476670Z 
2025-08-14T21:47:19.5307846Z Compilation time (from dynamo_timed): 20.106427772
2025-08-14T21:47:19.5520230Z pass
2025-08-14T21:47:19.5520679Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:47:19.5522032Z TIMING: _recursive_pre_grad_passes:0.04181 _recursive_joint_graph_passes:0.43375 _recursive_post_grad_passes:0.08859 async_compile.wait:1.04461 code_gen:10.09527 inductor_compile:11.77548 backend_compile:17.55225 gc:0.00046 entire_frame_compile:20.10643 total_wall_time:20.10643
2025-08-14T21:47:19.5531452Z STATS: call_* op count: 375 | FakeTensorMode.__torch_dispatch__:27763 | FakeTensor.__torch_dispatch__:3395 | ProxyTorchDispatchMode.__torch_dispatch__:7520
2025-08-14T21:47:19.5532050Z Dynamo produced 1 graphs covering 375 ops with 0 graph breaks (0 unique)
2025-08-14T21:47:25.5513952Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:47:25.5514944Z   from pkg_resources import resource_filename
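The "Compilation time (from dynamo_timed)" figure above is the wall-clock cost of the first compiled invocation, and the TIMING line breaks the same ~20.1 s down by phase (inductor_compile, code_gen, and so on). A rough way to reproduce that kind of end-to-end number outside the harness, assuming only stock torch.compile, is to time the first call of a compiled module against a warm call; the per-phase breakdown comes from internal instrumentation and is not reproduced in this sketch.

    import time
    import torch

    model = torch.nn.Sequential(torch.nn.Linear(64, 64), torch.nn.ReLU())
    x = torch.randn(8, 64)

    torch._dynamo.reset()                # start from a cold compile cache
    compiled = torch.compile(model)      # default inductor backend

    t0 = time.perf_counter()
    compiled(x)                          # first call: dynamo tracing + inductor codegen
    print(f"first call (includes compile): {time.perf_counter() - t0:.3f}s")

    t0 = time.perf_counter()
    compiled(x)                          # warm call: reuses the compiled graph
    print(f"warm call: {time.perf_counter() - t0:.6f}s")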
2025-08-14T21:47:26.3886135Z 
2025-08-14T21:47:31.6631487Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:47:31.6631804Z loading model: 0it [00:05, ?it/s]
2025-08-14T21:47:31.6632340Z cpu eval MBartForConditionalGeneration
2025-08-14T21:47:35.4928179Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:47:36.1881605Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:47:36.8793476Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:48:02.7573040Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:48:02.7574085Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:48:02.7574529Z     return mod(**inputs)
2025-08-14T21:48:02.7574985Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1436, in forward
2025-08-14T21:48:02.7575525Z     decoder_input_ids = shift_tokens_right(labels, self.config.pad_token_id)
2025-08-14T21:48:02.7576117Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 76, in shift_tokens_right
2025-08-14T21:48:02.7576686Z     index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1)
2025-08-14T21:48:02.7576934Z 
2025-08-14T21:48:02.7577048Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7577299Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7577542Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7577776Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7578018Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7578261Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7578492Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7578713Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7578945Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7579180Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7579412Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7579639Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7579873Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7580141Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7580375Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7581090Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7582165Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7582396Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7582663Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7582891Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7583118Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7583334Z cudagraph partition due to non gpu ops
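The repeated "Trying to call the empty_gpu_cache for device: cpu" warnings above come from the harness requesting an accelerator cache flush on a CPU-only run. A small sketch of the kind of device guard that would avoid the warning follows; the helper name is made up for illustration and the harness's real logic may differ.

    import torch

    def maybe_empty_accelerator_cache(device: str) -> None:
        # Hypothetical helper: only flush caching allocators that actually exist.
        if device == "cuda" and torch.cuda.is_available():
            torch.cuda.empty_cache()
        elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
            torch.xpu.empty_cache()
        # On "cpu" there is no accelerator caching allocator, so silently do nothing.

    maybe_empty_accelerator_cache("cpu")  # no-op instead of a warning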
2025-08-14T21:48:02.7583596Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:48:02.7584002Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:48:02.7584380Z     return mod(**inputs)
2025-08-14T21:48:02.7584868Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward
2025-08-14T21:48:02.7585308Z     outputs = self.model(
2025-08-14T21:48:02.7585732Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward
2025-08-14T21:48:02.7586183Z     encoder_outputs = self.encoder(
2025-08-14T21:48:02.7586614Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward
2025-08-14T21:48:02.7587054Z     layer_outputs = encoder_layer(
2025-08-14T21:48:02.7587451Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:48:02.7587865Z     return super().__call__(*args, **kwargs)
2025-08-14T21:48:02.7588304Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward
2025-08-14T21:48:02.7588769Z     hidden_states, attn_weights = self.self_attn(
2025-08-14T21:48:02.7589218Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward
2025-08-14T21:48:02.7589693Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:48:02.7590210Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:48:02.7590757Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:48:02.7590962Z 
2025-08-14T21:48:02.7591081Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:48:02.7591505Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:48:02.7591876Z     return mod(**inputs)
2025-08-14T21:48:02.7592289Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward
2025-08-14T21:48:02.7592713Z     outputs = self.model(
2025-08-14T21:48:02.7593117Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward
2025-08-14T21:48:02.7593549Z     encoder_outputs = self.encoder(
2025-08-14T21:48:02.7593968Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward
2025-08-14T21:48:02.7594396Z     layer_outputs = encoder_layer(
2025-08-14T21:48:02.7594785Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:48:02.7595196Z     return super().__call__(*args, **kwargs)
2025-08-14T21:48:02.7595631Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward
2025-08-14T21:48:02.7596089Z     hidden_states, attn_weights = self.self_attn(
2025-08-14T21:48:02.7596550Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward
2025-08-14T21:48:02.7597010Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:48:02.7597501Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:48:02.7598078Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:48:02.7598264Z 
2025-08-14T21:48:02.7598357Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7598579Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7598802Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7599023Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7599243Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7599463Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7599690Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7599913Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7600155Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7600385Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7600612Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7600830Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7601081Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7601316Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7601537Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7601765Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7602070Z cudagraph partition due to non gpu ops. Found from :
Found from : 2025-08-14T21:48:02.7602484Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7603063Z return mod(**inputs) 2025-08-14T21:48:02.7603476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7603915Z outputs = self.model( 2025-08-14T21:48:02.7604316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7604758Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7605176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7605616Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7605997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7606401Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7606826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7607275Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7607714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7608163Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7608649Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7609530Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7609752Z 2025-08-14T21:48:02.7609872Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7610269Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7610629Z return mod(**inputs) 2025-08-14T21:48:02.7611027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7611455Z outputs = self.model( 2025-08-14T21:48:02.7611856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7612305Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7612731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7613161Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7613646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7614070Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7614497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7614942Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7615378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7615835Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7616350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7616861Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7617038Z 2025-08-14T21:48:02.7617129Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7617371Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7617602Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7617831Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7618054Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7618283Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7618541Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7618929Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7619286Z return mod(**inputs) 2025-08-14T21:48:02.7619695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7620114Z outputs = self.model( 2025-08-14T21:48:02.7620525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7620951Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7621383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7621800Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7622187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7622589Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7623012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 336, in forward 2025-08-14T21:48:02.7623445Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7623604Z 2025-08-14T21:48:02.7623691Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7623922Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7624239Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7624464Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7624699Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7624925Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7625318Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7625565Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7625792Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7626015Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7626280Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7626681Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7627042Z return mod(**inputs) 2025-08-14T21:48:02.7627450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7627879Z outputs = self.model( 2025-08-14T21:48:02.7628282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7628735Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7629223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7629666Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7630045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7630459Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7630876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7631310Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7631766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7632228Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7632718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7633249Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7633462Z 2025-08-14T21:48:02.7633576Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7633976Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7634348Z return mod(**inputs) 2025-08-14T21:48:02.7634733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7635145Z outputs = self.model( 2025-08-14T21:48:02.7635532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7635948Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7636348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7636768Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7637147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7637528Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7637954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7638397Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7638837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7639284Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7639769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7640272Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7640453Z 2025-08-14T21:48:02.7640553Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7640783Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7641011Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7641234Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7641455Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7641680Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7641904Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7642125Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7642352Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7642577Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7642796Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7643025Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7643252Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7643475Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7643713Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7643955Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7644232Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7644656Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7645025Z return mod(**inputs) 2025-08-14T21:48:02.7645433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7645867Z outputs = self.model( 2025-08-14T21:48:02.7646275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7646737Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7647156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7647576Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7647972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7648368Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7648890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7649431Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7649883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7650340Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7650823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7651349Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7651558Z 2025-08-14T21:48:02.7651675Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7652078Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7652432Z return mod(**inputs) 2025-08-14T21:48:02.7652832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7653270Z outputs = self.model( 2025-08-14T21:48:02.7653670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7654094Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7654512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7654933Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7655307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7655715Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7656144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7656593Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7657027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7657478Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7657969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7658470Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7658648Z 2025-08-14T21:48:02.7658736Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7658976Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7659210Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7659493Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7659739Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7659968Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7660220Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7660621Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7660985Z return mod(**inputs) 2025-08-14T21:48:02.7661380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7661787Z outputs = self.model( 2025-08-14T21:48:02.7662218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7662646Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7663057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7663490Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7663877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7664275Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7664705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 336, in forward 2025-08-14T21:48:02.7665148Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7665296Z 2025-08-14T21:48:02.7665389Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7665613Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7665867Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7666096Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7666319Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7666536Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7666770Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7666993Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7667210Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7667434Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7667687Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7668076Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7668433Z return mod(**inputs) 2025-08-14T21:48:02.7668842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7669284Z outputs = self.model( 2025-08-14T21:48:02.7669674Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7670124Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7670538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7670966Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7671349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7671766Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7672192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7672624Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7673069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7673518Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7674002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7674513Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7674761Z 2025-08-14T21:48:02.7674894Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7675307Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7675660Z return mod(**inputs) 2025-08-14T21:48:02.7676061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7676477Z outputs = self.model( 2025-08-14T21:48:02.7676874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7677322Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7677752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7678186Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7678562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7678971Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7679412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7679869Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7680299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7680751Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7681241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7681740Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7681915Z 2025-08-14T21:48:02.7682004Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7682246Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7682477Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7682696Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7682923Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7683149Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7683372Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7683601Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7683826Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7684052Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7684272Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7684497Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7684727Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7684944Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7685169Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7685397Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7685646Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7686045Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7686401Z return mod(**inputs) 2025-08-14T21:48:02.7686803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7687227Z outputs = self.model( 2025-08-14T21:48:02.7687625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7688061Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7688477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7689009Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7689407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7689882Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7690326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7690772Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7691211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7691670Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7692145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7692680Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7692881Z 2025-08-14T21:48:02.7693006Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7693391Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7693753Z return mod(**inputs) 2025-08-14T21:48:02.7694150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7694569Z outputs = self.model( 2025-08-14T21:48:02.7694958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7695389Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7695805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7696232Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7696640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7697034Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7697451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7697901Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7698342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7698788Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7699269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7699763Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7699939Z 2025-08-14T21:48:02.7700034Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7700261Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7700488Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7700715Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7700931Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7701159Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7701419Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7701815Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7702180Z return mod(**inputs) 2025-08-14T21:48:02.7702575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7703149Z outputs = self.model( 2025-08-14T21:48:02.7703548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7703983Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7704403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7704836Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7705312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7705739Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7706156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 336, in forward 2025-08-14T21:48:02.7706600Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7706758Z 2025-08-14T21:48:02.7706845Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7707075Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7708175Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7708632Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7709204Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7709567Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7709871Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7710260Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7710590Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7710884Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7711264Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7711770Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7712147Z return mod(**inputs) 2025-08-14T21:48:02.7712590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7713041Z outputs = self.model( 2025-08-14T21:48:02.7713451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7713896Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7714398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7715033Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7715507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7715901Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7716328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7716791Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7717240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7717696Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7718184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7718714Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7718995Z 2025-08-14T21:48:02.7719166Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7719742Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7720278Z return mod(**inputs) 2025-08-14T21:48:02.7720883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7721521Z outputs = self.model( 2025-08-14T21:48:02.7722193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7723483Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7724138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7724779Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7725345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7726078Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7726581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7727063Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7727503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7727974Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7728450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7729079Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7729264Z 2025-08-14T21:48:02.7729367Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7729595Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7729841Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7730070Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7730297Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7730511Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7730734Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7730956Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7731169Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7731390Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7731612Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7731825Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7732049Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7732274Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7732490Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7732710Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7732963Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7733361Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7733716Z return mod(**inputs) 2025-08-14T21:48:02.7734114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7734542Z outputs = self.model( 2025-08-14T21:48:02.7734933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7735363Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7735776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7736207Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7736585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7736983Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7737412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7737857Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7738299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7738758Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7739239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7739760Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7739967Z 2025-08-14T21:48:02.7740083Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7740475Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7740829Z return mod(**inputs) 2025-08-14T21:48:02.7741264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7741702Z outputs = self.model( 2025-08-14T21:48:02.7742098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7742510Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7742924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7743350Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7743730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7744133Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7744571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7745002Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7745424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7745860Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7746328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7746809Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7746980Z 2025-08-14T21:48:02.7747065Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7747293Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7747515Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7747734Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7747943Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7748161Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7748410Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7748791Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7749136Z return mod(**inputs) 2025-08-14T21:48:02.7749529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7749933Z outputs = self.model( 2025-08-14T21:48:02.7750323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7750736Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7751143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7751549Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7751923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7752315Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7752723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 336, in forward 2025-08-14T21:48:02.7753145Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7753297Z 2025-08-14T21:48:02.7753381Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7753606Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7753823Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7754048Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7754269Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7754481Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7754701Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7754920Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7755139Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7755351Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7755653Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7756051Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7756413Z return mod(**inputs) 2025-08-14T21:48:02.7756804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7757212Z outputs = self.model( 2025-08-14T21:48:02.7757593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7758008Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7758433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7758853Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7759215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7759603Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7760024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7760461Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7760884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7761342Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7761819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7762328Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7762531Z 2025-08-14T21:48:02.7762641Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7763034Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7763393Z return mod(**inputs) 2025-08-14T21:48:02.7763803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7764242Z outputs = self.model( 2025-08-14T21:48:02.7764662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7765122Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7765537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7765969Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7766358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7766758Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7767194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7767653Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7768107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7768607Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7769585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7770128Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7770309Z 2025-08-14T21:48:02.7770407Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7770639Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7770875Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7771108Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7771329Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7771613Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7771859Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7772080Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7772302Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7772527Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7772743Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7772968Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7773191Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7773410Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7773637Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7773860Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7774134Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7774531Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7774908Z return mod(**inputs) 2025-08-14T21:48:02.7775324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7775741Z outputs = self.model( 2025-08-14T21:48:02.7776144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7776613Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7777028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7777457Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7777840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7778248Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7778676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7779142Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7779572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7780006Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7780467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7780984Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7781185Z 2025-08-14T21:48:02.7781296Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7781689Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7782033Z return mod(**inputs) 2025-08-14T21:48:02.7782424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7782835Z outputs = self.model( 2025-08-14T21:48:02.7783217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward 2025-08-14T21:48:02.7783630Z encoder_outputs = self.encoder( 2025-08-14T21:48:02.7784032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward 2025-08-14T21:48:02.7784446Z layer_outputs = encoder_layer( 2025-08-14T21:48:02.7784810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7785199Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7785613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 321, in forward 2025-08-14T21:48:02.7786043Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:48:02.7786460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7786949Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7787424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7787914Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7788099Z 2025-08-14T21:48:02.7788186Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7788467Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7788695Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7788915Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7789440Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7789672Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7789923Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:48:02.7790328Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:48:02.7790692Z     return mod(**inputs)
2025-08-14T21:48:02.7791098Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward
2025-08-14T21:48:02.7791514Z     outputs = self.model(
2025-08-14T21:48:02.7791915Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1248, in forward
2025-08-14T21:48:02.7792340Z     encoder_outputs = self.encoder(
2025-08-14T21:48:02.7792746Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 861, in forward
2025-08-14T21:48:02.7793170Z     layer_outputs = encoder_layer(
2025-08-14T21:48:02.7793552Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:48:02.7793950Z     return super().__call__(*args, **kwargs)
2025-08-14T21:48:02.7794365Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 336, in forward
2025-08-14T21:48:02.7794800Z     hidden_states = residual + hidden_states
2025-08-14T21:48:02.7794946Z
2025-08-14T21:48:02.7795039Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7795260Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7795486Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7795707Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7795929Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7796143Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7796363Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7796584Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7796802Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7797022Z cudagraph partition due to non gpu ops
2025-08-14T21:48:02.7797273Z cudagraph partition due to non gpu ops.
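The remaining flagged site in the encoder layer is the plain residual connection, hidden_states = residual + hidden_states. A minimal sketch of the pattern with made-up sizes; the Linear stands in for the layer's real sublayer:

    import torch
    import torch.nn as nn

    hidden_size = 512
    hidden_states = torch.randn(2, 16, hidden_size)
    sublayer = nn.Linear(hidden_size, hidden_size)

    # Keep the input, transform it, then add the saved input back.
    residual = hidden_states
    hidden_states = sublayer(hidden_states)
    hidden_states = residual + hidden_states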
Between 2025-08-14T21:48:02.7797659Z and 2025-08-14T21:48:02.7837902Z the same three encoder-side stacks are reported again for the remaining flagged encoder call sites (self_attn reaching sdpa_attention.py:81 scaled_dot_product_attention, self_attn reaching sdpa_attention.py:91 attn_output.transpose(1, 2).contiguous(), and the residual add at modeling_mbart.py:336), each followed by a further run of "cudagraph partition due to non gpu ops" messages.
Found from :
2025-08-14T21:48:02.7838307Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:48:02.7838661Z     return mod(**inputs)
2025-08-14T21:48:02.7839059Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward
2025-08-14T21:48:02.7839468Z     outputs = self.model(
2025-08-14T21:48:02.7839868Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward
2025-08-14T21:48:02.7840297Z     decoder_outputs = self.decoder(
2025-08-14T21:48:02.7840707Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward
2025-08-14T21:48:02.7841136Z     layer_outputs = decoder_layer(
2025-08-14T21:48:02.7841523Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:48:02.7841929Z     return super().__call__(*args, **kwargs)
2025-08-14T21:48:02.7842344Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward
2025-08-14T21:48:02.7842796Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:48:02.7843241Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward
2025-08-14T21:48:02.7843687Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:48:02.7844158Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:48:02.7844674Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:48:02.7844872Z
2025-08-14T21:48:02.7844995Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:48:02.7845390Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7845737Z return mod(**inputs) 2025-08-14T21:48:02.7846201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7846680Z outputs = self.model( 2025-08-14T21:48:02.7847099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7847538Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7847970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7848408Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7848870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7849280Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7849732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7850189Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7850650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7851109Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7851588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7852088Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7852270Z 2025-08-14T21:48:02.7852356Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7852589Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7852818Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7853042Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7853261Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7853478Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7853687Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7853905Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7854127Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7854337Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7854550Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7854767Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7855004Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7855388Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7855734Z return mod(**inputs) 2025-08-14T21:48:02.7856123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7856524Z outputs = self.model( 2025-08-14T21:48:02.7856914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7857323Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7857720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7858136Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7858504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7858889Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7859292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7859727Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7860143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7860557Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7860989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7861502Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7861715Z 2025-08-14T21:48:02.7861829Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7862187Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7862524Z return mod(**inputs) 2025-08-14T21:48:02.7862897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7863314Z outputs = self.model( 2025-08-14T21:48:02.7863701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7864144Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7864547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7864960Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7865346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7865752Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7866171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7866618Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7867065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7867507Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7867983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7868464Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7868646Z 2025-08-14T21:48:02.7868736Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7868974Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7869197Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7869424Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7869649Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7869875Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7870091Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7870314Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7870539Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7870756Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7870981Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7871207Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7871424Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7871647Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7871870Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7872089Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7872345Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7872738Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7873087Z return mod(**inputs) 2025-08-14T21:48:02.7873477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7873889Z outputs = self.model( 2025-08-14T21:48:02.7874281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7874692Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7875125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7875570Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7875949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7876394Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7876807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7877247Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7877680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7878104Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7878605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7879112Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7879303Z 2025-08-14T21:48:02.7879413Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7879793Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7880139Z return mod(**inputs) 2025-08-14T21:48:02.7880527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7880924Z outputs = self.model( 2025-08-14T21:48:02.7881310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7881790Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7882188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7882606Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7882979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7883364Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7883788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7884234Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7884676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7885126Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7885608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7886088Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7886254Z 2025-08-14T21:48:02.7886347Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7886561Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7886810Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7887186Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7887534Z return mod(**inputs) 2025-08-14T21:48:02.7887908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7888310Z outputs = self.model( 2025-08-14T21:48:02.7888697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7889202Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7889631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7890057Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7890444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7890843Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7891318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 424, in forward 2025-08-14T21:48:02.7891759Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7891908Z 2025-08-14T21:48:02.7892006Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7892235Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7892469Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7892702Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7892926Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7893161Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7893393Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7893636Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7893863Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7894088Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7894334Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7894731Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7895091Z return mod(**inputs) 2025-08-14T21:48:02.7895487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7895898Z outputs = self.model( 2025-08-14T21:48:02.7896294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7896719Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7897132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7897560Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7897941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7898335Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7898757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7899229Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7899675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7900130Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7900610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7901134Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7901335Z 2025-08-14T21:48:02.7901451Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7901842Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7902197Z return mod(**inputs) 2025-08-14T21:48:02.7902780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7903200Z outputs = self.model( 2025-08-14T21:48:02.7903600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7904033Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7904429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7904846Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7905229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7905616Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7906026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7906570Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7907045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7907477Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7907953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7908454Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7908625Z 2025-08-14T21:48:02.7908719Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7908937Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7909207Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7909434Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7909650Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7909866Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7910089Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7910308Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7910518Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7910736Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7910953Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7911161Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7911379Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7911600Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7911811Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7912028Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7912277Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7912659Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7913010Z return mod(**inputs) 2025-08-14T21:48:02.7913401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7913816Z outputs = self.model( 2025-08-14T21:48:02.7914200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7914613Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7915022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7915433Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7915800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7916205Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7916617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7917052Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7917493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7917925Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7918400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7918904Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7919106Z 2025-08-14T21:48:02.7919215Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7919598Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7919947Z return mod(**inputs) 2025-08-14T21:48:02.7920330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7920745Z outputs = self.model( 2025-08-14T21:48:02.7921225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7921627Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7922029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7922451Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7922821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7923208Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7923638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7924078Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7924510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7924964Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7925454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7925951Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7926126Z 2025-08-14T21:48:02.7926212Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7926446Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7926673Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7926892Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7927117Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7927347Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7927582Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7927794Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7928015Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7928238Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7928450Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7928668Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7929012Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7929407Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7929778Z return mod(**inputs) 2025-08-14T21:48:02.7930168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7930575Z outputs = self.model( 2025-08-14T21:48:02.7930956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7931366Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7931750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7932130Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7932496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7932881Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7933292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7933725Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7934164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7934588Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7935033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7935501Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7935745Z 2025-08-14T21:48:02.7935866Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7936233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7936550Z return mod(**inputs) 2025-08-14T21:48:02.7936913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7937301Z outputs = self.model( 2025-08-14T21:48:02.7937685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7938094Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7938526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7938936Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7939296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7939686Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7940093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7940512Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7940921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7941350Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7941817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7942291Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7942460Z 2025-08-14T21:48:02.7942545Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7942779Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7943030Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7943400Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7943743Z return mod(**inputs) 2025-08-14T21:48:02.7944122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7944528Z outputs = self.model( 2025-08-14T21:48:02.7944906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7945316Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7945722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7946127Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7946497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7946886Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7947299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 441, in forward 2025-08-14T21:48:02.7947717Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7947871Z 2025-08-14T21:48:02.7947954Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7948183Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7948399Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7948619Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7948842Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7949071Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7949286Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7949518Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7949735Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7950005Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7950243Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7950466Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7950682Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7950900Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7951150Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7951528Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7951876Z return mod(**inputs) 2025-08-14T21:48:02.7952259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7952696Z outputs = self.model( 2025-08-14T21:48:02.7953083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7953502Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7953918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7954338Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7954715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7955111Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7955533Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7955980Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7956442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7956881Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7957355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7957860Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7958062Z 2025-08-14T21:48:02.7958175Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7958562Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7958913Z return mod(**inputs) 2025-08-14T21:48:02.7959299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7959710Z outputs = self.model( 2025-08-14T21:48:02.7960141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7960551Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7960962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7961384Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7961771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7962164Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7962598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7963056Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7963502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7963958Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7964449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7964961Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7965214Z 2025-08-14T21:48:02.7965303Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7965556Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7965790Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7966011Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7966237Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7966463Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7966695Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7966917Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7967142Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7967366Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7967602Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7967826Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7968081Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7968468Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7968940Z return mod(**inputs) 2025-08-14T21:48:02.7969376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7969810Z outputs = self.model( 2025-08-14T21:48:02.7970203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7970635Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7971059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7971489Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7971879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7972277Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7972705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7973164Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7973619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7974066Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7974548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7975063Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7975266Z 2025-08-14T21:48:02.7975381Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7975774Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7976118Z return mod(**inputs) 2025-08-14T21:48:02.7976512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7976934Z outputs = self.model( 2025-08-14T21:48:02.7977329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7977748Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7978163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7978586Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7978959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7979357Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7979781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.7980236Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.7980786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7981236Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7981705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.7982186Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.7982361Z 2025-08-14T21:48:02.7982447Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7982681Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7982913Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7983153Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7983383Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7983606Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7983854Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7984259Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7984602Z return mod(**inputs) 2025-08-14T21:48:02.7984986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7985384Z outputs = self.model( 2025-08-14T21:48:02.7985765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7986176Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7986570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7986978Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7987347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7987730Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7988138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:48:02.7988558Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.7988701Z 2025-08-14T21:48:02.7988791Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7989010Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7989220Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7989434Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7989650Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7989861Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7990077Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7990294Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7990503Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7990722Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.7990965Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7991341Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7991681Z return mod(**inputs) 2025-08-14T21:48:02.7992059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7992461Z outputs = self.model( 2025-08-14T21:48:02.7992837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.7993253Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.7993667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.7994085Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.7994456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.7994900Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.7995342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.7995774Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.7996208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.7996641Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.7997109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.7997621Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.7997822Z 2025-08-14T21:48:02.7997932Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.7998321Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.7998681Z return mod(**inputs) 2025-08-14T21:48:02.7999073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.7999496Z outputs = self.model( 2025-08-14T21:48:02.7999891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8000308Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8000720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8001144Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8001526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8001910Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8002336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.8002988Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.8003436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8003890Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8004377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.8004881Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.8005060Z 2025-08-14T21:48:02.8005149Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8005390Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8005621Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8005845Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8006074Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8006304Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8006538Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8006757Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8006983Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8007211Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8007434Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8007666Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8007927Z cudagraph partition due to non gpu ops. 
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frames:]
2025-08-14T21:48:02.8012687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward
2025-08-14T21:48:02.8013179Z hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:48:02.8013643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward
2025-08-14T21:48:02.8014095Z attn_output, attn_weights = attention_interface(
2025-08-14T21:48:02.8014584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:48:02.8015102Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:48:02.8015434Z cudagraph partition due to non gpu ops.
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frames:]
2025-08-14T21:48:02.8019847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward
2025-08-14T21:48:02.8020309Z hidden_states, cross_attn_weights = self.encoder_attn(
2025-08-14T21:48:02.8020741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward
2025-08-14T21:48:02.8021172Z attn_output, attn_weights = attention_interface(
2025-08-14T21:48:02.8021655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:48:02.8022164Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:48:02.8022432Z cudagraph partition due to non gpu ops  [message logged 17 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8033069Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the first entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.self_attn)]
2025-08-14T21:48:02.8040052Z cudagraph partition due to non gpu ops  [message logged 3 times]
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frame:]
2025-08-14T21:48:02.8045034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 424, in forward
2025-08-14T21:48:02.8045463Z hidden_states = residual + hidden_states
2025-08-14T21:48:02.8045704Z cudagraph partition due to non gpu ops  [message logged 11 times]
Found from : [call stack identical to the cross-attention entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.encoder_attn)]
2025-08-14T21:48:02.8055204Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the cross-attention entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.encoder_attn)]
2025-08-14T21:48:02.8062324Z cudagraph partition due to non gpu ops  [message logged 17 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8072880Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the first entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.self_attn)]
2025-08-14T21:48:02.8079823Z cudagraph partition due to non gpu ops  [message logged 13 times]
Found from : [call stack identical to the cross-attention entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.encoder_attn)]
2025-08-14T21:48:02.8089855Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the cross-attention entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.encoder_attn)]
2025-08-14T21:48:02.8096723Z cudagraph partition due to non gpu ops  [message logged 3 times]
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frame:]
2025-08-14T21:48:02.8101500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 441, in forward
2025-08-14T21:48:02.8101917Z hidden_states = residual + hidden_states
2025-08-14T21:48:02.8102158Z cudagraph partition due to non gpu ops  [message logged 15 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8112613Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the first entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.self_attn)]
2025-08-14T21:48:02.8119766Z cudagraph partition due to non gpu ops  [message logged 13 times]
Found from : [call stack identical to the cross-attention entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.encoder_attn)]
2025-08-14T21:48:02.8129677Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the cross-attention entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.encoder_attn)]
2025-08-14T21:48:02.8136757Z cudagraph partition due to non gpu ops  [message logged 7 times]
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frame:]
2025-08-14T21:48:02.8139461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward
2025-08-14T21:48:02.8139548Z hidden_states = residual + hidden_states
2025-08-14T21:48:02.8139631Z cudagraph partition due to non gpu ops  [message logged 11 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8143666Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the first entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.self_attn)]
2025-08-14T21:48:02.8146654Z cudagraph partition due to non gpu ops  [message logged 13 times]
Found from : [call stack identical to the cross-attention entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.encoder_attn)]
2025-08-14T21:48:02.8150857Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the cross-attention entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.encoder_attn)]
2025-08-14T21:48:02.8154654Z cudagraph partition due to non gpu ops  [message logged 17 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8159068Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the first entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.self_attn)]
2025-08-14T21:48:02.8162101Z cudagraph partition due to non gpu ops  [message logged 3 times]
Found from : [outer frames identical to the first entry above, down through super().__call__(*args, **kwargs); innermost frame:]
2025-08-14T21:48:02.8164337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 424, in forward
2025-08-14T21:48:02.8164433Z hidden_states = residual + hidden_states
2025-08-14T21:48:02.8164519Z cudagraph partition due to non gpu ops  [message logged 11 times]
Found from : [call stack identical to the cross-attention entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.encoder_attn)]
2025-08-14T21:48:02.8168645Z cudagraph partition due to non gpu ops.
Found from : [call stack identical to the cross-attention entry above, ending at attn_output.transpose(1, 2).contiguous() (sdpa_attention.py, line 91, via self.encoder_attn)]
2025-08-14T21:48:02.8171753Z cudagraph partition due to non gpu ops  [message logged 17 times]
Found from : [call stack identical to the first entry above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81, via self.self_attn)]
2025-08-14T21:48:02.8176261Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:48:02.8176478Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8176558Z return mod(**inputs) 2025-08-14T21:48:02.8176835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8176908Z outputs = self.model( 2025-08-14T21:48:02.8177189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8177268Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8177552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8177630Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8177869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8177964Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8178242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.8178355Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.8178628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8178729Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8179051Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.8179168Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.8179172Z 2025-08-14T21:48:02.8179256Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179348Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179452Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179558Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179656Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179739Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179827Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179909Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8179990Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8180079Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8180162Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8180244Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8180364Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8180599Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8180681Z return mod(**inputs) 2025-08-14T21:48:02.8180955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8181034Z outputs = self.model( 2025-08-14T21:48:02.8181316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8181395Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8181666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8181753Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8181993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8182085Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8182358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.8182474Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.8182755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8182858Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8183172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.8183312Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.8183315Z 2025-08-14T21:48:02.8183425Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8183648Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8183719Z return mod(**inputs) 2025-08-14T21:48:02.8183991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8184075Z outputs = self.model( 2025-08-14T21:48:02.8184346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8184439Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8184711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8184790Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8185034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8185122Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8185399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.8185516Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.8185786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8185930Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8186254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.8186371Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.8186383Z 2025-08-14T21:48:02.8186468Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8186553Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8186672Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8186885Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8186956Z return mod(**inputs) 2025-08-14T21:48:02.8187263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8187339Z outputs = self.model( 2025-08-14T21:48:02.8187613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8187706Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8187976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8188060Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8188296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8188381Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8188660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 441, in forward 2025-08-14T21:48:02.8188748Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.8188753Z 2025-08-14T21:48:02.8188845Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8188927Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189010Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189104Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189187Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189268Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189356Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189437Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189516Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189603Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189683Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189771Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189852Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8189934Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8190050Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8190263Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8190338Z return mod(**inputs) 2025-08-14T21:48:02.8190630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8190704Z outputs = self.model( 2025-08-14T21:48:02.8190966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8191052Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8191323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8191409Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8191650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8191742Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8192021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.8192162Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.8192447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8192550Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8192861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.8193006Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.8193009Z 2025-08-14T21:48:02.8193119Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8193368Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8193440Z return mod(**inputs) 2025-08-14T21:48:02.8193707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8193789Z outputs = self.model( 2025-08-14T21:48:02.8194056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8194134Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8194409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8194487Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8194725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8194809Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8195074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 415, in forward 2025-08-14T21:48:02.8195185Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:48:02.8195453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8195554Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8195875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.8195992Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.8195996Z 2025-08-14T21:48:02.8196087Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196172Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196253Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196342Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196434Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196514Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196601Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196681Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196768Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196849Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8196928Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8197015Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8197122Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8197332Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8197411Z return mod(**inputs) 2025-08-14T21:48:02.8197679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8197750Z outputs = self.model( 2025-08-14T21:48:02.8198024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8198103Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8198384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8198521Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8198758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8198852Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8199127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.8199258Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.8199553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8199659Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8199978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:48:02.8200119Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:48:02.8200125Z 2025-08-14T21:48:02.8200255Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8200463Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8200534Z return mod(**inputs) 2025-08-14T21:48:02.8200809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8200881Z outputs = self.model( 2025-08-14T21:48:02.8201146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8201234Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8201495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8201580Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8201813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8201899Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8202167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 432, in forward 2025-08-14T21:48:02.8202280Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:48:02.8202541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 263, in forward 2025-08-14T21:48:02.8202868Z attn_output, attn_weights = attention_interface( 2025-08-14T21:48:02.8203189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:48:02.8203310Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:02.8203316Z 2025-08-14T21:48:02.8203405Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203492Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203584Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203666Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203748Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203843Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8203956Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:02.8204179Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8204251Z return mod(**inputs) 2025-08-14T21:48:02.8204531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1438, in forward 2025-08-14T21:48:02.8204615Z outputs = self.model( 2025-08-14T21:48:02.8204893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1266, in forward 2025-08-14T21:48:02.8205053Z decoder_outputs = self.decoder( 2025-08-14T21:48:02.8205365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1109, in forward 2025-08-14T21:48:02.8205445Z layer_outputs = decoder_layer( 2025-08-14T21:48:02.8205691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:02.8205777Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:02.8206049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 450, in forward 2025-08-14T21:48:02.8206147Z hidden_states = residual + hidden_states 2025-08-14T21:48:02.8206151Z 2025-08-14T21:48:02.8206267Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8206358Z cudagraph partition due to non gpu ops 2025-08-14T21:48:02.8206470Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:02.8206689Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:02.8206773Z return mod(**inputs) 2025-08-14T21:48:02.8207046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mbart/modeling_mbart.py", line 1461, in forward 2025-08-14T21:48:02.8207231Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1)) 2025-08-14T21:48:02.8207235Z 2025-08-14T21:48:16.9704584Z Compilation time (from dynamo_timed): 37.747164116 2025-08-14T21:48:16.9919436Z pass 2025-08-14T21:48:16.9920099Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:48:16.9921116Z TIMING: _recursive_pre_grad_passes:0.10058 _recursive_joint_graph_passes:0.93206 _recursive_post_grad_passes:0.19229 async_compile.wait:1.05284 code_gen:13.15901 inductor_compile:16.47133 backend_compile:31.22014 gc:0.00039 entire_frame_compile:37.74716 total_wall_time:37.74716 2025-08-14T21:48:16.9922975Z STATS: call_* op count: 988 | FakeTensorMode.__torch_dispatch__:70572 | FakeTensor.__torch_dispatch__:8382 | ProxyTorchDispatchMode.__torch_dispatch__:19302 2025-08-14T21:48:16.9923611Z Dynamo produced 1 graphs covering 988 ops with 0 graph breaks (0 unique) 2025-08-14T21:48:23.6186897Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-08-14T21:48:23.6187809Z from pkg_resources import resource_filename 2025-08-14T21:48:24.2461488Z 2025-08-14T21:48:27.0194176Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:48:27.0194493Z loading model: 0it [00:02, ?it/s] 2025-08-14T21:48:27.0194800Z cpu eval MT5ForConditionalGeneration 2025-08-14T21:48:27.6898260Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:48:28.0460031Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:48:28.4518474Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:48:47.9187367Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9187915Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9188289Z return mod(**inputs) 2025-08-14T21:48:47.9188708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9189135Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9189564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9189984Z layer_outputs = layer_module( 2025-08-14T21:48:47.9190354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9191888Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9192334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9192761Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9193203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9193628Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9194292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 421, in forward 2025-08-14T21:48:47.9194840Z position_bias = position_bias + causal_mask 2025-08-14T21:48:47.9195010Z 2025-08-14T21:48:47.9195102Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9200545Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9200819Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9201052Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9201304Z cudagraph partition due to non gpu ops. 
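The UserWarning above (llvmlite's ffi.py doing `from pkg_resources import resource_filename`) recommends moving off pkg_resources before it is removed from Setuptools. A sketch of the standard-library replacement, importlib.resources, which is available on the Python 3.9 environment used here; the json/__init__.py resource is only a stand-in so the snippet runs anywhere:

# Sketch of the stdlib replacement for pkg_resources.resource_filename,
# per the deprecation warning above. The json/__init__.py resource below is
# a stand-in example; substitute your own package and data file.
from importlib.resources import files, as_file

resource = files("json").joinpath("__init__.py")

# as_file() yields a real filesystem path even if the package ships zipped,
# which is the role resource_filename used to play.
with as_file(resource) as path:
    print(path)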
Found from : 2025-08-14T21:48:47.9201727Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9202101Z return mod(**inputs) 2025-08-14T21:48:47.9202486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9203193Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9203618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9204035Z layer_outputs = layer_module( 2025-08-14T21:48:47.9204412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9204813Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9205232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9205654Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9206060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9206478Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9206906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9207370Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9207579Z 2025-08-14T21:48:47.9207667Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9207898Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9208157Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9208558Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9209222Z return mod(**inputs) 2025-08-14T21:48:47.9209622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9210024Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9210436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9210850Z layer_outputs = layer_module( 2025-08-14T21:48:47.9211229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9211618Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9212034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9212449Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9212923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9213408Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9213825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9214277Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9214454Z 2025-08-14T21:48:47.9214568Z cudagraph partition due to non gpu ops. 
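The repeated "Trying to call the empty_gpu_cache for device: cpu" warnings above come from the benchmark's cache-clearing helper being invoked on a CPU run. A small, hypothetical guard of the kind that avoids the warning; only torch.cuda.empty_cache() is a real API used here, the helper name is made up:

# Hypothetical device-guarded cache clearing. On CPU there is nothing to
# clear, which is what the warning above is pointing out.
import torch

def maybe_empty_gpu_cache(device: torch.device) -> None:
    if device.type == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    # Other accelerators (e.g. xpu) would need their own backend call;
    # on cpu this is intentionally a no-op.

maybe_empty_gpu_cache(torch.device("cpu"))   # no-op, no warning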
Found from : 2025-08-14T21:48:47.9214972Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9215338Z return mod(**inputs) 2025-08-14T21:48:47.9215756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9216175Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9216576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9217002Z layer_outputs = layer_module( 2025-08-14T21:48:47.9217412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9217807Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9218352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9218752Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9219140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9219543Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9219938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9220378Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9220552Z 2025-08-14T21:48:47.9220637Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9220863Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9221089Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9221301Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9221524Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9221746Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9221961Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9222185Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9222438Z cudagraph partition due to non gpu ops. 
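Stepping back to the compile summary printed a few entries above ("Compilation time (from dynamo_timed): 37.75 s", with entire_frame_compile accounting for the full wall time): from the caller's side that cost shows up entirely on the first invocation of a torch.compile'd function, while warm calls reuse the generated code. A rough, illustrative way to observe that split (timings and the toy function are made up, not the benchmark's instrumentation):

# Rough sketch of how the compile cost reported by dynamo_timed shows up
# from the caller's side: the first call pays Dynamo tracing + Inductor
# codegen, warm calls do not. Numbers are illustrative only.
import time
import torch

def f(x):
    return torch.nn.functional.gelu(x @ x.T).sum()

compiled = torch.compile(f)
x = torch.randn(256, 256)

t0 = time.perf_counter()
compiled(x)                      # triggers tracing + code generation
t1 = time.perf_counter()
compiled(x)                      # cached, runs the generated code
t2 = time.perf_counter()
print(f"first call (incl. compile): {t1 - t0:.3f}s, warm call: {t2 - t1:.3f}s")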
Found from : 2025-08-14T21:48:47.9222835Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9223184Z return mod(**inputs) 2025-08-14T21:48:47.9223567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9223992Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9224388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9224784Z layer_outputs = layer_module( 2025-08-14T21:48:47.9225147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9225526Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9225914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9226317Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9226713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9227106Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9227500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9228004Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9228198Z 2025-08-14T21:48:47.9228314Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9228699Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9229059Z return mod(**inputs) 2025-08-14T21:48:47.9229433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9229840Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9230254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9230663Z layer_outputs = layer_module( 2025-08-14T21:48:47.9231045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9231423Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9231832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9232238Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9232650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9233059Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9233467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 433, in forward 2025-08-14T21:48:47.9233957Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:48:47.9234197Z 2025-08-14T21:48:47.9234315Z cudagraph partition due to non gpu ops. 
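The MT5 traces above all land in the same few lines of the T5-style attention block: a matmul of queries against transposed keys, an added position bias / causal mask, a float32 softmax cast back to the scores' dtype, a matmul with the values, and a transpose(1, 2).contiguous(). A compact, self-contained sketch of that arithmetic; the shapes and the zero position_bias are made up, and this is not the transformers implementation:

# Compact sketch of the T5/MT5 attention arithmetic named in the frames above:
# scores = q @ k^T, scores += position_bias, softmax in float32, @ v, transpose.
# All shapes and the position_bias values are illustrative.
import torch
import torch.nn as nn

batch, heads, q_len, kv_len, head_dim = 2, 4, 8, 8, 16
query_states = torch.randn(batch, heads, q_len, head_dim)
key_states   = torch.randn(batch, heads, kv_len, head_dim)
value_states = torch.randn(batch, heads, kv_len, head_dim)
position_bias = torch.zeros(batch, heads, q_len, kv_len)   # relative-position bias / mask

scores = torch.matmul(query_states, key_states.transpose(3, 2))   # (b, h, q, kv)
scores = scores + position_bias
attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores)
attn_output = torch.matmul(attn_weights, value_states)            # (b, h, q, d)
attn_output = attn_output.transpose(1, 2).contiguous()            # (b, q, h, d)
print(attn_output.shape)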
Found from : 2025-08-14T21:48:47.9234689Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9235042Z return mod(**inputs) 2025-08-14T21:48:47.9235424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9235824Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9236226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9236637Z layer_outputs = layer_module( 2025-08-14T21:48:47.9237015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9237403Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9237812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9238223Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9238620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9239034Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9239446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9239890Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9240064Z 2025-08-14T21:48:47.9240176Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9240569Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9240939Z return mod(**inputs) 2025-08-14T21:48:47.9241312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9241719Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9242125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9242567Z layer_outputs = layer_module( 2025-08-14T21:48:47.9242971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9243376Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9243783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9244196Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9244592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9245004Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9245435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9245875Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9246058Z 2025-08-14T21:48:47.9246147Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9246385Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9246620Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9246838Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9247059Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9247282Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9247495Z cudagraph partition 
due to non gpu ops 2025-08-14T21:48:47.9247717Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9247969Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9248353Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9248704Z return mod(**inputs) 2025-08-14T21:48:47.9249395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9249807Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9250197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9250609Z layer_outputs = layer_module( 2025-08-14T21:48:47.9250995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9251378Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9251786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9252213Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9252636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9253083Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9253530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9253952Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9254110Z 2025-08-14T21:48:47.9254206Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9254429Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9254652Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9254873Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9255088Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9255312Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9255563Z cudagraph partition due to non gpu ops. 
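Several of the encoder-side traces end in the MT5 feed-forward block instead, at the gated activation `hidden_states = hidden_gelu * hidden_linear` inside DenseReluDense. A minimal sketch of that gated-GELU feed-forward pattern, with made-up layer names and sizes (not the transformers module):

# Minimal sketch of the gated-GELU feed-forward the DenseReluDense frames
# point at: two parallel projections, one passed through GELU, multiplied
# elementwise, then projected back down. Names and sizes are made up.
import torch
import torch.nn as nn

class GatedFF(nn.Module):
    def __init__(self, d_model=32, d_ff=64):
        super().__init__()
        self.wi_0 = nn.Linear(d_model, d_ff, bias=False)   # gate branch
        self.wi_1 = nn.Linear(d_model, d_ff, bias=False)   # linear branch
        self.wo = nn.Linear(d_ff, d_model, bias=False)
        self.act = nn.GELU()

    def forward(self, hidden_states):
        hidden_gelu = self.act(self.wi_0(hidden_states))
        hidden_linear = self.wi_1(hidden_states)
        hidden_states = hidden_gelu * hidden_linear          # the line in the trace
        return self.wo(hidden_states)

print(GatedFF()(torch.randn(2, 8, 32)).shape)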
Found from : 2025-08-14T21:48:47.9255948Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9256289Z return mod(**inputs) 2025-08-14T21:48:47.9256657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9257051Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9257461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9257927Z layer_outputs = layer_module( 2025-08-14T21:48:47.9258369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9258754Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9259178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9259593Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9260002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9260435Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9260843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9261298Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9261490Z 2025-08-14T21:48:47.9261613Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9262000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9262355Z return mod(**inputs) 2025-08-14T21:48:47.9262740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9263142Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9263543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9263952Z layer_outputs = layer_module( 2025-08-14T21:48:47.9264328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9264720Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9265134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9265549Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9265954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9266361Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9266770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 433, in forward 2025-08-14T21:48:47.9267274Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:48:47.9267500Z 2025-08-14T21:48:47.9267588Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9267821Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9268077Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9268467Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9268817Z return mod(**inputs) 2025-08-14T21:48:47.9269213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9269663Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9270058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9270471Z layer_outputs = layer_module( 2025-08-14T21:48:47.9270850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9271248Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9271667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9272082Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9272495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9272975Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9273390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9273838Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9274011Z 2025-08-14T21:48:47.9274131Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9274533Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9274899Z return mod(**inputs) 2025-08-14T21:48:47.9275305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9275723Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9276129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9276548Z layer_outputs = layer_module( 2025-08-14T21:48:47.9276925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9277311Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9277729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9278149Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9278564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9278983Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9279392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9279848Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9280028Z 2025-08-14T21:48:47.9280122Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9280347Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9280572Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9280794Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9281007Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9281224Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9281471Z cudagraph partition 
due to non gpu ops. Found from : 2025-08-14T21:48:47.9281854Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9282207Z return mod(**inputs) 2025-08-14T21:48:47.9282591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9283006Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9283397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9283805Z layer_outputs = layer_module( 2025-08-14T21:48:47.9284176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9284558Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9284967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9285393Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9285816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9286269Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9286712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9287131Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9287330Z 2025-08-14T21:48:47.9287422Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9287662Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9287891Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9288114Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9288331Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9288551Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9288902Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9289297Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9289652Z return mod(**inputs) 2025-08-14T21:48:47.9290065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9290483Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9290883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9291303Z layer_outputs = layer_module( 2025-08-14T21:48:47.9291687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9292079Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9292496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9292914Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9293328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9293740Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9294167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9294625Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9294819Z 2025-08-14T21:48:47.9294930Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9295316Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9295659Z return mod(**inputs) 2025-08-14T21:48:47.9296031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9296425Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9296820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9297222Z layer_outputs = layer_module( 2025-08-14T21:48:47.9297588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9297967Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9298373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9298796Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9299203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9299621Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9300038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 433, in forward 2025-08-14T21:48:47.9300519Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:48:47.9300744Z 2025-08-14T21:48:47.9300834Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9301063Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9301320Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9301707Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9302064Z return mod(**inputs) 2025-08-14T21:48:47.9302515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9303142Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9303541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9303953Z layer_outputs = layer_module( 2025-08-14T21:48:47.9304322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9304700Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9305171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9305576Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9305973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9306376Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9306778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9307214Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9307383Z 2025-08-14T21:48:47.9307502Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9307876Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9308219Z return mod(**inputs) 2025-08-14T21:48:47.9308591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward 2025-08-14T21:48:47.9308986Z encoder_outputs = self.encoder( 2025-08-14T21:48:47.9309381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9309778Z layer_outputs = layer_module( 2025-08-14T21:48:47.9310146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9310523Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9310921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9311324Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9311712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9312116Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9312515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9312946Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9313119Z 2025-08-14T21:48:47.9313203Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9313432Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9313655Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9313866Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9314086Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9314299Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9314541Z cudagraph partition 
due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1750, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward
    hidden_states = self.layer[-1](hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward
    forwarded_states = self.DenseReluDense(forwarded_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward
    hidden_states = hidden_gelu * hidden_linear

cudagraph partition due to non gpu ops  [6 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames through huggingface.py:532, modeling_mt5.py:1750, :1079 and modeling_layers.py:94 identical to the first traceback above]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward
    self_attention_outputs = self.layer[0](
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward
    attention_output = self.SelfAttention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward
    scores = torch.matmul(query_states, key_states.transpose(3, 2))

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.SelfAttention (modeling_mt5.py:475)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 433, in forward
    attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores)

cudagraph partition due to non gpu ops  [2 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.SelfAttention (modeling_mt5.py:475)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward
    attn_output = torch.matmul(attn_weights, value_states)

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.SelfAttention (modeling_mt5.py:475)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops  [2 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames through huggingface.py:532, modeling_mt5.py:1750, :1079 and modeling_layers.py:94 identical to the first traceback above]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward
    self_attention_outputs = self.layer[0](
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 485, in forward
    hidden_states = hidden_states + self.dropout(attention_output[0])

cudagraph partition due to non gpu ops  [4 occurrences, no traceback attached]
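Every statement flagged above sits on one of six lines of modeling_mt5.py: the attention score matmul (line 401), the float32 softmax (433), the attention-weighted matmul (440), the transpose(1, 2).contiguous() (442), the residual add through dropout (485) and the gated-GELU multiply hidden_gelu * hidden_linear (185). A minimal, self-contained sketch of the same pattern under torch.compile follows; the module, its names and sizes are invented for illustration (this is not the benchmark harness), and whether the partition messages are actually emitted depends on the inductor/cudagraph settings the harness uses.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMT5Block(nn.Module):
    # Toy stand-in (invented) for the MT5 self-attention + gated-GELU FFN lines
    # flagged in the tracebacks above; shapes and names are illustrative only.
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model, bias=False)
        self.k = nn.Linear(d_model, d_model, bias=False)
        self.v = nn.Linear(d_model, d_model, bias=False)
        self.o = nn.Linear(d_model, d_model, bias=False)
        self.wi_0 = nn.Linear(d_model, 4 * d_model, bias=False)
        self.wi_1 = nn.Linear(d_model, 4 * d_model, bias=False)
        self.wo = nn.Linear(4 * d_model, d_model, bias=False)
        self.dropout = nn.Dropout(0.1)
        self.n_heads = n_heads
        self.d_head = d_model // n_heads

    def forward(self, hidden_states):
        bsz, seq, d_model = hidden_states.shape

        def split(t):
            # (bsz, seq, d_model) -> (bsz, n_heads, seq, d_head)
            return t.view(bsz, seq, self.n_heads, self.d_head).transpose(1, 2)

        query_states = split(self.q(hidden_states))
        key_states = split(self.k(hidden_states))
        value_states = split(self.v(hidden_states))
        # modeling_mt5.py:401
        scores = torch.matmul(query_states, key_states.transpose(3, 2))
        # modeling_mt5.py:433
        attn_weights = F.softmax(scores.float(), dim=-1).type_as(scores)
        # modeling_mt5.py:440 and :442
        attn_output = torch.matmul(attn_weights, value_states)
        attn_output = attn_output.transpose(1, 2).contiguous().view(bsz, seq, d_model)
        # modeling_mt5.py:485 (residual add through dropout)
        hidden_states = hidden_states + self.dropout(self.o(attn_output))
        # modeling_mt5.py:185 (gated GELU: hidden_gelu * hidden_linear)
        hidden_gelu = F.gelu(self.wi_0(hidden_states), approximate="tanh")
        hidden_linear = self.wi_1(hidden_states)
        return hidden_states + self.wo(hidden_gelu * hidden_linear)

compiled = torch.compile(TinyMT5Block())  # same entry point the benchmark uses
out = compiled(torch.randn(2, 16, 64))    # CPU tensors, matching this CPU-only job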
[The six encoder-side partition messages (the gated-GELU multiply at modeling_mt5.py:185, the self-attention matmul/softmax/matmul/transpose().contiguous() at lines 401, 433, 440 and 442, and the residual dropout add at line 485) repeat with identical tracebacks for each remaining encoder block, interleaved with further "cudagraph partition due to non gpu ops" lines that carry no traceback.]
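Because a handful of messages account for this whole block (and the decoder-side block that follows), exact counts are easiest to get by tallying the flagged statements in a downloaded copy of the raw log. A small sketch, assuming the log has been saved locally; the filename job.log is a placeholder:

# Tally the "cudagraph partition due to non gpu ops" diagnostics in a raw job log.
# "job.log" is a placeholder path: point it at the downloaded log for this step.
from collections import Counter

flagged_statements = [
    "hidden_states = hidden_gelu * hidden_linear",                                   # modeling_mt5.py:185
    "scores = torch.matmul(query_states, key_states.transpose(3, 2))",               # :401
    "attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores)",  # :433
    "attn_output = torch.matmul(attn_weights, value_states)",                        # :440
    "attn_output = attn_output.transpose(1, 2).contiguous()",                        # :442
    "hidden_states = hidden_states + self.dropout(attention_output[0])",             # :485
]

with open("job.log", encoding="utf-8") as f:
    text = f.read()

total = text.count("cudagraph partition due to non gpu ops")
traced = text.count("Found from :")
by_statement = Counter({stmt: text.count(stmt) for stmt in flagged_statements})

print(f"{total} partition messages, {traced} with a traceback attached")
for stmt, n in by_statement.most_common():
    print(f"{n:5d}  {stmt}")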
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward
    cross_attention_outputs = self.layer[1](
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward
    attention_output = self.EncDecAttention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward
    scores = torch.matmul(query_states, key_states.transpose(3, 2))

cudagraph partition due to non gpu ops  [2 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.EncDecAttention (modeling_mt5.py:512)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward
    attn_output = torch.matmul(attn_weights, value_states)

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.EncDecAttention (modeling_mt5.py:512)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops  [6 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames through huggingface.py:532, modeling_mt5.py:1787, :1079 and modeling_layers.py:94 identical to the first decoder-side traceback above]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward
    hidden_states = self.layer[-1](hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward
    forwarded_states = self.DenseReluDense(forwarded_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward
    hidden_states = hidden_gelu * hidden_linear

cudagraph partition due to non gpu ops  [6 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames through huggingface.py:532, modeling_mt5.py:1787, :1079 and modeling_layers.py:94 identical to the first decoder-side traceback above]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward
    self_attention_outputs = self.layer[0](
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward
    attention_output = self.SelfAttention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward
    scores = torch.matmul(query_states, key_states.transpose(3, 2))

cudagraph partition due to non gpu ops  [2 occurrences, no traceback attached]

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.SelfAttention (modeling_mt5.py:475)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward
    attn_output = torch.matmul(attn_weights, value_states)

cudagraph partition due to non gpu ops. Found from :
  [frames identical to the previous traceback down to self.SelfAttention (modeling_mt5.py:475)]
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops  [6 occurrences, no traceback attached]

[The three cross-attention messages (modeling_mt5.py:401, :440 and :442 reached via self.EncDecAttention) then repeat with identical tracebacks for the next decoder block, interleaved with further untraced "cudagraph partition due to non gpu ops" lines.]

cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:48:47.9578120Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9578474Z return mod(**inputs) 2025-08-14T21:48:47.9578850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9579256Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9579642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9580037Z layer_outputs = layer_module( 2025-08-14T21:48:47.9580405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9580786Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9581153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9581539Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9581927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9582339Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9582755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9583162Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9583311Z 2025-08-14T21:48:47.9583404Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9583626Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9583840Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9584057Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9584272Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9584482Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9584727Z cudagraph partition due to non gpu ops. 
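The other frame that keeps coming back, modeling_mt5.py:185 (reached through the DenseReluDense call at line 216, with the residual add at line 217 appearing further down), is the gated-GELU feed-forward in each decoder block. A minimal sketch of that pattern follows; the layer sizes, activation choice and dropout placement are illustrative assumptions, not the actual module:

```python
import torch
from torch import nn

class GatedGeluFeedForward(nn.Module):
    """Sketch of the feed-forward referenced at modeling_mt5.py:185/216."""

    def __init__(self, d_model: int = 512, d_ff: int = 1024, dropout: float = 0.1):
        super().__init__()
        self.wi_0 = nn.Linear(d_model, d_ff, bias=False)  # gating branch
        self.wi_1 = nn.Linear(d_model, d_ff, bias=False)  # linear branch
        self.wo = nn.Linear(d_ff, d_model, bias=False)
        self.dropout = nn.Dropout(dropout)

    def forward(self, hidden_states):
        hidden_gelu = torch.nn.functional.gelu(self.wi_0(hidden_states))
        hidden_linear = self.wi_1(hidden_states)
        # modeling_mt5.py:185 -- elementwise gate
        hidden_states = hidden_gelu * hidden_linear
        return self.wo(self.dropout(hidden_states))
```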
Found from : 2025-08-14T21:48:47.9584940Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9585022Z return mod(**inputs) 2025-08-14T21:48:47.9585277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9585362Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9585613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9585688Z layer_outputs = layer_module( 2025-08-14T21:48:47.9585929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9586015Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9586264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9586361Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9586614Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9586710Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9586958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9587090Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9587094Z 2025-08-14T21:48:47.9587182Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9587263Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9587371Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9587589Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9587658Z return mod(**inputs) 2025-08-14T21:48:47.9587917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9588029Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9588303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9588392Z layer_outputs = layer_module( 2025-08-14T21:48:47.9588630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9588716Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9588991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9589077Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9589351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9589439Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9589685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9589814Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9589818Z 2025-08-14T21:48:47.9589927Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9590143Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9590212Z return mod(**inputs) 2025-08-14T21:48:47.9590463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9590551Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9590802Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9590887Z layer_outputs = layer_module( 2025-08-14T21:48:47.9591111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9591194Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9591439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9591524Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9591770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9591861Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9592105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9592226Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9592230Z 2025-08-14T21:48:47.9592313Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9592392Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9592508Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9592717Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9592786Z return mod(**inputs) 2025-08-14T21:48:47.9593048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9593124Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9593379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9593452Z layer_outputs = layer_module( 2025-08-14T21:48:47.9593682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9593771Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9594019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9594145Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9594420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 485, in forward 2025-08-14T21:48:47.9594559Z hidden_states = hidden_states + self.dropout(attention_output[0]) 2025-08-14T21:48:47.9594562Z 2025-08-14T21:48:47.9594650Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9594730Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9594809Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9594895Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9595001Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9595233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9595311Z return mod(**inputs) 2025-08-14T21:48:47.9595565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9595652Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9595908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9595984Z layer_outputs = layer_module( 2025-08-14T21:48:47.9596219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9596302Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9596549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9596643Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9596891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9596988Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9597238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9597375Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9597379Z 2025-08-14T21:48:47.9597468Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9597549Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9597663Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9597870Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9597937Z return mod(**inputs) 2025-08-14T21:48:47.9598195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9598275Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9598525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9598609Z layer_outputs = layer_module( 2025-08-14T21:48:47.9598843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9598932Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9599181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9599265Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9599520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9599610Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9599861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9599982Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9599986Z 2025-08-14T21:48:47.9600093Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9600375Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9600445Z return mod(**inputs) 2025-08-14T21:48:47.9600694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9600780Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9601031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9601112Z layer_outputs = layer_module( 2025-08-14T21:48:47.9601337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9601437Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9601694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9601781Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9602034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9602131Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9602387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9602506Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9602510Z 2025-08-14T21:48:47.9602594Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9602876Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9602971Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9603057Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9603139Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9603229Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9603341Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9603569Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9603641Z return mod(**inputs) 2025-08-14T21:48:47.9603911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9603999Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9604269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9604347Z layer_outputs = layer_module( 2025-08-14T21:48:47.9604593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9604679Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9604945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9605046Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9605306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9605444Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9605707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9605810Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9605814Z 2025-08-14T21:48:47.9605898Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9605980Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9606069Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9606151Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9606230Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9606317Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9606428Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9606754Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9606832Z return mod(**inputs) 2025-08-14T21:48:47.9607099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9607187Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9607452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9607532Z layer_outputs = layer_module( 2025-08-14T21:48:47.9607777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9607887Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9608160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9608247Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9608510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9608606Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9608937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9609080Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9609086Z 2025-08-14T21:48:47.9609176Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9609256Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9609373Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9609593Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9609664Z return mod(**inputs) 2025-08-14T21:48:47.9609937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9610021Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9610281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9610365Z layer_outputs = layer_module( 2025-08-14T21:48:47.9610602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9610704Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9610959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9611045Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9611308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9611400Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9611666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9611788Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9611791Z 2025-08-14T21:48:47.9611902Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9612125Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9612196Z return mod(**inputs) 2025-08-14T21:48:47.9612455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9612544Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9612803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9612890Z layer_outputs = layer_module( 2025-08-14T21:48:47.9613126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9613279Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9613544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9613631Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9613885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9613983Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9614236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9614382Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9614386Z 2025-08-14T21:48:47.9614472Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9614554Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9614646Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9614729Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9614811Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9614900Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9615009Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9615233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9615310Z return mod(**inputs) 2025-08-14T21:48:47.9615575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9615666Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9615928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9616005Z layer_outputs = layer_module( 2025-08-14T21:48:47.9616252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9616342Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9616606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9616693Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9616951Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9617051Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9617314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9617460Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9617463Z 2025-08-14T21:48:47.9617548Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9617632Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9617751Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9617974Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9618045Z return mod(**inputs) 2025-08-14T21:48:47.9618316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9618398Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9618666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9618744Z layer_outputs = layer_module( 2025-08-14T21:48:47.9618986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9619080Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9619347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9619481Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9619758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9619847Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9620103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9620216Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9620220Z 2025-08-14T21:48:47.9620325Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9620541Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9620628Z return mod(**inputs) 2025-08-14T21:48:47.9620887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9620965Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9621218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9621297Z layer_outputs = layer_module( 2025-08-14T21:48:47.9621525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9621605Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9621859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9621942Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9622197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9622284Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9622530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9622655Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9622659Z 2025-08-14T21:48:47.9622743Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9622831Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9622936Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9623153Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9623227Z return mod(**inputs) 2025-08-14T21:48:47.9623488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9623565Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9623824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9623897Z layer_outputs = layer_module( 2025-08-14T21:48:47.9624134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9624219Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9624463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9624553Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9624806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 524, in forward 2025-08-14T21:48:47.9624940Z layer_output = hidden_states + self.dropout(attention_output[0]) 2025-08-14T21:48:47.9624943Z 2025-08-14T21:48:47.9625034Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9625115Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9625204Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9625284Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9625389Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9625669Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9625740Z return mod(**inputs) 2025-08-14T21:48:47.9626002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9626089Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9626348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9626430Z layer_outputs = layer_module( 2025-08-14T21:48:47.9626660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9626759Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9627021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9627118Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9627370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9627502Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9627757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9627855Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9627859Z 2025-08-14T21:48:47.9627940Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628022Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628108Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628188Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628268Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628354Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9628460Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9628680Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9628747Z return mod(**inputs) 2025-08-14T21:48:47.9629007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9629092Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9629379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9629452Z layer_outputs = layer_module( 2025-08-14T21:48:47.9629686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9629770Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9630024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9630112Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9630360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9630454Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9630709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9630843Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9630847Z 2025-08-14T21:48:47.9630926Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9631005Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9631120Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9631340Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9631410Z return mod(**inputs) 2025-08-14T21:48:47.9631718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9631843Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9632089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9632159Z layer_outputs = layer_module( 2025-08-14T21:48:47.9632378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9632478Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9632734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9632835Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9633102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9633189Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9633449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9633563Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9633567Z 2025-08-14T21:48:47.9633675Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9633889Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9633958Z return mod(**inputs) 2025-08-14T21:48:47.9634216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9634293Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9634544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9634622Z layer_outputs = layer_module( 2025-08-14T21:48:47.9634835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9634917Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9635158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9635238Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9635477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9635558Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9635788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9635902Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9635906Z 2025-08-14T21:48:47.9635983Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636058Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636139Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636216Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636303Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636380Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9636486Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9636698Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9636766Z return mod(**inputs) 2025-08-14T21:48:47.9637013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9637096Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9637348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9637430Z layer_outputs = layer_module( 2025-08-14T21:48:47.9637657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9637791Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9638049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9638134Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9638381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9638476Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9638725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9638880Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9638884Z 2025-08-14T21:48:47.9638968Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9639046Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9639165Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9639388Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9639460Z return mod(**inputs) 2025-08-14T21:48:47.9639702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9639774Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9640020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9640091Z layer_outputs = layer_module( 2025-08-14T21:48:47.9640311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9640397Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9640633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9640724Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9640961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9641046Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9641290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9641398Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9641402Z 2025-08-14T21:48:47.9641509Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9641711Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9641777Z return mod(**inputs) 2025-08-14T21:48:47.9642023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9642101Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9642344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9642425Z layer_outputs = layer_module( 2025-08-14T21:48:47.9642643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9642729Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9642971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 583, in forward 2025-08-14T21:48:47.9643056Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:48:47.9643313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 512, in forward 2025-08-14T21:48:47.9643399Z attention_output = self.EncDecAttention( 2025-08-14T21:48:47.9643646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward 2025-08-14T21:48:47.9643824Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:48:47.9643829Z 2025-08-14T21:48:47.9643913Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644002Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644082Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644161Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644248Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644327Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9644435Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9644671Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9644742Z return mod(**inputs) 2025-08-14T21:48:47.9645001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9645081Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9645335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9645418Z layer_outputs = layer_module( 2025-08-14T21:48:47.9645648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9645731Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9645989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9646086Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9646349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 216, in forward 2025-08-14T21:48:47.9646477Z forwarded_states = self.DenseReluDense(forwarded_states) 2025-08-14T21:48:47.9646734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 185, in forward 2025-08-14T21:48:47.9646840Z hidden_states = hidden_gelu * hidden_linear 2025-08-14T21:48:47.9646844Z 2025-08-14T21:48:47.9646928Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9647019Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9647129Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9647343Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9647419Z return mod(**inputs) 2025-08-14T21:48:47.9647675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9647757Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9648022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9648101Z layer_outputs = layer_module( 2025-08-14T21:48:47.9648347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9648436Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9648688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 609, in forward 2025-08-14T21:48:47.9648866Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:48:47.9649139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 217, in forward 2025-08-14T21:48:47.9649276Z hidden_states = hidden_states + self.dropout(forwarded_states) 2025-08-14T21:48:47.9649290Z 2025-08-14T21:48:47.9649375Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9649462Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9649553Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9649635Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9649745Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:48:47.9650049Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9650122Z return mod(**inputs) 2025-08-14T21:48:47.9650385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9650471Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9650732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9650815Z layer_outputs = layer_module( 2025-08-14T21:48:47.9651072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9651160Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9651424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9651520Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9651774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9651873Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9652127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 401, in forward 2025-08-14T21:48:47.9652272Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:48:47.9652276Z 2025-08-14T21:48:47.9652359Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9652442Z cudagraph partition due to non gpu ops 2025-08-14T21:48:47.9652564Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:48:47.9652780Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:48:47.9652861Z return mod(**inputs) 2025-08-14T21:48:47.9653120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward 2025-08-14T21:48:47.9653203Z decoder_outputs = self.decoder( 2025-08-14T21:48:47.9653472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward 2025-08-14T21:48:47.9653548Z layer_outputs = layer_module( 2025-08-14T21:48:47.9653783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:48:47.9653875Z return super().__call__(*args, **kwargs) 2025-08-14T21:48:47.9654129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward 2025-08-14T21:48:47.9654225Z self_attention_outputs = self.layer[0]( 2025-08-14T21:48:47.9654482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward 2025-08-14T21:48:47.9654571Z attention_output = self.SelfAttention( 2025-08-14T21:48:47.9654835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 440, in forward 2025-08-14T21:48:47.9654954Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:48:47.9654958Z 2025-08-14T21:48:47.9655078Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1787, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1079, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 559, in forward
    self_attention_outputs = self.layer[0](
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 475, in forward
    attention_output = self.SelfAttention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 442, in forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:48:47.9657618Z cudagraph partition due to non gpu ops (repeated 6 times)
[The 19 "Found from" stacks below were printed in full in the raw log; they all share the outer frames shown above (huggingface.py:532 forward_pass -> modeling_mt5.py:1787 -> modeling_mt5.py:1079 -> modeling_layers.py:94), so only the distinguishing inner frames are listed.]
2025-08-14T21:48:47.9658154Z cudagraph partition due to non gpu ops. Found from : [via modeling_mt5.py:583 self.layer[1] and :512 self.EncDecAttention; innermost :401: scores = torch.matmul(query_states, key_states.transpose(3, 2))]
2025-08-14T21:48:47.9660677Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9660881Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :440: attn_output = torch.matmul(attn_weights, value_states)]
2025-08-14T21:48:47.9663373Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :442: attn_output = attn_output.transpose(1, 2).contiguous()]
2025-08-14T21:48:47.9665820Z cudagraph partition due to non gpu ops (repeated 6 times)
2025-08-14T21:48:47.9666359Z cudagraph partition due to non gpu ops. Found from : [via :609 self.layer[-1] and :216 self.DenseReluDense; innermost :185: hidden_states = hidden_gelu * hidden_linear]
2025-08-14T21:48:47.9669398Z cudagraph partition due to non gpu ops (repeated 6 times)
2025-08-14T21:48:47.9669948Z cudagraph partition due to non gpu ops. Found from : [via :559 self.layer[0] and :475 self.SelfAttention; innermost :401: scores = torch.matmul(query_states, key_states.transpose(3, 2))]
2025-08-14T21:48:47.9672482Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9672667Z cudagraph partition due to non gpu ops. Found from : [via :559 and :475; innermost :440: attn_output = torch.matmul(attn_weights, value_states)]
2025-08-14T21:48:47.9675140Z cudagraph partition due to non gpu ops. Found from : [via :559 and :475; innermost :442: attn_output = attn_output.transpose(1, 2).contiguous()]
2025-08-14T21:48:47.9677621Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9677815Z cudagraph partition due to non gpu ops. Found from : [via :559 self.layer[0]; innermost :485: hidden_states = hidden_states + self.dropout(attention_output[0])]
2025-08-14T21:48:47.9679803Z cudagraph partition due to non gpu ops (repeated 4 times)
2025-08-14T21:48:47.9680163Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :401: scores = torch.matmul(query_states, key_states.transpose(3, 2))]
2025-08-14T21:48:47.9682536Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9682728Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :440: attn_output = torch.matmul(attn_weights, value_states)]
2025-08-14T21:48:47.9685173Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :442: attn_output = attn_output.transpose(1, 2).contiguous()]
2025-08-14T21:48:47.9687634Z cudagraph partition due to non gpu ops (repeated 6 times)
2025-08-14T21:48:47.9688160Z cudagraph partition due to non gpu ops. Found from : [via :609 and :216; innermost :185: hidden_states = hidden_gelu * hidden_linear]
2025-08-14T21:48:47.9690770Z cudagraph partition due to non gpu ops (repeated 6 times)
2025-08-14T21:48:47.9691298Z cudagraph partition due to non gpu ops. Found from : [via :559 and :475; innermost :401: scores = torch.matmul(query_states, key_states.transpose(3, 2))]
2025-08-14T21:48:47.9693774Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9693970Z cudagraph partition due to non gpu ops. Found from : [via :559 and :475; innermost :440: attn_output = torch.matmul(attn_weights, value_states)]
2025-08-14T21:48:47.9696413Z cudagraph partition due to non gpu ops. Found from : [via :559 and :475; innermost :442: attn_output = attn_output.transpose(1, 2).contiguous()]
2025-08-14T21:48:47.9698799Z cudagraph partition due to non gpu ops (repeated 6 times)
2025-08-14T21:48:47.9699324Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :401: scores = torch.matmul(query_states, key_states.transpose(3, 2))]
2025-08-14T21:48:47.9701820Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9702016Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :440: attn_output = torch.matmul(attn_weights, value_states)]
2025-08-14T21:48:47.9704671Z cudagraph partition due to non gpu ops. Found from : [via :583 and :512; innermost :442: attn_output = attn_output.transpose(1, 2).contiguous()]
2025-08-14T21:48:47.9707075Z cudagraph partition due to non gpu ops (repeated 2 times)
2025-08-14T21:48:47.9707254Z cudagraph partition due to non gpu ops. Found from : [via :583 self.layer[1]; innermost :524: layer_output = hidden_states + self.dropout(attention_output[0])]
2025-08-14T21:48:47.9709348Z cudagraph partition due to non gpu ops (repeated 4 times)
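A note on the diagnostic above: the "cudagraph partition due to non gpu ops" messages appear to come from Inductor's cudagraph partitioner, which splits a compiled graph wherever it hits work that cannot be captured in a CUDA graph; the "Found from" stacks point at the model code that produced the offending op. This shard runs its models on CPU (see the "cpu eval" line and the empty_gpu_cache-on-cpu warnings further down), so essentially every op counts as a non-GPU op and the run still reports "pass", which suggests these messages are informational noise rather than a failure. The snippet below is only an illustration of the kind of mixed-device graph that would force such a partition on a GPU run; the module, tensor names, and sizes are invented for this sketch:

import torch

class MixedDeviceBlock(torch.nn.Module):
    # Toy module whose forward mixes GPU work with a CPU-resident tensor.
    def __init__(self):
        super().__init__()
        self.proj = torch.nn.Linear(64, 64)

    def forward(self, x, cpu_bias):
        y = self.proj(x)               # GPU matmul: capturable inside a CUDA graph
        y = y + cpu_bias.to(x.device)  # CPU tensor plus host-to-device copy: not capturable
        return torch.relu(y)           # GPU op again: would land in a second partition

if torch.cuda.is_available():
    model = MixedDeviceBlock().cuda()
    compiled = torch.compile(model, mode="reduce-overhead")   # reduce-overhead turns on CUDA graphs
    out = compiled(torch.randn(8, 64, device="cuda"), torch.randn(64))  # second input stays on CPU

In this sketch the host-to-device copy sits between two GPU ops, so a CUDA-graph-enabled compile would have to split the captured graph around it, much like the partitions reported above.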
2025-08-14T21:48:47.9709689Z cudagraph partition due to non gpu ops. Found from : [same outer frames as above; via modeling_mt5.py:609 self.layer[-1] and :216 self.DenseReluDense; innermost :185: hidden_states = hidden_gelu * hidden_linear]
2025-08-14T21:48:47.9712091Z cudagraph partition due to non gpu ops (repeated 4 times)
2025-08-14T21:48:47.9712453Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mt5/modeling_mt5.py", line 1823, in forward
    loss = loss_fct(lm_logits.view(-1, lm_logits.size(-1)), labels.view(-1))
2025-08-14T21:48:59.7811781Z Compilation time (from dynamo_timed): 29.730204053
2025-08-14T21:48:59.7953852Z pass
2025-08-14T21:48:59.7955386Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:48:59.7956237Z TIMING: _recursive_pre_grad_passes:0.09439 _recursive_joint_graph_passes:0.86812 _recursive_post_grad_passes:0.32672 async_compile.wait:0.85326 code_gen:11.24962 inductor_compile:14.47786 backend_compile:24.98193 gc:0.00046 entire_frame_compile:29.7302 total_wall_time:29.7302
2025-08-14T21:48:59.7957341Z STATS: call_* op count: 1207 | FakeTensorMode.__torch_dispatch__:56581 | FakeTensor.__torch_dispatch__:8068 | ProxyTorchDispatchMode.__torch_dispatch__:15790
2025-08-14T21:48:59.7957898Z Dynamo produced 1 graphs covering 1207 ops with 0 graph breaks (0 unique)
2025-08-14T21:49:05.8709872Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:49:05.8710959Z   from pkg_resources import resource_filename
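About the "WARNING:common:Trying to call the empty_gpu_cache for device: cpu" line above: the harness asks to free accelerator cache memory between measurements, and on a cpu device there is nothing to release, so it warns and moves on. A rough sketch of the device guard that behaviour implies; empty_device_cache below is a hypothetical helper of mine, not the harness's actual function:

import torch

def empty_device_cache(device: str) -> None:
    # Only CUDA and XPU expose a cache-release API; on CPU there is nothing to flush,
    # which is why the benchmark logs the warning instead of freeing anything.
    if device == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()

empty_device_cache("cpu")  # no-op on this job's CPU-only runner

On CUDA the call returns cached allocator blocks to the driver; on this job's cpu device the equivalent call simply has nothing to do.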
2025-08-14T21:49:06.7037639Z loading model: 0it [00:00, ?it/s]If you want to use `MegatronBertForCausalLM` as a standalone, add `is_decoder=True.`
2025-08-14T21:49:06.7043940Z WARNING:transformers.models.megatron_bert.modeling_megatron_bert:If you want to use `MegatronBertForCausalLM` as a standalone, add `is_decoder=True.`
2025-08-14T21:49:10.4311301Z loading model: 0it [00:03, ?it/s]
2025-08-14T21:49:10.4311630Z cpu eval MegatronBertForCausalLM
2025-08-14T21:49:12.2448606Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:49:12.7014079Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:49:13.1637048Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
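The transformers warning emitted while loading MegatronBertForCausalLM is informational: when the causal-LM head is used standalone, as in this benchmark, transformers suggests marking the config as a decoder so a causal self-attention mask is applied. A hedged sketch of the usual fix (the checkpoint path is a placeholder, not taken from this log):

from transformers import AutoConfig, MegatronBertForCausalLM

checkpoint = "path/to/megatron-bert-checkpoint"  # placeholder, not a real repo id
config = AutoConfig.from_pretrained(checkpoint)
config.is_decoder = True  # mark the model as a decoder; silences the warning above
model = MegatronBertForCausalLM.from_pretrained(checkpoint, config=config)

The benchmark leaves the default config in place, which is why the same message shows up twice in the log above (once interleaved with the loading progress output, once through the transformers logger).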
2025-08-14T21:49:34.6593964Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:49:34.6594641Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:49:34.6595199Z return mod(**inputs) 2025-08-14T21:49:34.6595966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1064, in forward 2025-08-14T21:49:34.6596729Z outputs = self.bert( 2025-08-14T21:49:34.6597430Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:49:34.6598184Z encoder_outputs = self.encoder( 2025-08-14T21:49:34.6598924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:49:34.6599652Z layer_outputs = layer_module( 2025-08-14T21:49:34.6600268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:49:34.6600916Z return super().__call__(*args, **kwargs) 2025-08-14T21:49:34.6601691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:49:34.6602457Z self_attention_outputs = self.attention( 2025-08-14T21:49:34.6603508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:49:34.6604393Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:49:34.6605259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:49:34.6606003Z return residual + hidden_states 2025-08-14T21:49:34.6606228Z 2025-08-14T21:49:34.6606362Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6606738Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6607085Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6607423Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6607771Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6608119Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6608464Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6609081Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6609487Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6609839Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6610191Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6610535Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6610876Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6611214Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6611577Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6611914Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6612252Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6612593Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6613001Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6613346Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6613684Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6614025Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6614368Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6614714Z cudagraph partition due to non gpu ops 
2025-08-14T21:49:34.6615063Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6615413Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6615760Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6616092Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6616522Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:49:34.6617195Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:49:34.6617782Z return mod(**inputs) 2025-08-14T21:49:34.6618509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1064, in forward 2025-08-14T21:49:34.6619254Z outputs = self.bert( 2025-08-14T21:49:34.6619953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:49:34.6620708Z encoder_outputs = self.encoder( 2025-08-14T21:49:34.6621470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:49:34.6622275Z layer_outputs = layer_module( 2025-08-14T21:49:34.6622955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:49:34.6623661Z return super().__call__(*args, **kwargs) 2025-08-14T21:49:34.6624497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:49:34.6625288Z self_attention_outputs = self.attention( 2025-08-14T21:49:34.6626068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:49:34.6626946Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:49:34.6627806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:49:34.6628579Z return residual + hidden_states 2025-08-14T21:49:34.6628807Z 2025-08-14T21:49:34.6628938Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6629305Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6629644Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6629991Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6630411Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6630752Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6631105Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6631449Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6631809Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6632128Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6632553Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6632935Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6633269Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6633622Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6633958Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6634295Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6634623Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6634956Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6635295Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6635644Z cudagraph partition due to non gpu ops 
2025-08-14T21:49:34.6635986Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6636362Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6636695Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6637040Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6637402Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6637747Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6638098Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6638444Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6638846Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:49:34.6639461Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:49:34.6640021Z return mod(**inputs) 2025-08-14T21:49:34.6640755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1064, in forward 2025-08-14T21:49:34.6641485Z outputs = self.bert( 2025-08-14T21:49:34.6642194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:49:34.6642960Z encoder_outputs = self.encoder( 2025-08-14T21:49:34.6643713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:49:34.6644481Z layer_outputs = layer_module( 2025-08-14T21:49:34.6645107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:49:34.6645750Z return super().__call__(*args, **kwargs) 2025-08-14T21:49:34.6646504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:49:34.6647264Z self_attention_outputs = self.attention( 2025-08-14T21:49:34.6648025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:49:34.6649001Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:49:34.6649887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:49:34.6650649Z return residual + hidden_states 2025-08-14T21:49:34.6650868Z 2025-08-14T21:49:34.6651007Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6651352Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6651687Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6652034Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6652380Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6652721Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6653070Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6653420Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6653732Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6654091Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6654444Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6654781Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6655117Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6655474Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6655926Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6656333Z cudagraph partition due to non gpu ops 
2025-08-14T21:49:34.6656707Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6657091Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6657456Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6657835Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6658207Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6658570Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6658953Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6659325Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6659694Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6660094Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6660466Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6660812Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6661215Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:49:34.6661855Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:49:34.6662445Z return mod(**inputs) 2025-08-14T21:49:34.6663145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1064, in forward 2025-08-14T21:49:34.6663900Z outputs = self.bert( 2025-08-14T21:49:34.6664632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:49:34.6665392Z encoder_outputs = self.encoder( 2025-08-14T21:49:34.6666132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:49:34.6666902Z layer_outputs = layer_module( 2025-08-14T21:49:34.6667563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:49:34.6668258Z return super().__call__(*args, **kwargs) 2025-08-14T21:49:34.6669074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:49:34.6669924Z self_attention_outputs = self.attention( 2025-08-14T21:49:34.6670769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:49:34.6671663Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:49:34.6672507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:49:34.6673289Z return residual + hidden_states 2025-08-14T21:49:34.6673498Z 2025-08-14T21:49:34.6673636Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6673984Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6674343Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6674714Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6675045Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6675389Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6675800Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6676151Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6676483Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6676835Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6677191Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6677532Z cudagraph partition due to non gpu ops 
2025-08-14T21:49:34.6677880Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6678215Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6678559Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6678904Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6679259Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6679677Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6680060Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6680406Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6680748Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6681091Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6681520Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6681866Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6682216Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6682567Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6682913Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6683261Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6683742Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:49:34.6684399Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:49:34.6684990Z return mod(**inputs) 2025-08-14T21:49:34.6685720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1064, in forward 2025-08-14T21:49:34.6686481Z outputs = self.bert( 2025-08-14T21:49:34.6687188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:49:34.6687945Z encoder_outputs = self.encoder( 2025-08-14T21:49:34.6688697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:49:34.6689528Z layer_outputs = layer_module( 2025-08-14T21:49:34.6690161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:49:34.6690813Z return super().__call__(*args, **kwargs) 2025-08-14T21:49:34.6691647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:49:34.6692454Z self_attention_outputs = self.attention( 2025-08-14T21:49:34.6693240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:49:34.6694086Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:49:34.6694920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:49:34.6695683Z return residual + hidden_states 2025-08-14T21:49:34.6695906Z 2025-08-14T21:49:34.6696032Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6696391Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6696737Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6697085Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6697440Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6697774Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6698117Z cudagraph partition due to non gpu ops 2025-08-14T21:49:34.6698461Z cudagraph partition due to non gpu ops 
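The traceback above attributes the partition message to the residual addition at modeling_megatron_bert.py line 331 (return residual + hidden_states). As an illustration only, and not the benchmark's or transformers' code, the following is a minimal sketch of that attention-output pattern under torch.compile; the module name, hidden size, and input shapes are made up:

    # Minimal sketch (assumes PyTorch 2.x with torch.compile available).
    # It mirrors the pattern the traceback names: a Linear + LayerNorm block
    # whose forward ends in `residual + hidden_states`. Names/sizes are made up.
    import torch
    import torch.nn as nn

    class ToyAttentionOutput(nn.Module):
        def __init__(self, hidden: int = 64):
            super().__init__()
            self.dense = nn.Linear(hidden, hidden)
            self.norm = nn.LayerNorm(hidden)

        def forward(self, attn_out: torch.Tensor, residual: torch.Tensor) -> torch.Tensor:
            hidden_states = self.norm(self.dense(attn_out))
            return residual + hidden_states  # the add the log traces the partition message to

    if __name__ == "__main__":
        compiled = torch.compile(ToyAttentionOutput())  # Inductor is the default backend
        x = torch.randn(2, 8, 64)
        print(compiled(x, x).shape)  # torch.Size([2, 8, 64])

On a CPU-only shard like this one, the ops in the compiled region are CPU ops, which appears to be the situation the "non gpu ops" message is reporting.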
2025-08-14T21:49:34.6836978Z cudagraph partition due to non gpu ops
2025-08-14T21:49:34.6840386Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:49:34.6841085Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:49:34.6841710Z     return mod(**inputs)
2025-08-14T21:49:34.6842498Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1086, in forward
2025-08-14T21:49:34.6843345Z     lm_loss = self.loss_function(
2025-08-14T21:49:34.6844063Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss
2025-08-14T21:49:34.6844996Z     loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs)
2025-08-14T21:49:34.6845924Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy
2025-08-14T21:49:34.6846910Z     loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction)
2025-08-14T21:49:34.6847398Z 
2025-08-14T21:49:46.8951444Z Compilation time (from dynamo_timed): 32.186332031
2025-08-14T21:49:46.8981111Z pass
2025-08-14T21:49:46.8981516Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:49:46.8982826Z TIMING: _recursive_pre_grad_passes:0.07493 _recursive_joint_graph_passes:0.94714 _recursive_post_grad_passes:0.14508 async_compile.wait:0.98783 code_gen:10.46113 inductor_compile:13.00398 backend_compile:25.26307 gc:0.00025 entire_frame_compile:32.18633 total_wall_time:32.18633
2025-08-14T21:49:46.8986867Z STATS: call_* op count: 725 | FakeTensorMode.__torch_dispatch__:56996 | FakeTensor.__torch_dispatch__:6075 | ProxyTorchDispatchMode.__torch_dispatch__:16350
2025-08-14T21:49:46.8987544Z Dynamo produced 1 graphs covering 725 ops with 0 graph breaks (0 unique)
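The TIMING and STATS records above pack several name:value pairs onto a single line. Below is a small, hypothetical post-processing helper (not part of the benchmark harness) that pulls a TIMING record apart, assuming the exact name:seconds format shown in this log:

    # Hypothetical helper for this log format; assumes "TIMING: key:value key:value ..." records.
    import re

    TIMING_LINE = (
        "TIMING: _recursive_pre_grad_passes:0.07493 _recursive_joint_graph_passes:0.94714 "
        "_recursive_post_grad_passes:0.14508 async_compile.wait:0.98783 code_gen:10.46113 "
        "inductor_compile:13.00398 backend_compile:25.26307 gc:0.00025 "
        "entire_frame_compile:32.18633 total_wall_time:32.18633"
    )

    def parse_timing(line: str) -> dict[str, float]:
        """Parse a 'TIMING: name:seconds ...' record into {name: seconds}."""
        payload = line.split("TIMING:", 1)[1]
        return {m.group(1): float(m.group(2))
                for m in re.finditer(r"(\S+):([0-9.]+)", payload)}

    if __name__ == "__main__":
        timings = parse_timing(TIMING_LINE)
        print(f"inductor_compile took {timings['inductor_compile']:.2f}s "
              f"of {timings['total_wall_time']:.2f}s total")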
2025-08-14T21:49:52.7248152Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:49:52.7249612Z from pkg_resources import resource_filename
2025-08-14T21:49:53.4555504Z 
2025-08-14T21:49:56.7239987Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:49:56.7240431Z loading model: 0it [00:03, ?it/s]
2025-08-14T21:49:56.7240690Z cpu eval MegatronBertForQuestionAnswering
2025-08-14T21:49:58.4641277Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:49:58.9084822Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:49:59.2400880Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:50:20.3067545Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:50:20.3068106Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:50:20.3068484Z     return mod(**inputs)
2025-08-14T21:50:20.3068966Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1611, in forward
2025-08-14T21:50:20.3069507Z     logits = self.qa_outputs(sequence_output)
2025-08-14T21:50:20.3069670Z 
2025-08-14T21:50:20.3069772Z cudagraph partition due to non gpu ops
[... "cudagraph partition due to non gpu ops" repeated many more times ...]
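The 21:50:20 traceback above points at the question-answering head (self.qa_outputs). In BERT-style QA models this is typically a single Linear layer producing start and end logits per token; the sketch below shows that pattern with made-up sizes and is not the transformers implementation:

    # Sketch of the qa_outputs pattern the traceback names (hidden states -> start/end logits).
    # Hidden size, sequence length, and batch size are illustration values only.
    import torch
    import torch.nn as nn

    hidden_size, seq_len, batch = 64, 16, 2
    qa_outputs = nn.Linear(hidden_size, 2)                 # one start and one end logit per token

    sequence_output = torch.randn(batch, seq_len, hidden_size)
    logits = qa_outputs(sequence_output)                   # (batch, seq_len, 2)
    start_logits, end_logits = logits.split(1, dim=-1)     # split the two heads
    start_logits = start_logits.squeeze(-1)                # (batch, seq_len)
    end_logits = end_logits.squeeze(-1)
    print(start_logits.shape, end_logits.shape)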
2025-08-14T21:50:20.3085199Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:50:20.3085685Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:50:20.3086076Z     return mod(**inputs)
2025-08-14T21:50:20.3086581Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward
2025-08-14T21:50:20.3087065Z     outputs = self.bert(
2025-08-14T21:50:20.3087512Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward
2025-08-14T21:50:20.3087995Z     encoder_outputs = self.encoder(
2025-08-14T21:50:20.3088466Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward
2025-08-14T21:50:20.3089196Z     layer_outputs = layer_module(
2025-08-14T21:50:20.3089599Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:50:20.3090016Z     return super().__call__(*args, **kwargs)
2025-08-14T21:50:20.3090537Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward
2025-08-14T21:50:20.3091000Z     self_attention_outputs = self.attention(
2025-08-14T21:50:20.3091479Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward
2025-08-14T21:50:20.3092013Z     attention_output = self.output(self_outputs[0], hidden_states)
2025-08-14T21:50:20.3092539Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward
2025-08-14T21:50:20.3093004Z     return residual + hidden_states
2025-08-14T21:50:20.3093156Z 
2025-08-14T21:50:20.3093253Z cudagraph partition due to non gpu ops
[... the message and the identical traceback above repeat many more times ...]
due to non gpu ops 2025-08-14T21:50:20.3098334Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3098624Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3098882Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3099186Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3099414Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3099635Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3099930Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3100335Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3100742Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3101110Z return mod(**inputs) 2025-08-14T21:50:20.3101676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3102205Z outputs = self.bert( 2025-08-14T21:50:20.3103059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3103751Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3104211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3104730Z layer_outputs = layer_module( 2025-08-14T21:50:20.3105154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3105615Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3106105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3106581Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3107048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3107556Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3108084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3108546Z return residual + hidden_states 2025-08-14T21:50:20.3108685Z 2025-08-14T21:50:20.3108780Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3109001Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3109231Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3109467Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3109748Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3109988Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3110214Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3110528Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3110747Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3110965Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3111193Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3111410Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3111630Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3111926Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3112141Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3112364Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3112633Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3112899Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3113123Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3113346Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3113568Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3113783Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3114013Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3114235Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3114453Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3114797Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3115063Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3115282Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3115544Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3115951Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3116315Z return mod(**inputs) 2025-08-14T21:50:20.3116850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3117346Z outputs = self.bert( 2025-08-14T21:50:20.3117880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3118402Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3118864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3119439Z layer_outputs = layer_module( 2025-08-14T21:50:20.3119851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3120292Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3120862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3121334Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3121797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3122396Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3123015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3123495Z return residual + hidden_states 2025-08-14T21:50:20.3123686Z 2025-08-14T21:50:20.3123777Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3124007Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3124241Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3124545Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3124759Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3124982Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3125353Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3125573Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3125795Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3126048Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3126322Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3126547Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3126782Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3127078Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3127307Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3127566Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3127828Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3128044Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3128286Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3128590Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3128967Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3129199Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3129458Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3129771Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3130038Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3130271Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3130518Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3130800Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3131059Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3131565Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3131917Z return mod(**inputs) 2025-08-14T21:50:20.3132364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3132912Z outputs = self.bert( 2025-08-14T21:50:20.3133342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3133910Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3134449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3134927Z layer_outputs = layer_module( 2025-08-14T21:50:20.3135297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3135692Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3136152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3136620Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3137090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3137680Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3138201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3138798Z return residual + hidden_states 2025-08-14T21:50:20.3138969Z 2025-08-14T21:50:20.3139056Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3139286Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3139559Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3139824Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3140047Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3140269Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3140544Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3140776Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3140993Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3141242Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3141518Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3141737Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3141957Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3142174Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3142398Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3142618Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3142831Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3143056Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3143280Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3143493Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3143713Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3143935Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3144150Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3144373Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3144592Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3144807Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3145027Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3145247Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3145502Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3145888Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3146248Z return mod(**inputs) 2025-08-14T21:50:20.3147656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3148128Z outputs = self.bert( 2025-08-14T21:50:20.3148563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3149031Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3149480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3149938Z layer_outputs = layer_module( 2025-08-14T21:50:20.3150347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3150741Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3151204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3151670Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3152147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3152663Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3153172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3153637Z return residual + hidden_states 2025-08-14T21:50:20.3153782Z 2025-08-14T21:50:20.3153867Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3154095Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3154314Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3154537Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3154758Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3154972Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3155195Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3155419Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3155634Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3155859Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3156075Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3156294Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3156505Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3156727Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3156946Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3157158Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3157377Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3157603Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3157815Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3158037Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3158260Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3158479Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3158704Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3158926Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3159149Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3159364Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3159589Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3159813Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3160061Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3160461Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3160817Z return mod(**inputs) 2025-08-14T21:50:20.3161259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3161722Z outputs = self.bert( 2025-08-14T21:50:20.3162203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3162693Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3163143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3163609Z layer_outputs = layer_module( 2025-08-14T21:50:20.3163986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3164377Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3164851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3165334Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3165806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3166325Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3166853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3167320Z return residual + hidden_states 2025-08-14T21:50:20.3167456Z 2025-08-14T21:50:20.3167548Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3167768Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3167994Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3168219Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3168433Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3168661Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3169065Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3169292Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3169518Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3169753Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3169983Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3170198Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3170425Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3170658Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3170869Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3171084Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3171302Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3171518Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3171747Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3171971Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3172187Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3172410Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3172634Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3172866Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3173089Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3173313Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3173537Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3173763Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3174012Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:50:20.3174407Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3174750Z return mod(**inputs) 2025-08-14T21:50:20.3175188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3175654Z outputs = self.bert( 2025-08-14T21:50:20.3176087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3176537Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3177036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3177520Z layer_outputs = layer_module( 2025-08-14T21:50:20.3177883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3178262Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3178733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3179214Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3179707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3180232Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3180747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3181214Z return residual + hidden_states 2025-08-14T21:50:20.3181350Z 2025-08-14T21:50:20.3181434Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3181662Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3181887Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3182101Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3182324Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3182550Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3182773Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3182989Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3183211Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3183435Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3183653Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3183876Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3184106Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3184323Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3184545Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3184768Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3184984Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3185209Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3185434Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3185658Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3185875Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3186097Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3186320Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3186542Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3186768Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3186994Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3187210Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3187437Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3187700Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:50:20.3188090Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:50:20.3188455Z return mod(**inputs) 2025-08-14T21:50:20.3188903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1597, in forward 2025-08-14T21:50:20.3189374Z outputs = self.bert( 2025-08-14T21:50:20.3189800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 856, in forward 2025-08-14T21:50:20.3190260Z encoder_outputs = self.encoder( 2025-08-14T21:50:20.3190717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 537, in forward 2025-08-14T21:50:20.3191173Z layer_outputs = layer_module( 2025-08-14T21:50:20.3191576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:50:20.3192038Z return super().__call__(*args, **kwargs) 2025-08-14T21:50:20.3192501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 444, in forward 2025-08-14T21:50:20.3192987Z self_attention_outputs = self.attention( 2025-08-14T21:50:20.3193459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 381, in forward 2025-08-14T21:50:20.3193980Z attention_output = self.output(self_outputs[0], hidden_states) 2025-08-14T21:50:20.3194514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 331, in forward 2025-08-14T21:50:20.3194972Z return residual + hidden_states 2025-08-14T21:50:20.3195114Z 2025-08-14T21:50:20.3195196Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3195424Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3195638Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3195857Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3196077Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3196295Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3196502Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3196719Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3196938Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3197152Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3197374Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3197597Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3197813Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3198037Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3198258Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3198478Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3198707Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3198929Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3199149Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3199370Z cudagraph partition due to non gpu ops 2025-08-14T21:50:20.3199589Z cudagraph partition due to non gpu ops 
2025-08-14T21:50:20.3241729Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:50:20.3242124Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:50:20.3242478Z     return mod(**inputs)
2025-08-14T21:50:20.3242914Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1629, in forward
2025-08-14T21:50:20.3243399Z     start_loss = loss_fct(start_logits, start_positions)
2025-08-14T21:50:20.3243577Z 
2025-08-14T21:50:20.3243692Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:50:20.3244087Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:50:20.3244434Z     return mod(**inputs)
2025-08-14T21:50:20.3244864Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1630, in forward
2025-08-14T21:50:20.3245343Z     end_loss = loss_fct(end_logits, end_positions)
2025-08-14T21:50:20.3245503Z 
2025-08-14T21:50:31.3214902Z Compilation time (from dynamo_timed): 30.595481632
2025-08-14T21:50:31.3215214Z pass
2025-08-14T21:50:31.3221786Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:50:31.3222718Z TIMING: _recursive_pre_grad_passes:0.07542 _recursive_joint_graph_passes:0.92323 _recursive_post_grad_passes:0.1503 async_compile.wait:0.78116 code_gen:9.99164 inductor_compile:12.98232 backend_compile:24.54049 gc:0.00068 entire_frame_compile:30.59548 total_wall_time:30.59548
2025-08-14T21:50:31.3223786Z STATS: call_* op count: 726 | FakeTensorMode.__torch_dispatch__:56816 | FakeTensor.__torch_dispatch__:6081 | ProxyTorchDispatchMode.__torch_dispatch__:16350
2025-08-14T21:50:31.3224337Z Dynamo produced 1 graphs covering 726 ops with 0 graph breaks (0 unique)
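The TIMING line above is a flat list of phase:seconds pairs (pre-grad/joint/post-grad passes, codegen, inductor and backend compile), with entire_frame_compile and total_wall_time giving the overall ~30.6 s compile. A small helper along the following lines, which is only an illustrative sketch and not part of the benchmark harness, can turn that line into a dict so phases can be compared across runs:

# Illustrative helper (not part of this job): parse a "TIMING:" line into {phase: seconds}.
timing_line = (
    "TIMING: _recursive_pre_grad_passes:0.07542 _recursive_joint_graph_passes:0.92323 "
    "_recursive_post_grad_passes:0.1503 async_compile.wait:0.78116 code_gen:9.99164 "
    "inductor_compile:12.98232 backend_compile:24.54049 gc:0.00068 "
    "entire_frame_compile:30.59548 total_wall_time:30.59548"
)
phases = {}
for field in timing_line.split()[1:]:      # skip the leading "TIMING:" token
    name, value = field.rsplit(":", 1)     # split on the last ":" only
    phases[name] = float(value)
print(phases["inductor_compile"], phases["total_wall_time"])  # 12.98232 30.59548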
2025-08-14T21:50:37.7245074Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:50:37.7246231Z   from pkg_resources import resource_filename
2025-08-14T21:50:38.4728050Z 
2025-08-14T21:50:39.2865800Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:50:39.2866840Z cpu eval MobileBertForMaskedLM
2025-08-14T21:50:39.5925808Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:51:20.4421125Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4424777Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4425217Z     return mod(**inputs)
2025-08-14T21:51:20.4426059Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4426539Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4427024Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 791, in forward
2025-08-14T21:51:20.4427523Z     embedding_output = self.embeddings(
2025-08-14T21:51:20.4427993Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 199, in forward
2025-08-14T21:51:20.4428449Z     inputs_embeds = torch.cat(
2025-08-14T21:51:20.4428582Z 
2025-08-14T21:51:20.4428703Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4429090Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4429433Z     return mod(**inputs)
2025-08-14T21:51:20.4429854Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4430296Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4430723Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 791, in forward
2025-08-14T21:51:20.4431158Z     embedding_output = self.embeddings(
2025-08-14T21:51:20.4431605Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 208, in forward
2025-08-14T21:51:20.4432098Z     inputs_embeds = self.embedding_transformation(inputs_embeds)
2025-08-14T21:51:20.4432286Z 
2025-08-14T21:51:20.4433289Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4433703Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4434060Z     return mod(**inputs)
2025-08-14T21:51:20.4434500Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4434927Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4435367Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4435818Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4436258Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4436706Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4437146Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward
2025-08-14T21:51:20.4437660Z     self_attention_outputs = self.attention(
2025-08-14T21:51:20.4438111Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward
2025-08-14T21:51:20.4438728Z     self_outputs = self.self(
2025-08-14T21:51:20.4439227Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward
2025-08-14T21:51:20.4439715Z     self.query(query_tensor)
2025-08-14T21:51:20.4439842Z 
2025-08-14T21:51:20.4443589Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4443990Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4444360Z     return mod(**inputs)
2025-08-14T21:51:20.4444799Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4445238Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4445674Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4446137Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4446581Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4447035Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4447486Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward
2025-08-14T21:51:20.4447960Z     attention_output = ffn_module(attention_output)
2025-08-14T21:51:20.4448430Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward
2025-08-14T21:51:20.4449152Z     intermediate_output = self.intermediate(hidden_states)
2025-08-14T21:51:20.4449651Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:51:20.4450118Z     hidden_states = self.dense(hidden_states)
2025-08-14T21:51:20.4450284Z 
2025-08-14T21:51:20.4452195Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4452696Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4453073Z     return mod(**inputs)
2025-08-14T21:51:20.4453511Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4454018Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4454484Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4454931Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4455383Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4455824Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4456255Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward
2025-08-14T21:51:20.4456742Z     intermediate_output = self.intermediate(attention_output)
2025-08-14T21:51:20.4457264Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:51:20.4457733Z     hidden_states = self.dense(hidden_states)
2025-08-14T21:51:20.4457889Z 
2025-08-14T21:51:20.4458886Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4459274Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4459615Z     return mod(**inputs)
2025-08-14T21:51:20.4460035Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4460477Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4460906Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4461351Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4461804Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4462269Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4462715Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward
2025-08-14T21:51:20.4463253Z     layer_output = self.output(intermediate_output, attention_output, hidden_states)
2025-08-14T21:51:20.4463790Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward
2025-08-14T21:51:20.4464289Z     layer_output = self.bottleneck(layer_output, residual_tensor_2)
2025-08-14T21:51:20.4464780Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward
2025-08-14T21:51:20.4465222Z     layer_outputs = self.dense(hidden_states)
2025-08-14T21:51:20.4465379Z 
2025-08-14T21:51:20.4465945Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:51:20.4466334Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4466681Z     return mod(**inputs)
2025-08-14T21:51:20.4467112Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4467556Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4467973Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4468424Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4468853Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4470134Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4470599Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward
2025-08-14T21:51:20.4471156Z     query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states)
2025-08-14T21:51:20.4471687Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward
2025-08-14T21:51:20.4472174Z     bottlenecked_hidden_states = self.input(hidden_states)
2025-08-14T21:51:20.4472661Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward
2025-08-14T21:51:20.4473143Z     layer_input = self.dense(hidden_states)
2025-08-14T21:51:20.4473300Z 
Found from : 2025-08-14T21:51:20.4474292Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4474777Z return mod(**inputs) 2025-08-14T21:51:20.4475214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4475664Z outputs = self.mobilebert( 2025-08-14T21:51:20.4476101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4476559Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4477017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4477470Z layer_outputs = layer_module( 2025-08-14T21:51:20.4477927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4478416Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4478906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4479360Z self_outputs = self.self( 2025-08-14T21:51:20.4479808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4480265Z self.query(query_tensor) 2025-08-14T21:51:20.4480395Z 2025-08-14T21:51:20.4480497Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4480733Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4480967Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4481197Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4481412Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4481638Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4481869Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4482089Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4482323Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4482546Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4482767Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4482995Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4483222Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4483450Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4483672Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4483928Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4484188Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4484584Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4484940Z return mod(**inputs) 2025-08-14T21:51:20.4485374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4485943Z outputs = self.mobilebert( 2025-08-14T21:51:20.4486382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4486833Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4487275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4487712Z layer_outputs = layer_module( 2025-08-14T21:51:20.4488151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4488667Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4489223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4489716Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4490206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4490667Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4490825Z 2025-08-14T21:51:20.4490920Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4491142Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4491366Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4491591Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4491804Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4492024Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4492247Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4492484Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4492740Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4493134Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4493483Z return mod(**inputs) 2025-08-14T21:51:20.4493925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4494376Z outputs = self.mobilebert( 2025-08-14T21:51:20.4494816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4495309Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4495757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4496214Z layer_outputs = layer_module( 2025-08-14T21:51:20.4496661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4497150Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4497651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4498105Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4498268Z 2025-08-14T21:51:20.4498353Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4498579Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4498804Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4499023Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4499268Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4499659Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4500008Z return mod(**inputs) 2025-08-14T21:51:20.4500427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4500902Z outputs = self.mobilebert( 2025-08-14T21:51:20.4501377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4501825Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4502258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4502899Z layer_outputs = layer_module( 2025-08-14T21:51:20.4503344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4503877Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4504509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4505018Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4505537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4505995Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4506155Z 2025-08-14T21:51:20.4506244Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4506482Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4506714Z cudagraph partition due to non gpu ops 
2025-08-14T21:51:20.4506941Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4507207Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4507617Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4507978Z return mod(**inputs) 2025-08-14T21:51:20.4508410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4508862Z outputs = self.mobilebert( 2025-08-14T21:51:20.4509307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4509754Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4510195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4510649Z layer_outputs = layer_module( 2025-08-14T21:51:20.4511083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4511550Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4512016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4512470Z self_outputs = self.self( 2025-08-14T21:51:20.4512909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4513364Z self.query(query_tensor) 2025-08-14T21:51:20.4513493Z 2025-08-14T21:51:20.4513589Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4513817Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4514050Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4514280Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4514510Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4514734Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4514965Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4515197Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4515418Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4515648Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4515881Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4516102Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4516332Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4516624Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4516870Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4517098Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4517352Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4517743Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4518092Z return mod(**inputs) 2025-08-14T21:51:20.4518531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4518984Z outputs = self.mobilebert( 2025-08-14T21:51:20.4519439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4519892Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4520341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4520790Z layer_outputs = layer_module( 2025-08-14T21:51:20.4521223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4521711Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4522193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4522696Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4523182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4523646Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4523798Z 2025-08-14T21:51:20.4523890Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4524112Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4524340Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4524564Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4524785Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4524997Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4525217Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4525439Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4525684Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4526079Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4526430Z return mod(**inputs) 2025-08-14T21:51:20.4526852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4527309Z outputs = self.mobilebert( 2025-08-14T21:51:20.4527749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4528208Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4528640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4529172Z layer_outputs = layer_module( 2025-08-14T21:51:20.4529618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4530124Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4530664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4531132Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4531284Z 2025-08-14T21:51:20.4531379Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4531792Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4532061Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4532298Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4532557Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4532946Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4533300Z return mod(**inputs) 2025-08-14T21:51:20.4533725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4534163Z outputs = self.mobilebert( 2025-08-14T21:51:20.4534610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4535057Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4535496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4535930Z layer_outputs = layer_module( 2025-08-14T21:51:20.4536364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4536895Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4537431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4537925Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4538430Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4538884Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4539028Z 2025-08-14T21:51:20.4539110Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4539332Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4539582Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4539956Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4540288Z return mod(**inputs) 2025-08-14T21:51:20.4540705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4541156Z outputs = self.mobilebert( 2025-08-14T21:51:20.4541574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4542029Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4542451Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4542887Z layer_outputs = layer_module( 2025-08-14T21:51:20.4543299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:51:20.4543826Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:51:20.4544360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:51:20.4544845Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:51:20.4545314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:51:20.4545773Z layer_input = self.dense(hidden_states) 2025-08-14T21:51:20.4545917Z 2025-08-14T21:51:20.4546008Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4546228Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4546474Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4546846Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4547237Z return mod(**inputs) 2025-08-14T21:51:20.4547684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4548136Z outputs = self.mobilebert( 2025-08-14T21:51:20.4548578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4549039Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4549461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4549897Z layer_outputs = layer_module( 2025-08-14T21:51:20.4550342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4550778Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4551240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4551692Z self_outputs = self.self( 2025-08-14T21:51:20.4552125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4552570Z self.query(query_tensor) 2025-08-14T21:51:20.4552715Z 2025-08-14T21:51:20.4552807Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4553041Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4553265Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4553504Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4553729Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4553952Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4554166Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4554384Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4554604Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4554821Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4555044Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4555265Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4555479Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4555703Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4555923Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4556136Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4556389Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4556785Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4557145Z return mod(**inputs) 2025-08-14T21:51:20.4557560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4558015Z outputs = self.mobilebert( 2025-08-14T21:51:20.4558454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4558901Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4574772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4575502Z layer_outputs = layer_module( 2025-08-14T21:51:20.4575978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4576460Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4576970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4577471Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4577964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4578664Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4578844Z 2025-08-14T21:51:20.4578940Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4579182Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4579405Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4579634Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4579861Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4580078Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4580305Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4580530Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4580823Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4581231Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4581612Z return mod(**inputs) 2025-08-14T21:51:20.4582053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4582511Z outputs = self.mobilebert( 2025-08-14T21:51:20.4582996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4583453Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4583909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4584358Z layer_outputs = layer_module( 2025-08-14T21:51:20.4584794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4585283Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4585781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4586231Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4586408Z 2025-08-14T21:51:20.4586494Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4586724Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4586948Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4587178Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4587444Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4587839Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4588197Z return mod(**inputs) 2025-08-14T21:51:20.4588619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4589059Z outputs = self.mobilebert( 2025-08-14T21:51:20.4589485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4589936Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4590385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4590845Z layer_outputs = layer_module( 2025-08-14T21:51:20.4591269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4591796Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4592330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4592812Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4593305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4593803Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4593973Z 2025-08-14T21:51:20.4594069Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4594291Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4594514Z cudagraph partition due to non gpu ops 
2025-08-14T21:51:20.4594745Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4594988Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4595375Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4595721Z return mod(**inputs) 2025-08-14T21:51:20.4596150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4596584Z outputs = self.mobilebert( 2025-08-14T21:51:20.4597019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4597478Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4597912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4598359Z layer_outputs = layer_module( 2025-08-14T21:51:20.4598797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4599245Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4599678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4600107Z self_outputs = self.self( 2025-08-14T21:51:20.4600534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4600963Z self.query(query_tensor) 2025-08-14T21:51:20.4601092Z 2025-08-14T21:51:20.4601179Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4601409Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4601637Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4601852Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4602073Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4602294Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4602506Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4602928Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4603158Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4603384Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4603600Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4603831Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4604065Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4604284Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4604512Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4604745Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4604994Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4605386Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4605739Z return mod(**inputs) 2025-08-14T21:51:20.4606161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4606597Z outputs = self.mobilebert( 2025-08-14T21:51:20.4607031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4607477Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4607907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4608352Z layer_outputs = layer_module( 2025-08-14T21:51:20.4609215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4609700Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4610167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4610657Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4611146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4611609Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4611797Z 2025-08-14T21:51:20.4611885Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4612117Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4612345Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4612565Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4612792Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4613017Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4613234Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4613456Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4613707Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4614097Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4614439Z return mod(**inputs) 2025-08-14T21:51:20.4614862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4615303Z outputs = self.mobilebert( 2025-08-14T21:51:20.4615726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4616166Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4616608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4617045Z layer_outputs = layer_module( 2025-08-14T21:51:20.4617471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4617968Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4618461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4618919Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4619072Z 2025-08-14T21:51:20.4619161Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4619388Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4619614Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4619827Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4620084Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4620474Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4620817Z return mod(**inputs) 2025-08-14T21:51:20.4621261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4621695Z outputs = self.mobilebert( 2025-08-14T21:51:20.4622121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4622566Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4623009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4623452Z layer_outputs = layer_module( 2025-08-14T21:51:20.4623889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4624489Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4625026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4625525Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4626024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4626479Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4626636Z 2025-08-14T21:51:20.4626737Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4626969Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4627219Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:51:20.4627609Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4627964Z     return mod(**inputs)
2025-08-14T21:51:20.4628388Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4628836Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4629271Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4629724Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4630166Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4630627Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4631070Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward
2025-08-14T21:51:20.4631621Z     query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states)
2025-08-14T21:51:20.4632169Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward
2025-08-14T21:51:20.4632660Z     bottlenecked_hidden_states = self.input(hidden_states)
2025-08-14T21:51:20.4633152Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward
2025-08-14T21:51:20.4633613Z     layer_input = self.dense(hidden_states)
2025-08-14T21:51:20.4633762Z 
2025-08-14T21:51:20.4633848Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4634074Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4634329Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:51:20.4634711Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:51:20.4635064Z     return mod(**inputs)
2025-08-14T21:51:20.4635484Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
2025-08-14T21:51:20.4635935Z     outputs = self.mobilebert(
2025-08-14T21:51:20.4636359Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:51:20.4636805Z     encoder_outputs = self.encoder(
2025-08-14T21:51:20.4637244Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:51:20.4637683Z     layer_outputs = layer_module(
2025-08-14T21:51:20.4638122Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward
2025-08-14T21:51:20.4638582Z     self_attention_outputs = self.attention(
2025-08-14T21:51:20.4639037Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward
2025-08-14T21:51:20.4639511Z     self_outputs = self.self(
2025-08-14T21:51:20.4639957Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward
2025-08-14T21:51:20.4640414Z     self.query(query_tensor)
2025-08-14T21:51:20.4640539Z 
2025-08-14T21:51:20.4640632Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4640858Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4641090Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4641326Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4641542Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4641765Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4642010Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4642228Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4642447Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4642672Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4642886Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4643109Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4643335Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4643549Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4643771Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4643990Z cudagraph partition due to non gpu ops
2025-08-14T21:51:20.4644239Z cudagraph partition due to non gpu ops.
2025-08-14T21:51:20.4934064Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4934184Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4934396Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4934476Z return mod(**inputs) 2025-08-14T21:51:20.4934795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4934874Z outputs = self.mobilebert( 2025-08-14T21:51:20.4935175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4935260Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4935559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4935642Z layer_outputs = layer_module( 2025-08-14T21:51:20.4935940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4936038Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4936334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4936413Z self_outputs = self.self( 2025-08-14T21:51:20.4936719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4936807Z self.query(query_tensor) 2025-08-14T21:51:20.4936812Z 2025-08-14T21:51:20.4936900Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4936979Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937057Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937142Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937221Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937300Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937387Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937475Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937549Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937632Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937706Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937785Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937858Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4937931Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4938018Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4938097Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4938206Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4938418Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4938487Z return mod(**inputs) 2025-08-14T21:51:20.4938773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4938854Z outputs = self.mobilebert( 2025-08-14T21:51:20.4939155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4939240Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4939529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4939623Z layer_outputs = layer_module( 2025-08-14T21:51:20.4939956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4940059Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4940354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4940473Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4940770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4940875Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4940879Z 2025-08-14T21:51:20.4940959Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941035Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941119Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941196Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941279Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941353Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941427Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941506Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4941608Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4941803Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4941876Z return mod(**inputs) 2025-08-14T21:51:20.4942179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4942255Z outputs = self.mobilebert( 2025-08-14T21:51:20.4942555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4942634Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4942936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4943011Z layer_outputs = layer_module( 2025-08-14T21:51:20.4943299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4943431Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4943732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4943827Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4943831Z 2025-08-14T21:51:20.4943911Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4943992Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4944079Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4944156Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4944266Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4944478Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4944547Z return mod(**inputs) 2025-08-14T21:51:20.4944839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4944913Z outputs = self.mobilebert( 2025-08-14T21:51:20.4945214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4945298Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4945600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4945675Z layer_outputs = layer_module( 2025-08-14T21:51:20.4945971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4946190Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4946492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4946622Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4946915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4947010Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4947013Z 2025-08-14T21:51:20.4947162Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4947253Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4947360Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4947568Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4947660Z return mod(**inputs) 2025-08-14T21:51:20.4947937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4948008Z outputs = self.mobilebert( 2025-08-14T21:51:20.4948292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4948367Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4948667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4948745Z layer_outputs = layer_module( 2025-08-14T21:51:20.4949046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:51:20.4949213Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:51:20.4949498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:51:20.4949619Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:51:20.4949913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:51:20.4950002Z layer_input = self.dense(hidden_states) 2025-08-14T21:51:20.4950006Z 2025-08-14T21:51:20.4950096Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4950177Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4950289Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4950506Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4950579Z return mod(**inputs) 2025-08-14T21:51:20.4950884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4950960Z outputs = self.mobilebert( 2025-08-14T21:51:20.4951252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4951343Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4951635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4951716Z layer_outputs = layer_module( 2025-08-14T21:51:20.4952012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4952101Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4952397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4952519Z self_outputs = self.self( 2025-08-14T21:51:20.4952836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4952919Z self.query(query_tensor) 2025-08-14T21:51:20.4952923Z 2025-08-14T21:51:20.4953004Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953089Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953167Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953246Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953334Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953411Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953512Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953599Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953677Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953756Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953845Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4953926Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4954012Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4954091Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4954168Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4954254Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4954363Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4954570Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4954647Z return mod(**inputs) 2025-08-14T21:51:20.4954941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4955015Z outputs = self.mobilebert( 2025-08-14T21:51:20.4955316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4955396Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4955694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4955767Z layer_outputs = layer_module( 2025-08-14T21:51:20.4956055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4956166Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4956456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4956580Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4956871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4956961Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4956966Z 2025-08-14T21:51:20.4957054Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957134Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957212Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957297Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957375Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957458Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957535Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957612Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4957726Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4957933Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4958004Z return mod(**inputs) 2025-08-14T21:51:20.4958301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4958406Z outputs = self.mobilebert( 2025-08-14T21:51:20.4958719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4958798Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4959086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4959167Z layer_outputs = layer_module( 2025-08-14T21:51:20.4959456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4959596Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4959896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4959981Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4959988Z 2025-08-14T21:51:20.4960075Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4960155Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4960233Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4960320Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4960427Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4960635Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4960713Z return mod(**inputs) 2025-08-14T21:51:20.4961006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4961089Z outputs = self.mobilebert( 2025-08-14T21:51:20.4961380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4961456Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4961758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4961832Z layer_outputs = layer_module( 2025-08-14T21:51:20.4962132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4962302Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4962601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4962744Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4963046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4963134Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4963148Z 2025-08-14T21:51:20.4963232Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4963316Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4963404Z cudagraph partition due to non gpu ops 
2025-08-14T21:51:20.4963483Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4963594Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4963813Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4963884Z return mod(**inputs) 2025-08-14T21:51:20.4964183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4964270Z outputs = self.mobilebert( 2025-08-14T21:51:20.4964570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4964654Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4965004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4965083Z layer_outputs = layer_module( 2025-08-14T21:51:20.4965394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4965486Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4965792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4965867Z self_outputs = self.self( 2025-08-14T21:51:20.4966186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4966270Z self.query(query_tensor) 2025-08-14T21:51:20.4966274Z 2025-08-14T21:51:20.4966355Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966439Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966533Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966613Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966703Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966784Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966864Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4966952Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967031Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967112Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967198Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967279Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967361Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967448Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967529Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967609Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4967736Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4967955Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4968034Z return mod(**inputs) 2025-08-14T21:51:20.4968330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4968407Z outputs = self.mobilebert( 2025-08-14T21:51:20.4968711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4968871Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4969193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4969270Z layer_outputs = layer_module( 2025-08-14T21:51:20.4969569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4969686Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4969987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4970117Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4970416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4970502Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4970506Z 2025-08-14T21:51:20.4970594Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4970676Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4970756Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4970843Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4970921Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4971047Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4971152Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4971235Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4971353Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4971565Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4971636Z return mod(**inputs) 2025-08-14T21:51:20.4971938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4972014Z outputs = self.mobilebert( 2025-08-14T21:51:20.4972338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4972429Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4972726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4972810Z layer_outputs = layer_module( 2025-08-14T21:51:20.4973111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4973243Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4973549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4973637Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4973641Z 2025-08-14T21:51:20.4973730Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4973809Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4973891Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4973978Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4974087Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4974298Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4974380Z return mod(**inputs) 2025-08-14T21:51:20.4974677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4974751Z outputs = self.mobilebert( 2025-08-14T21:51:20.4975055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4975132Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4975436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4975514Z layer_outputs = layer_module( 2025-08-14T21:51:20.4975811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4975991Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4976292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4976431Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4976731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4976821Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4976825Z 2025-08-14T21:51:20.4976915Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4976998Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4977108Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4977326Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4977397Z return mod(**inputs) 2025-08-14T21:51:20.4977757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4977834Z outputs = self.mobilebert( 2025-08-14T21:51:20.4978130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4978217Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4978523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4978607Z layer_outputs = layer_module( 2025-08-14T21:51:20.4978932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:51:20.4979108Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:51:20.4979424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:51:20.4979547Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:51:20.4979850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:51:20.4979946Z layer_input = self.dense(hidden_states) 2025-08-14T21:51:20.4979950Z 2025-08-14T21:51:20.4980031Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4980120Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4980230Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4980441Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4980521Z return mod(**inputs) 2025-08-14T21:51:20.4980830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4980917Z outputs = self.mobilebert( 2025-08-14T21:51:20.4981216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4981294Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4981593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4981662Z layer_outputs = layer_module( 2025-08-14T21:51:20.4981940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4982035Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4982338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4982420Z self_outputs = self.self( 2025-08-14T21:51:20.4982723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4982802Z self.query(query_tensor) 2025-08-14T21:51:20.4982806Z 2025-08-14T21:51:20.4982897Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4982976Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983055Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983142Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983221Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983308Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983386Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983466Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983555Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983634Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983713Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983800Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4983952Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4984048Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4984139Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4984219Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4984336Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4984545Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4984614Z return mod(**inputs) 2025-08-14T21:51:20.4984920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4984994Z outputs = self.mobilebert( 2025-08-14T21:51:20.4985317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4985405Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4985713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4985794Z layer_outputs = layer_module( 2025-08-14T21:51:20.4986095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4986199Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.4986512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.4986631Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.4986948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4987038Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4987042Z 2025-08-14T21:51:20.4987124Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987217Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987299Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987379Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987469Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987549Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987629Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987717Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4987827Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4988051Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4988119Z return mod(**inputs) 2025-08-14T21:51:20.4988420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4988503Z outputs = self.mobilebert( 2025-08-14T21:51:20.4988806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4988887Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4989191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4989267Z layer_outputs = layer_module( 2025-08-14T21:51:20.4989574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.4989703Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.4990007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.4990106Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.4990110Z 2025-08-14T21:51:20.4990193Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4990309Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4990407Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4990506Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4990626Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4990834Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4990906Z return mod(**inputs) 2025-08-14T21:51:20.4991211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4991288Z outputs = self.mobilebert( 2025-08-14T21:51:20.4991606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4991687Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4991985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4992072Z layer_outputs = layer_module( 2025-08-14T21:51:20.4992369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.4992540Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.4992843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.4992976Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.4993280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.4993371Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.4993375Z 2025-08-14T21:51:20.4993457Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4993547Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4993630Z cudagraph partition due to non gpu ops 
2025-08-14T21:51:20.4993720Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4993831Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.4994040Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4994117Z return mod(**inputs) 2025-08-14T21:51:20.4994411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4994487Z outputs = self.mobilebert( 2025-08-14T21:51:20.4994793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4994880Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4995174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4995251Z layer_outputs = layer_module( 2025-08-14T21:51:20.4995539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.4995636Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.4995923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.4995997Z self_outputs = self.self( 2025-08-14T21:51:20.4996298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.4996374Z self.query(query_tensor) 2025-08-14T21:51:20.4996380Z 2025-08-14T21:51:20.4996469Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4996550Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4996630Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4996744Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4996850Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4996952Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997041Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997118Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997204Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997284Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997362Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997448Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997525Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997603Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997705Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997785Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.4997894Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.4998111Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.4998186Z return mod(**inputs) 2025-08-14T21:51:20.4998496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.4998574Z outputs = self.mobilebert( 2025-08-14T21:51:20.4998874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.4998959Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.4999258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.4999335Z layer_outputs = layer_module( 2025-08-14T21:51:20.4999644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.4999750Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.5000060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.5000187Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.5000488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.5000586Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.5000589Z 2025-08-14T21:51:20.5000681Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5000767Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5000844Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5000923Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5001011Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5001089Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5001166Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5001253Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5001364Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.5001568Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.5001645Z return mod(**inputs) 2025-08-14T21:51:20.5001956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.5002040Z outputs = self.mobilebert( 2025-08-14T21:51:20.5002349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.5002428Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.5002958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.5003042Z layer_outputs = layer_module( 2025-08-14T21:51:20.5003457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:51:20.5003625Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:51:20.5003926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.5004025Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.5004029Z 2025-08-14T21:51:20.5004112Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5004196Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5004286Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5004367Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5004518Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:51:20.5004735Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.5004806Z return mod(**inputs) 2025-08-14T21:51:20.5005117Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.5005194Z outputs = self.mobilebert( 2025-08-14T21:51:20.5005495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.5005583Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.5005890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.5005972Z layer_outputs = layer_module( 2025-08-14T21:51:20.5006281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:51:20.5006451Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:51:20.5006762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:51:20.5006902Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:51:20.5007205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:51:20.5007296Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:51:20.5007299Z 2025-08-14T21:51:20.5007382Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5007472Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5007581Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.5007793Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.5007874Z return mod(**inputs) 2025-08-14T21:51:20.5008189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.5008275Z outputs = self.mobilebert( 2025-08-14T21:51:20.5008572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.5008648Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.5009003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.5009084Z layer_outputs = layer_module( 2025-08-14T21:51:20.5009391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:51:20.5009574Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:51:20.5009874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:51:20.5010004Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:51:20.5010363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:51:20.5010454Z layer_input = self.dense(hidden_states) 2025-08-14T21:51:20.5010459Z 2025-08-14T21:51:20.5010551Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5010633Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5010753Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.5010966Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.5011039Z return mod(**inputs) 2025-08-14T21:51:20.5011356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.5011433Z outputs = self.mobilebert( 2025-08-14T21:51:20.5011722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.5011811Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.5012101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.5012182Z layer_outputs = layer_module( 2025-08-14T21:51:20.5012471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:51:20.5012559Z self_attention_outputs = self.attention( 2025-08-14T21:51:20.5012857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:51:20.5012931Z self_outputs = self.self( 2025-08-14T21:51:20.5013228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:51:20.5013304Z self.query(query_tensor) 2025-08-14T21:51:20.5013309Z 2025-08-14T21:51:20.5013390Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013478Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013556Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013635Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013722Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013800Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013877Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5013963Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014041Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014125Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014203Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014280Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014365Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014443Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014522Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014608Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5014713Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:51:20.5014918Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:51:20.5014995Z return mod(**inputs) 2025-08-14T21:51:20.5015283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward 2025-08-14T21:51:20.5015362Z outputs = self.mobilebert( 2025-08-14T21:51:20.5015652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:51:20.5015728Z encoder_outputs = self.encoder( 2025-08-14T21:51:20.5016027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:51:20.5016138Z layer_outputs = layer_module( 2025-08-14T21:51:20.5016447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:51:20.5016554Z attention_output = ffn_module(attention_output) 2025-08-14T21:51:20.5016853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:51:20.5016980Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:51:20.5017287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:51:20.5017392Z hidden_states = self.dense(hidden_states) 2025-08-14T21:51:20.5017396Z 2025-08-14T21:51:20.5017488Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017568Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017655Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017738Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017820Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017908Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5017986Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5018066Z cudagraph partition due to non gpu ops 2025-08-14T21:51:20.5018180Z cudagraph partition due to non gpu ops. 
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward
    intermediate_output = self.intermediate(attention_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
    hidden_states = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 976, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward
    layer_output = self.output(intermediate_output, attention_output, hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward
    layer_output = self.bottleneck(layer_output, residual_tensor_2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward
    layer_outputs = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 2 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 989, in forward
    prediction_scores = self.cls(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 643, in forward
    prediction_scores = self.predictions(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 631, in forward
    hidden_states = self.transform(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 609, in forward
    hidden_states = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 994, in forward
    masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 989, in forward
    prediction_scores = self.cls(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 643, in forward
    prediction_scores = self.predictions(sequence_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 633, in forward
    hidden_states += self.decoder.bias
2025-08-14T21:51:35.1589996Z Compilation time (from dynamo_timed): 53.681298035
2025-08-14T21:51:35.1590828Z pass
2025-08-14T21:51:35.1591305Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:51:35.1592138Z TIMING: _recursive_pre_grad_passes:0.16896 _recursive_joint_graph_passes:1.59875 _recursive_post_grad_passes:0.2707 async_compile.wait:0.88617 code_gen:11.69203 inductor_compile:15.97489 backend_compile:40.36212 gc:0.00098 entire_frame_compile:53.6813 total_wall_time:53.6813
2025-08-14T21:51:35.1593122Z STATS: call_* op count: 1451 | FakeTensorMode.__torch_dispatch__:114786 | FakeTensor.__torch_dispatch__:10800 | ProxyTorchDispatchMode.__torch_dispatch__:31006
2025-08-14T21:51:35.1593733Z Dynamo produced 1 graphs covering 1451 ops with 0 graph breaks (0 unique)
2025-08-14T21:51:42.0757155Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:51:42.0758162Z   from pkg_resources import resource_filename
2025-08-14T21:51:43.3253987Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:51:43.3255077Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:51:43.3255450Z cpu eval MobileBertForQuestionAnswering
2025-08-14T21:51:43.5564465Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:51:43.7423607Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:51:43.9281348Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:52:23.9903399Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1256, in forward
    logits = self.qa_outputs(sequence_output)
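At this point the MaskedLM variant has passed and the harness moves on to MobileBertForQuestionAnswering in eval mode on CPU. The forward_pass it compiles is just return mod(**inputs) on the wrapped module, and the summary above reports a single Dynamo graph covering 1451 ops with 0 graph breaks; the TIMING line breaks the ~53.7 s entire_frame_compile down into backend_compile (~40.4 s), of which inductor_compile is ~16.0 s. A minimal sketch of the same shape of run outside the harness is below; the checkpoint name, inputs, and cold/warm timing are my assumptions for illustration, not something taken from this job.

```python
# Rough reproduction sketch, not the CI harness itself: compile a MobileBERT
# QA model with Inductor on CPU, pay the one-off compilation cost on the
# first eval call, then measure a warm call.
import time

import torch
from transformers import AutoTokenizer, MobileBertForQuestionAnswering

tok = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = MobileBertForQuestionAnswering.from_pretrained("google/mobilebert-uncased").eval()

inputs = tok("Who wrote it?", "The report was written by the benchmark team.",
             return_tensors="pt")

compiled = torch.compile(model, backend="inductor")

with torch.no_grad():
    t0 = time.perf_counter()
    out = compiled(**inputs)          # first call triggers Dynamo/Inductor compilation
    print("cold call:", time.perf_counter() - t0, "s")

    t0 = time.perf_counter()
    out = compiled(**inputs)          # warm call measures steady-state latency
    print("warm call:", time.perf_counter() - t0, "s")

print(out.start_logits.shape, out.end_logits.shape)
```

The cold/warm split here is only a stand-in for the harness's own instrumentation; the job's authoritative numbers are the dynamo_timed / TIMING lines above.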
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 791, in forward
    embedding_output = self.embeddings(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 199, in forward
    inputs_embeds = torch.cat(
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 791, in forward
    embedding_output = self.embeddings(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 208, in forward
    inputs_embeds = self.embedding_transformation(inputs_embeds)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward
    self.query(query_tensor)
cudagraph partition due to non gpu ops (repeated 16 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward
    attention_output = ffn_module(attention_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward
    intermediate_output = self.intermediate(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
    hidden_states = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 8 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward
    intermediate_output = self.intermediate(attention_output)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
    hidden_states = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward
    layer_output = self.output(intermediate_output, attention_output, hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward
    layer_output = self.bottleneck(layer_output, residual_tensor_2)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward
    layer_outputs = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 2 times)
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
    outputs = self.mobilebert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward
    query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward
    bottlenecked_hidden_states = self.input(hidden_states)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward
    layer_input = self.dense(hidden_states)
cudagraph partition due to non gpu ops (repeated 2 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.attention -> self.self -> self.query(query_tensor) (same stack as above)
cudagraph partition due to non gpu ops (repeated 16 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> ffn_module -> self.intermediate -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 8 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.intermediate(attention_output) -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.output -> self.bottleneck -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.attention -> self.self -> self.query(query_tensor) (same stack as above)
cudagraph partition due to non gpu ops (repeated 16 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> ffn_module -> self.intermediate -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 8 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.intermediate(attention_output) -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.output -> self.bottleneck -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 2 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.bottleneck(hidden_states) -> self.input -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 2 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.attention -> self.self -> self.query(query_tensor) (same stack as above)
cudagraph partition due to non gpu ops (repeated 16 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> ffn_module -> self.intermediate -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 8 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.intermediate(attention_output) -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.output -> self.bottleneck -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.attention -> self.self -> self.query(query_tensor) (same stack as above)
cudagraph partition due to non gpu ops (repeated 16 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> ffn_module -> self.intermediate -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 8 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.intermediate(attention_output) -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 4 times)
cudagraph partition due to non gpu ops. Found from : forward_pass -> self.mobilebert -> self.encoder -> layer_module -> self.output -> self.bottleneck -> self.dense(hidden_states) (same stack as above)
cudagraph partition due to non gpu ops (repeated 2 times)
Found from :
2025-08-14T21:52:24.0117430Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0117794Z return mod(**inputs)
2025-08-14T21:52:24.0118228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0118808Z outputs = self.mobilebert(
2025-08-14T21:52:24.0119254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0119702Z encoder_outputs = self.encoder(
2025-08-14T21:52:24.0120136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0120585Z layer_outputs = layer_module(
2025-08-14T21:52:24.0121027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward
2025-08-14T21:52:24.0121569Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states)
2025-08-14T21:52:24.0122102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward
2025-08-14T21:52:24.0122662Z bottlenecked_hidden_states = self.input(hidden_states)
2025-08-14T21:52:24.0123148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward
2025-08-14T21:52:24.0123609Z layer_input = self.dense(hidden_states)
2025-08-14T21:52:24.0123758Z
2025-08-14T21:52:24.0123844Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0124073Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0124329Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:52:24.0124729Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0125081Z return mod(**inputs)
2025-08-14T21:52:24.0125504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0125952Z outputs = self.mobilebert(
2025-08-14T21:52:24.0126381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0126827Z encoder_outputs = self.encoder(
2025-08-14T21:52:24.0127274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0127720Z layer_outputs = layer_module(
2025-08-14T21:52:24.0128164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward
2025-08-14T21:52:24.0128632Z self_attention_outputs = self.attention(
2025-08-14T21:52:24.0129178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward
2025-08-14T21:52:24.0129630Z self_outputs = self.self(
2025-08-14T21:52:24.0130070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward
2025-08-14T21:52:24.0130522Z self.query(query_tensor)
2025-08-14T21:52:24.0130646Z
2025-08-14T21:52:24.0130739Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0130962Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0131189Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0131414Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0131630Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0131857Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0132079Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0132291Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0132520Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0132739Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0132951Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0133175Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0133407Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0133630Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0133847Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0134069Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0134323Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:52:24.0134704Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0135055Z return mod(**inputs)
2025-08-14T21:52:24.0135480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0135934Z outputs = self.mobilebert(
2025-08-14T21:52:24.0136370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0136819Z encoder_outputs = self.encoder(
2025-08-14T21:52:24.0137263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0137783Z layer_outputs = layer_module(
2025-08-14T21:52:24.0138224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward
2025-08-14T21:52:24.0138694Z attention_output = ffn_module(attention_output)
2025-08-14T21:52:24.0139163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward
2025-08-14T21:52:24.0139643Z intermediate_output = self.intermediate(hidden_states)
2025-08-14T21:52:24.0140144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:52:24.0140613Z hidden_states = self.dense(hidden_states)
2025-08-14T21:52:24.0140768Z
2025-08-14T21:52:24.0140858Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0141098Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0141330Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0141557Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0141779Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0142012Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0142246Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0142470Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0142731Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:52:24.0143127Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0143479Z return mod(**inputs) 2025-08-14T21:52:24.0143922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0144376Z outputs = self.mobilebert( 2025-08-14T21:52:24.0144812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0145268Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0145717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0146175Z layer_outputs = layer_module( 2025-08-14T21:52:24.0146626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0147131Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0147636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0148109Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0148265Z 2025-08-14T21:52:24.0148352Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0148589Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0148823Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0149058Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0149308Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0149707Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0150066Z return mod(**inputs) 2025-08-14T21:52:24.0150492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0150957Z outputs = self.mobilebert( 2025-08-14T21:52:24.0151407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0151876Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0152323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0152877Z layer_outputs = layer_module( 2025-08-14T21:52:24.0153334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0153862Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0154414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0154931Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0155476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0155935Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0156094Z 2025-08-14T21:52:24.0156179Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0156405Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0156632Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0156855Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0157129Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0157524Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0157873Z return mod(**inputs) 2025-08-14T21:52:24.0158307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0158755Z outputs = self.mobilebert( 2025-08-14T21:52:24.0159192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0159630Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0160078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0160527Z layer_outputs = layer_module( 2025-08-14T21:52:24.0160961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0161429Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0161888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0162337Z self_outputs = self.self( 2025-08-14T21:52:24.0162761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0163207Z self.query(query_tensor) 2025-08-14T21:52:24.0163334Z 2025-08-14T21:52:24.0163426Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0163645Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0163872Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0164095Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0164321Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0164542Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0164767Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0164988Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0165202Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0165426Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0165646Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0165860Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0166080Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0166303Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0166522Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0166744Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0167000Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0167390Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0167775Z return mod(**inputs) 2025-08-14T21:52:24.0168226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0168674Z outputs = self.mobilebert( 2025-08-14T21:52:24.0169277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0169742Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0170189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0170647Z layer_outputs = layer_module( 2025-08-14T21:52:24.0171124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0171613Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0172105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0172598Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0173083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0173545Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0173697Z 2025-08-14T21:52:24.0173790Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0174010Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0174235Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0174466Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0174689Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0174906Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0175125Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0175346Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0175594Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0175985Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0176340Z return mod(**inputs) 2025-08-14T21:52:24.0176758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0177205Z outputs = self.mobilebert( 2025-08-14T21:52:24.0177637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0178080Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0178513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0178956Z layer_outputs = layer_module( 2025-08-14T21:52:24.0179393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0179889Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0180373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0180831Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0180980Z 2025-08-14T21:52:24.0181070Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0181288Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0181511Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0181735Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0181980Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0182373Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0182725Z return mod(**inputs) 2025-08-14T21:52:24.0183204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0183644Z outputs = self.mobilebert( 2025-08-14T21:52:24.0184072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0184512Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0184951Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0185391Z layer_outputs = layer_module( 2025-08-14T21:52:24.0185841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0186382Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0186926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0187415Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0187907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0188362Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0188509Z 2025-08-14T21:52:24.0188591Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0188818Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0189075Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0189470Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0189838Z return mod(**inputs) 2025-08-14T21:52:24.0190262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0190728Z outputs = self.mobilebert( 2025-08-14T21:52:24.0191151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0191588Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0192030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0192485Z layer_outputs = layer_module( 2025-08-14T21:52:24.0192923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0193455Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0193985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0194482Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0194976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0195436Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0195584Z 2025-08-14T21:52:24.0195676Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0195905Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0196151Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0196531Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0196876Z return mod(**inputs) 2025-08-14T21:52:24.0197288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0197736Z outputs = self.mobilebert( 2025-08-14T21:52:24.0198166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0198667Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0199109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0199550Z layer_outputs = layer_module( 2025-08-14T21:52:24.0199984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0200433Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0200885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0201345Z self_outputs = self.self( 2025-08-14T21:52:24.0201769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0202210Z self.query(query_tensor) 2025-08-14T21:52:24.0202341Z 2025-08-14T21:52:24.0202427Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0202869Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0203103Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0203333Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0203560Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0203780Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0204008Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0204236Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0204454Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0204679Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0204908Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0205133Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0205348Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0205573Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0205799Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0206016Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0206272Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0206667Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0207018Z return mod(**inputs) 2025-08-14T21:52:24.0207452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0207904Z outputs = self.mobilebert( 2025-08-14T21:52:24.0208339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0208860Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0209321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0209774Z layer_outputs = layer_module( 2025-08-14T21:52:24.0210224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0210689Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0211159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0211665Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0212151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0212626Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0212785Z 2025-08-14T21:52:24.0212871Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0213103Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0213318Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0213655Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0213921Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0214141Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0214363Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0214593Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0214839Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0215231Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0215582Z return mod(**inputs) 2025-08-14T21:52:24.0216002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0216478Z outputs = self.mobilebert( 2025-08-14T21:52:24.0216919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0217365Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0217808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0218250Z layer_outputs = layer_module( 2025-08-14T21:52:24.0222042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0222542Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0223053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0223510Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0223664Z 2025-08-14T21:52:24.0223757Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0223976Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0224201Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0224424Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0224667Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0225054Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0225400Z return mod(**inputs) 2025-08-14T21:52:24.0225854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0226295Z outputs = self.mobilebert( 2025-08-14T21:52:24.0226714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0227146Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0227569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0228004Z layer_outputs = layer_module( 2025-08-14T21:52:24.0228433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0228963Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0229500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0229992Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0230476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0230926Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0231073Z 2025-08-14T21:52:24.0231156Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0231379Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0231598Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0231808Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0232088Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0232497Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0232846Z return mod(**inputs) 2025-08-14T21:52:24.0233274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0233723Z outputs = self.mobilebert( 2025-08-14T21:52:24.0234136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0234541Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0234965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0235373Z layer_outputs = layer_module( 2025-08-14T21:52:24.0235773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0236218Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0236657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0237169Z self_outputs = self.self( 2025-08-14T21:52:24.0237582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0238019Z self.query(query_tensor) 2025-08-14T21:52:24.0238139Z 2025-08-14T21:52:24.0238216Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0238430Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0238629Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0238833Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0239033Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0239231Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0239444Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0239666Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0239876Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0240093Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0240311Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0240517Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0240735Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0240949Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0241164Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0241373Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0241621Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0241998Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0242332Z return mod(**inputs) 2025-08-14T21:52:24.0242750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0243186Z outputs = self.mobilebert( 2025-08-14T21:52:24.0243608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0244048Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0244487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0244934Z layer_outputs = layer_module( 2025-08-14T21:52:24.0245363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0245832Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0246300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0246817Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0247318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0247768Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0247913Z 2025-08-14T21:52:24.0248001Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0248225Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0248441Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0248662Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0249102Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0249375Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0249605Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0249830Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0250078Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0250487Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0250839Z return mod(**inputs) 2025-08-14T21:52:24.0251253Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0251746Z outputs = self.mobilebert( 2025-08-14T21:52:24.0252166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0252611Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0253044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0253474Z layer_outputs = layer_module( 2025-08-14T21:52:24.0253908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0254400Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0254882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0255332Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0255483Z 2025-08-14T21:52:24.0255582Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0255802Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0256026Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0256254Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0256500Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0256876Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0257220Z return mod(**inputs) 2025-08-14T21:52:24.0257640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0258092Z outputs = self.mobilebert( 2025-08-14T21:52:24.0258523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0258959Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0259392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0259826Z layer_outputs = layer_module( 2025-08-14T21:52:24.0260269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0260809Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0261348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0261840Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0262395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0262851Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0263004Z 2025-08-14T21:52:24.0263089Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0263317Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0263570Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0263961Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0264304Z return mod(**inputs) 2025-08-14T21:52:24.0264748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0265197Z outputs = self.mobilebert( 2025-08-14T21:52:24.0265623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0266080Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0266513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0266989Z layer_outputs = layer_module( 2025-08-14T21:52:24.0267418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0267957Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0268499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0268983Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0269462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0269923Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0270074Z 2025-08-14T21:52:24.0270166Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0270386Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0270642Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0271032Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0271405Z return mod(**inputs) 2025-08-14T21:52:24.0271823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0272269Z outputs = self.mobilebert( 2025-08-14T21:52:24.0272700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0273154Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0273598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0274044Z layer_outputs = layer_module( 2025-08-14T21:52:24.0274483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0274940Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0275409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0275865Z self_outputs = self.self( 2025-08-14T21:52:24.0276298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0276738Z self.query(query_tensor) 2025-08-14T21:52:24.0276880Z 2025-08-14T21:52:24.0276965Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0277243Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0277480Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0277707Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0277930Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0278151Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0278374Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0278597Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0278819Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0279033Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0279258Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0279481Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0279722Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0279942Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0280160Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0280366Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0280618Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0281007Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0281365Z return mod(**inputs) 2025-08-14T21:52:24.0281788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0282264Z outputs = self.mobilebert( 2025-08-14T21:52:24.0282701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0283146Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0283597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0284046Z layer_outputs = layer_module( 2025-08-14T21:52:24.0284490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0284964Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0285444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0285939Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0286430Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0286885Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0287045Z 2025-08-14T21:52:24.0287129Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0287358Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0287578Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0287800Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0288024Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0288244Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0288466Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0288686Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0289057Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0289449Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0289801Z return mod(**inputs) 2025-08-14T21:52:24.0290229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0290680Z outputs = self.mobilebert( 2025-08-14T21:52:24.0291107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0291542Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0291971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0292462Z layer_outputs = layer_module( 2025-08-14T21:52:24.0292885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0293367Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0293849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0294290Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0294443Z 2025-08-14T21:52:24.0294526Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0294762Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0294975Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0295195Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0295442Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0295811Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0296159Z return mod(**inputs) 2025-08-14T21:52:24.0296570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0297048Z outputs = self.mobilebert( 2025-08-14T21:52:24.0297478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0297923Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0298360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0298882Z layer_outputs = layer_module( 2025-08-14T21:52:24.0299311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0299840Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0300366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0300845Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0301340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0301784Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0301932Z 2025-08-14T21:52:24.0302024Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0302245Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0302472Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0302938Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0303187Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0303571Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0303920Z return mod(**inputs) 2025-08-14T21:52:24.0304339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0304772Z outputs = self.mobilebert( 2025-08-14T21:52:24.0305195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0305631Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0306052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0306489Z layer_outputs = layer_module( 2025-08-14T21:52:24.0306915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0307364Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0307910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0308341Z self_outputs = self.self( 2025-08-14T21:52:24.0308761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0309232Z self.query(query_tensor) 2025-08-14T21:52:24.0309394Z 2025-08-14T21:52:24.0309501Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0309813Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0310038Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0310294Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0310521Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0311054Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0311343Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0326867Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0326990Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327082Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327173Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327259Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327504Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327596Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327679Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327761Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0327899Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0328151Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0328233Z return mod(**inputs) 2025-08-14T21:52:24.0328595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0328686Z outputs = self.mobilebert( 2025-08-14T21:52:24.0329186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0329284Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0329594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0329686Z layer_outputs = layer_module( 2025-08-14T21:52:24.0329992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0330118Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0330417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0330544Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0330851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0330946Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0330953Z 2025-08-14T21:52:24.0331046Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331132Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331213Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331301Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331381Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331462Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331550Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331633Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0331748Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0331974Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0332048Z return mod(**inputs) 2025-08-14T21:52:24.0332424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0332506Z outputs = self.mobilebert( 2025-08-14T21:52:24.0332804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0332895Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0333201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0333280Z layer_outputs = layer_module( 2025-08-14T21:52:24.0333610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0333741Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0334037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0334130Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0334135Z 2025-08-14T21:52:24.0334214Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0334323Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0334403Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0334482Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0334604Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0334819Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0334899Z return mod(**inputs) 2025-08-14T21:52:24.0335198Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0335276Z outputs = self.mobilebert( 2025-08-14T21:52:24.0335577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0335658Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0335955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0336032Z layer_outputs = layer_module( 2025-08-14T21:52:24.0336333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0336516Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0336811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0336949Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0337249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0337343Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0337348Z 2025-08-14T21:52:24.0337436Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0337516Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0337625Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0337845Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0337916Z return mod(**inputs) 2025-08-14T21:52:24.0338219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0338299Z outputs = self.mobilebert( 2025-08-14T21:52:24.0338616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0338701Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0339034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0339112Z layer_outputs = layer_module( 2025-08-14T21:52:24.0339413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0339584Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0339887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0340025Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0340317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0340413Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0340418Z 2025-08-14T21:52:24.0340499Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0340587Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0340697Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0340928Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0341006Z return mod(**inputs) 2025-08-14T21:52:24.0341305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0341383Z outputs = self.mobilebert( 2025-08-14T21:52:24.0341700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0341780Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0342096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0342175Z layer_outputs = layer_module( 2025-08-14T21:52:24.0342484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0342599Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0342887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0342964Z self_outputs = self.self( 2025-08-14T21:52:24.0343261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0343337Z self.query(query_tensor) 2025-08-14T21:52:24.0343341Z 2025-08-14T21:52:24.0343428Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343508Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343586Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343678Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343757Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343836Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0343924Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344004Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344092Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344170Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344247Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344334Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344412Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344490Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344580Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344660Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0344768Z cudagraph partition due to non gpu ops. 
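The "Found from :" stacks above are Inductor's cudagraph-partition diagnostics: each one walks from the benchmark harness entry point (benchmarks/dynamo/huggingface.py, line 532, forward_pass -> mod(**inputs)) down into the transformers MobileBERT modules (bottleneck, attention, FFN, intermediate and output blocks) whose nn.Linear calls sit next to the op that triggered a partition. The following is a minimal sketch of the call path being compiled; it only assumes torch and transformers are importable, and it is not the benchmark harness itself (the CI shards additionally apply freezing/AMP settings not shown here).

    import torch
    from transformers import MobileBertConfig, MobileBertModel

    # Sketch of the compiled forward pass the stacks point into; the default
    # config, the input shapes, and the plain torch.compile() call are
    # assumptions, not the benchmark's actual flags.
    config = MobileBertConfig()
    model = MobileBertModel(config).eval()
    compiled = torch.compile(model)  # Inductor backend, as in these CI shards

    inputs = {
        "input_ids": torch.randint(0, config.vocab_size, (2, 128)),
        "attention_mask": torch.ones(2, 128, dtype=torch.long),
    }
    with torch.no_grad():
        out = compiled(**inputs)  # corresponds to mod(**inputs) in forward_pass
    print(out.last_hidden_state.shape)  # (batch, seq_len, config.hidden_size)

Running a sketch like this under torch.compile exercises the same MobileBERT forward methods (modeling_mobilebert.py lines 245, 359, 372, 409, 440, 469, 496, 500, 515, 518, 519) that the stacks above repeatedly attribute the partitions to.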
Found from : 2025-08-14T21:52:24.0344983Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0345145Z return mod(**inputs) 2025-08-14T21:52:24.0345454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0345532Z outputs = self.mobilebert( 2025-08-14T21:52:24.0345825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0345913Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0346205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0346296Z layer_outputs = layer_module( 2025-08-14T21:52:24.0346597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0346701Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0347002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0347122Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0347437Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0347535Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0347539Z 2025-08-14T21:52:24.0347618Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0347705Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0347782Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0347861Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0347948Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0348027Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0348106Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0348196Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0348308Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0348514Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0348590Z return mod(**inputs) 2025-08-14T21:52:24.0348889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0348971Z outputs = self.mobilebert( 2025-08-14T21:52:24.0349262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0349339Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0349679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0349758Z layer_outputs = layer_module( 2025-08-14T21:52:24.0350058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0350186Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0350476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0350574Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0350578Z 2025-08-14T21:52:24.0350657Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0350736Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0350822Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0350902Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0351017Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0351220Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0351290Z return mod(**inputs) 2025-08-14T21:52:24.0351626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0351702Z outputs = self.mobilebert( 2025-08-14T21:52:24.0351994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0352081Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0352373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0352457Z layer_outputs = layer_module( 2025-08-14T21:52:24.0352765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0352935Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0353237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0353372Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0353670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0353781Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0353785Z 2025-08-14T21:52:24.0353865Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0353952Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0354031Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0354112Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0354231Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0354435Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0354513Z return mod(**inputs) 2025-08-14T21:52:24.0354810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0354886Z outputs = self.mobilebert( 2025-08-14T21:52:24.0355183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0355260Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0355549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0355632Z layer_outputs = layer_module( 2025-08-14T21:52:24.0355950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0356052Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0356368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0356449Z self_outputs = self.self( 2025-08-14T21:52:24.0356744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0356821Z self.query(query_tensor) 2025-08-14T21:52:24.0356825Z 2025-08-14T21:52:24.0356912Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0356992Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357071Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357157Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357235Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357314Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357401Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357480Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357557Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357643Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357754Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357848Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0357938Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0358019Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0358108Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0358188Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0358298Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0358512Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0358583Z return mod(**inputs) 2025-08-14T21:52:24.0358895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0358980Z outputs = self.mobilebert( 2025-08-14T21:52:24.0359300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0359389Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0359678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0359775Z layer_outputs = layer_module( 2025-08-14T21:52:24.0360083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0360191Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0360514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0360647Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0360940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0361045Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0361050Z 2025-08-14T21:52:24.0361135Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361218Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361311Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361395Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361484Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361568Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361649Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361738Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0361848Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0362057Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0362137Z return mod(**inputs) 2025-08-14T21:52:24.0362441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0362522Z outputs = self.mobilebert( 2025-08-14T21:52:24.0362822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0362905Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0363203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0363281Z layer_outputs = layer_module( 2025-08-14T21:52:24.0363582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0363727Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0364042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0364158Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0364162Z 2025-08-14T21:52:24.0364260Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0364345Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0364433Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0364516Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0364627Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0364844Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0364914Z return mod(**inputs) 2025-08-14T21:52:24.0365239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0365317Z outputs = self.mobilebert( 2025-08-14T21:52:24.0365616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0365709Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0366009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0366086Z layer_outputs = layer_module( 2025-08-14T21:52:24.0366425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0366597Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0366907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0367044Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0367347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0367449Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0367455Z 2025-08-14T21:52:24.0367539Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0367631Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0367743Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0367956Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0368035Z return mod(**inputs) 2025-08-14T21:52:24.0368341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0368418Z outputs = self.mobilebert( 2025-08-14T21:52:24.0368815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0368908Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0369218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0369299Z layer_outputs = layer_module( 2025-08-14T21:52:24.0369598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0369785Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0370088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0370217Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0370512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0370599Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0370603Z 2025-08-14T21:52:24.0370706Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0370782Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0371498Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0371704Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0371771Z return mod(**inputs) 2025-08-14T21:52:24.0372061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0372133Z outputs = self.mobilebert( 2025-08-14T21:52:24.0372409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0372499Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0372818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0372896Z layer_outputs = layer_module( 2025-08-14T21:52:24.0373210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0373302Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0373601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0373702Z self_outputs = self.self( 2025-08-14T21:52:24.0374008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0374092Z self.query(query_tensor) 2025-08-14T21:52:24.0374095Z 2025-08-14T21:52:24.0374176Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374265Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374345Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374435Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374519Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374597Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374674Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374756Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374831Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374907Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0374990Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375066Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375148Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375223Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375300Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375380Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0375485Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0375686Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0375767Z return mod(**inputs) 2025-08-14T21:52:24.0376068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0376146Z outputs = self.mobilebert( 2025-08-14T21:52:24.0376448Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0376528Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0376839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0376914Z layer_outputs = layer_module( 2025-08-14T21:52:24.0377222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0377333Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0377638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0377801Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0378092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0378180Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0378184Z 2025-08-14T21:52:24.0378273Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378350Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378429Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378513Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378592Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378695Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378775Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378853Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0378965Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0379174Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0379243Z return mod(**inputs) 2025-08-14T21:52:24.0379545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0379640Z outputs = self.mobilebert( 2025-08-14T21:52:24.0379953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0380031Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0380336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0380418Z layer_outputs = layer_module( 2025-08-14T21:52:24.0380724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0380852Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0381164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0381254Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0381258Z 2025-08-14T21:52:24.0381346Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0381425Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0381504Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0381594Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0381703Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0381912Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0381992Z return mod(**inputs) 2025-08-14T21:52:24.0382298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0382383Z outputs = self.mobilebert( 2025-08-14T21:52:24.0382674Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0382754Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0383055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0383128Z layer_outputs = layer_module( 2025-08-14T21:52:24.0383428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0383601Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0383892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0384051Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0384368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0384460Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0384463Z 2025-08-14T21:52:24.0384551Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0384861Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0384946Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0385024Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0385130Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0385359Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0385432Z return mod(**inputs) 2025-08-14T21:52:24.0385727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0385813Z outputs = self.mobilebert( 2025-08-14T21:52:24.0386105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0386211Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0386503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0386577Z layer_outputs = layer_module( 2025-08-14T21:52:24.0386874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0386966Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0387262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0387336Z self_outputs = self.self( 2025-08-14T21:52:24.0387628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0387709Z self.query(query_tensor) 2025-08-14T21:52:24.0387713Z 2025-08-14T21:52:24.0387792Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0387872Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0387961Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388037Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388123Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388201Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388279Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388365Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388443Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388521Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388606Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388685Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388764Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388854Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0388932Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0389009Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0389123Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0389328Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0389403Z return mod(**inputs) 2025-08-14T21:52:24.0389696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0389772Z outputs = self.mobilebert( 2025-08-14T21:52:24.0390068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0390142Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0390470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0390545Z layer_outputs = layer_module( 2025-08-14T21:52:24.0390837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0390947Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0391239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0391357Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0391681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0391768Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0391772Z 2025-08-14T21:52:24.0391860Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0391939Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392017Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392104Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392181Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392280Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392369Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392446Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0392560Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0392765Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0392835Z return mod(**inputs) 2025-08-14T21:52:24.0393140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0393214Z outputs = self.mobilebert( 2025-08-14T21:52:24.0393508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0393594Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0393884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0393967Z layer_outputs = layer_module( 2025-08-14T21:52:24.0394259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0394384Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0394686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0394772Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0394776Z 2025-08-14T21:52:24.0394863Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0394944Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0395026Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0395111Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0395217Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0395426Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0395506Z return mod(**inputs) 2025-08-14T21:52:24.0395802Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0395878Z outputs = self.mobilebert( 2025-08-14T21:52:24.0396189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0396266Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0396570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0396685Z layer_outputs = layer_module( 2025-08-14T21:52:24.0396976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0397151Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0397449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0397587Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0397903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0397993Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0397997Z 2025-08-14T21:52:24.0398086Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0398167Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0398277Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0398490Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0398608Z return mod(**inputs) 2025-08-14T21:52:24.0398924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0398997Z outputs = self.mobilebert( 2025-08-14T21:52:24.0399285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0399372Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0399673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0399755Z layer_outputs = layer_module( 2025-08-14T21:52:24.0400060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0400227Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0400527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0400641Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0400943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0401039Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0401045Z 2025-08-14T21:52:24.0401123Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0401208Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0401315Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0401523Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0401602Z return mod(**inputs) 2025-08-14T21:52:24.0401897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0401982Z outputs = self.mobilebert( 2025-08-14T21:52:24.0402287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0402364Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0402836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0402918Z layer_outputs = layer_module( 2025-08-14T21:52:24.0403207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0403363Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0403692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0403776Z self_outputs = self.self( 2025-08-14T21:52:24.0404072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0404148Z self.query(query_tensor) 2025-08-14T21:52:24.0404152Z 2025-08-14T21:52:24.0404244Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404328Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404413Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404530Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404616Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404710Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404796Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404883Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0404976Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405062Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405148Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405274Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405356Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405437Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405525Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405605Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0405724Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0405939Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0406008Z return mod(**inputs) 2025-08-14T21:52:24.0406314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0406391Z outputs = self.mobilebert( 2025-08-14T21:52:24.0406691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0406777Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0407079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0407160Z layer_outputs = layer_module( 2025-08-14T21:52:24.0407456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0407562Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0407869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0407990Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0408293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0408392Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0408396Z 2025-08-14T21:52:24.0408480Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0408569Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0408651Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0408792Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0408889Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0408969Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0409050Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0409142Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0409254Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0409474Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0409567Z return mod(**inputs) 2025-08-14T21:52:24.0409896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0409986Z outputs = self.mobilebert( 2025-08-14T21:52:24.0410288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0410368Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0410673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0410743Z layer_outputs = layer_module( 2025-08-14T21:52:24.0411043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0411162Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0411436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0411531Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0411535Z 2025-08-14T21:52:24.0411610Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0411714Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0411789Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0411866Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0411977Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0412173Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0412240Z return mod(**inputs) 2025-08-14T21:52:24.0412527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0412597Z outputs = self.mobilebert( 2025-08-14T21:52:24.0412893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0412970Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0413258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0413340Z layer_outputs = layer_module( 2025-08-14T21:52:24.0413631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0413796Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0414097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0414231Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0414536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0414628Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0414632Z 2025-08-14T21:52:24.0414714Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0414805Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0414885Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0414973Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0415085Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0415296Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0415373Z return mod(**inputs) 2025-08-14T21:52:24.0415689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0415764Z outputs = self.mobilebert( 2025-08-14T21:52:24.0416088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0416194Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0416499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0416577Z layer_outputs = layer_module( 2025-08-14T21:52:24.0416882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0416980Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0417302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0417380Z self_outputs = self.self( 2025-08-14T21:52:24.0417694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0417772Z self.query(query_tensor) 2025-08-14T21:52:24.0417777Z 2025-08-14T21:52:24.0417868Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0417949Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418029Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418139Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418218Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418298Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418385Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418465Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418553Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418635Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418716Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418804Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418884Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0418963Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0419053Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0419134Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0419244Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0419463Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0419534Z return mod(**inputs) 2025-08-14T21:52:24.0419849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0419924Z outputs = self.mobilebert( 2025-08-14T21:52:24.0420232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0420319Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0420625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0420702Z layer_outputs = layer_module( 2025-08-14T21:52:24.0421017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0421121Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0421425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0421545Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0421850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0421947Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0421951Z 2025-08-14T21:52:24.0422032Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422119Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422224Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422321Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422409Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422490Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422573Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422662Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0422775Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0422988Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0423067Z return mod(**inputs) 2025-08-14T21:52:24.0423398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0423484Z outputs = self.mobilebert( 2025-08-14T21:52:24.0423783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0423864Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0424175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0424253Z layer_outputs = layer_module( 2025-08-14T21:52:24.0424573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0424708Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0425018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0425116Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0425120Z 2025-08-14T21:52:24.0425202Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0425284Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0425374Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0425457Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0425576Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0425786Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0425860Z return mod(**inputs) 2025-08-14T21:52:24.0426170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0426248Z outputs = self.mobilebert( 2025-08-14T21:52:24.0426546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0426632Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0426932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0427015Z layer_outputs = layer_module( 2025-08-14T21:52:24.0427317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0427485Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0427793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0427927Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0428231Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0428324Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0428328Z 2025-08-14T21:52:24.0428410Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0428500Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0428610Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:52:24.0428861Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0428942Z     return mod(**inputs)
2025-08-14T21:52:24.0429247Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0429332Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0429631Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0429710Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0430029Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0430108Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0430402Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward
2025-08-14T21:52:24.0430586Z     query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states)
2025-08-14T21:52:24.0430882Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward
2025-08-14T21:52:24.0431034Z     bottlenecked_hidden_states = self.input(hidden_states)
2025-08-14T21:52:24.0431333Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward
2025-08-14T21:52:24.0431422Z     layer_input = self.dense(hidden_states)
2025-08-14T21:52:24.0431426Z
2025-08-14T21:52:24.0431515Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0431598Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0431715Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0431926Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0431999Z     return mod(**inputs)
2025-08-14T21:52:24.0432307Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0432384Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0432684Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0432771Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0433070Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0433160Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0433461Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward
2025-08-14T21:52:24.0433553Z     self_attention_outputs = self.attention(
2025-08-14T21:52:24.0433862Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward
2025-08-14T21:52:24.0433940Z     self_outputs = self.self(
2025-08-14T21:52:24.0434244Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward
2025-08-14T21:52:24.0434321Z     self.query(query_tensor)
2025-08-14T21:52:24.0434325Z
2025-08-14T21:52:24.0434408Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434495Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434575Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434671Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434752Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434832Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434918Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0434997Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435077Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435202Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435309Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435392Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435474Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435565Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435645Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435726Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0435846Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0436056Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0436134Z     return mod(**inputs)
2025-08-14T21:52:24.0436463Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0436542Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0436852Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0436935Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0437243Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0437347Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0437651Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward
2025-08-14T21:52:24.0437760Z     attention_output = ffn_module(attention_output)
2025-08-14T21:52:24.0438065Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward
2025-08-14T21:52:24.0438182Z     intermediate_output = self.intermediate(hidden_states)
2025-08-14T21:52:24.0438490Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:52:24.0438581Z     hidden_states = self.dense(hidden_states)
2025-08-14T21:52:24.0438585Z
2025-08-14T21:52:24.0438671Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0438750Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0438829Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0438916Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0438994Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0439072Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0439158Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0439237Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0439342Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0439555Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0439622Z     return mod(**inputs)
2025-08-14T21:52:24.0439926Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0440003Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0440288Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0440374Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0440674Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0440755Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0441059Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward
2025-08-14T21:52:24.0441183Z     intermediate_output = self.intermediate(attention_output)
2025-08-14T21:52:24.0441485Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:52:24.0441603Z     hidden_states = self.dense(hidden_states)
2025-08-14T21:52:24.0441623Z
2025-08-14T21:52:24.0441706Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0441792Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0441869Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0441956Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0442063Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0442266Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0442343Z     return mod(**inputs)
2025-08-14T21:52:24.0442660Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0442737Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0443043Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0443122Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0443422Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0443499Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0443856Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward
2025-08-14T21:52:24.0444029Z     layer_output = self.output(intermediate_output, attention_output, hidden_states)
2025-08-14T21:52:24.0444325Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward
2025-08-14T21:52:24.0444467Z     layer_output = self.bottleneck(layer_output, residual_tensor_2)
2025-08-14T21:52:24.0444767Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward
2025-08-14T21:52:24.0444859Z     layer_outputs = self.dense(hidden_states)
2025-08-14T21:52:24.0444863Z
2025-08-14T21:52:24.0444955Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0445038Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0445119Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0445212Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0445323Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0445544Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0445614Z return mod(**inputs) 2025-08-14T21:52:24.0445925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0446012Z outputs = self.mobilebert( 2025-08-14T21:52:24.0446312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0446395Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0446709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0446786Z layer_outputs = layer_module( 2025-08-14T21:52:24.0447094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0447186Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0447486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0447573Z self_outputs = self.self( 2025-08-14T21:52:24.0447875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0447961Z self.query(query_tensor) 2025-08-14T21:52:24.0447964Z 2025-08-14T21:52:24.0448070Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448167Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448257Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448338Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448420Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448508Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448590Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448671Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448831Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0448923Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449015Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449122Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449207Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449299Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449380Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449466Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0449590Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0449807Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0449900Z return mod(**inputs) 2025-08-14T21:52:24.0450217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0450294Z outputs = self.mobilebert( 2025-08-14T21:52:24.0450609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0450686Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0450979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0451064Z layer_outputs = layer_module( 2025-08-14T21:52:24.0451354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0451464Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0451760Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0451883Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0452190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0452280Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0452284Z 2025-08-14T21:52:24.0452368Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452459Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452540Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452628Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452709Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452793Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452881Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0452961Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0453073Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0453292Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0453364Z return mod(**inputs) 2025-08-14T21:52:24.0453663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0453746Z outputs = self.mobilebert( 2025-08-14T21:52:24.0454047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0454130Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0454458Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0454563Z layer_outputs = layer_module( 2025-08-14T21:52:24.0454868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0454999Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0455305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0455394Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0455398Z 2025-08-14T21:52:24.0455497Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0455587Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0455667Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0455748Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0455864Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0456080Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0456157Z return mod(**inputs) 2025-08-14T21:52:24.0456462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0456557Z outputs = self.mobilebert( 2025-08-14T21:52:24.0456866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0456944Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0457242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0457327Z layer_outputs = layer_module( 2025-08-14T21:52:24.0457623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0457805Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0458101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0458237Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0458552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0458642Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0458646Z 2025-08-14T21:52:24.0458735Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0458820Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0458932Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0459150Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0459224Z return mod(**inputs) 2025-08-14T21:52:24.0459529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0459612Z outputs = self.mobilebert( 2025-08-14T21:52:24.0459911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0459996Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0460293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0460369Z layer_outputs = layer_module( 2025-08-14T21:52:24.0460679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0460853Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0461267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0461390Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0461702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0461800Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0461804Z 2025-08-14T21:52:24.0461886Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0461968Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0462087Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0462317Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0462397Z return mod(**inputs) 2025-08-14T21:52:24.0462703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0462785Z outputs = self.mobilebert( 2025-08-14T21:52:24.0463097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0463209Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0463526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0463601Z layer_outputs = layer_module( 2025-08-14T21:52:24.0463903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0464009Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0464320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0464398Z self_outputs = self.self( 2025-08-14T21:52:24.0464710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0464790Z self.query(query_tensor) 2025-08-14T21:52:24.0464794Z 2025-08-14T21:52:24.0464886Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0464967Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465047Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465132Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465213Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465293Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465380Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465464Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465546Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465632Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465713Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465803Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465884Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0465964Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0466049Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0466131Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0466241Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0466464Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0466536Z return mod(**inputs) 2025-08-14T21:52:24.0466848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0466927Z outputs = self.mobilebert( 2025-08-14T21:52:24.0467226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0467313Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0467661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0467740Z layer_outputs = layer_module( 2025-08-14T21:52:24.0468049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0468152Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0468467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0468587Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0468897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0469001Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0469006Z 2025-08-14T21:52:24.0469089Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469179Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469261Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469341Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469447Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469528Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469609Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469695Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0469805Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0470015Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0470096Z return mod(**inputs) 2025-08-14T21:52:24.0470399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0470483Z outputs = self.mobilebert( 2025-08-14T21:52:24.0470795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0470874Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0471184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0471261Z layer_outputs = layer_module( 2025-08-14T21:52:24.0471555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0471690Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0471994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0472089Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0472093Z 2025-08-14T21:52:24.0472176Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0472257Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0472346Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0472425Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0472538Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0472758Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0472829Z return mod(**inputs) 2025-08-14T21:52:24.0473148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0473225Z outputs = self.mobilebert( 2025-08-14T21:52:24.0473526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0473616Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0473936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0474037Z layer_outputs = layer_module( 2025-08-14T21:52:24.0474338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0474510Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0474820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0474954Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0475273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0475382Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0475386Z 2025-08-14T21:52:24.0475467Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0475556Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0475637Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0475715Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0475829Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0476063Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0476133Z return mod(**inputs) 2025-08-14T21:52:24.0476435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0476512Z outputs = self.mobilebert( 2025-08-14T21:52:24.0476817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0476896Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0477204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0477288Z layer_outputs = layer_module( 2025-08-14T21:52:24.0477576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0477674Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0477966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0478039Z self_outputs = self.self( 2025-08-14T21:52:24.0478336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0478412Z self.query(query_tensor) 2025-08-14T21:52:24.0478415Z 2025-08-14T21:52:24.0478498Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0478585Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0478668Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0478757Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0478837Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0478917Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479019Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479096Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479174Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479260Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479338Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479416Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479502Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479583Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479669Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479747Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0479855Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0480109Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0480180Z return mod(**inputs) 2025-08-14T21:52:24.0480473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0480560Z outputs = self.mobilebert( 2025-08-14T21:52:24.0480849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0480935Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0481264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0481342Z layer_outputs = layer_module( 2025-08-14T21:52:24.0481650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0481757Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0482058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0482190Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0482518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0482615Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0482618Z 2025-08-14T21:52:24.0482702Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0482784Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0482876Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0482958Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0483039Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0483128Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0483209Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0483298Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0483407Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0483618Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0483696Z return mod(**inputs) 2025-08-14T21:52:24.0483999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0484073Z outputs = self.mobilebert( 2025-08-14T21:52:24.0484382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0484461Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0484765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0484844Z layer_outputs = layer_module( 2025-08-14T21:52:24.0485144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0485279Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0485576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0485672Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0485676Z 2025-08-14T21:52:24.0485757Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0485838Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0485927Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0486007Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0486116Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0486333Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0486431Z return mod(**inputs) 2025-08-14T21:52:24.0486751Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0486834Z outputs = self.mobilebert( 2025-08-14T21:52:24.0487131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0487217Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0487510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0487586Z layer_outputs = layer_module( 2025-08-14T21:52:24.0487915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0488087Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0488397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0488533Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0489188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0489295Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0489300Z 2025-08-14T21:52:24.0489384Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0489477Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0489593Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0489806Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0489887Z return mod(**inputs) 2025-08-14T21:52:24.0490194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0490276Z outputs = self.mobilebert( 2025-08-14T21:52:24.0490587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0490669Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0490976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0491055Z layer_outputs = layer_module( 2025-08-14T21:52:24.0491356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0491537Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0491840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0491964Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0492273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0492363Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0492367Z 2025-08-14T21:52:24.0492460Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0492540Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0492651Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0492871Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0492946Z return mod(**inputs) 2025-08-14T21:52:24.0493259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0493345Z outputs = self.mobilebert( 2025-08-14T21:52:24.0493689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0493778Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0494073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0494152Z layer_outputs = layer_module( 2025-08-14T21:52:24.0494461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0494552Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0494892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0494970Z self_outputs = self.self( 2025-08-14T21:52:24.0495268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0495356Z self.query(query_tensor) 2025-08-14T21:52:24.0495361Z 2025-08-14T21:52:24.0495446Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495535Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495639Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495720Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495810Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495890Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0495971Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496060Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496141Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496223Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496312Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496394Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496475Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496567Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496650Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496737Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0496847Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0497058Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0497136Z return mod(**inputs) 2025-08-14T21:52:24.0497441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0497518Z outputs = self.mobilebert( 2025-08-14T21:52:24.0497828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0497907Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0498221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0498300Z layer_outputs = layer_module( 2025-08-14T21:52:24.0498599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0498718Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0499019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0499143Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0499436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0499525Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0499529Z 2025-08-14T21:52:24.0499621Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0499703Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0499811Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0499921Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0500005Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0500084Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0500173Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0500253Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0500369Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0500579Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0500648Z return mod(**inputs) 2025-08-14T21:52:24.0500979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0501057Z outputs = self.mobilebert( 2025-08-14T21:52:24.0501367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0501458Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0501759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0501869Z layer_outputs = layer_module( 2025-08-14T21:52:24.0502173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0502299Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0502863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0502970Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0502974Z 2025-08-14T21:52:24.0503065Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0503148Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0503234Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0503328Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0503445Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0503658Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0503741Z return mod(**inputs) 2025-08-14T21:52:24.0504050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0504137Z outputs = self.mobilebert( 2025-08-14T21:52:24.0504435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0504517Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0504829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0504905Z layer_outputs = layer_module( 2025-08-14T21:52:24.0505197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0505380Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0505691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0505836Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0506134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0506226Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0506230Z 2025-08-14T21:52:24.0506321Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0506405Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0506495Z cudagraph partition due to non gpu ops 
2025-08-14T21:52:24.0506649Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0506787Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0507004Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0507074Z return mod(**inputs) 2025-08-14T21:52:24.0507367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0507450Z outputs = self.mobilebert( 2025-08-14T21:52:24.0507739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0507849Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0508143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0508219Z layer_outputs = layer_module( 2025-08-14T21:52:24.0508523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0508614Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0508936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0509017Z self_outputs = self.self( 2025-08-14T21:52:24.0509306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0509386Z self.query(query_tensor) 2025-08-14T21:52:24.0509389Z 2025-08-14T21:52:24.0509471Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509551Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509635Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509714Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509794Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509880Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0509957Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510044Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510123Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510200Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510284Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510362Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510440Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510526Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510604Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510682Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0510796Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0511002Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0511081Z return mod(**inputs) 2025-08-14T21:52:24.0511376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0511449Z outputs = self.mobilebert( 2025-08-14T21:52:24.0511749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0511828Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0512126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0512210Z layer_outputs = layer_module( 2025-08-14T21:52:24.0512510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0512619Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0512939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0513100Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0513397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0513486Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0513490Z 2025-08-14T21:52:24.0513576Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0513655Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0513732Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0513815Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0513907Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0513988Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0514075Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0514153Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0514262Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0514478Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0514547Z return mod(**inputs) 2025-08-14T21:52:24.0514866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0514940Z outputs = self.mobilebert( 2025-08-14T21:52:24.0515233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0515320Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0515615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0515698Z layer_outputs = layer_module( 2025-08-14T21:52:24.0515993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward 2025-08-14T21:52:24.0516126Z intermediate_output = self.intermediate(attention_output) 2025-08-14T21:52:24.0516428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0516519Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0516522Z 2025-08-14T21:52:24.0516603Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0516692Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0516772Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0516858Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0516971Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:52:24.0517181Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0517261Z return mod(**inputs) 2025-08-14T21:52:24.0517564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0517641Z outputs = self.mobilebert( 2025-08-14T21:52:24.0517946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0518029Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0518340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0518414Z layer_outputs = layer_module( 2025-08-14T21:52:24.0518700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward 2025-08-14T21:52:24.0518878Z layer_output = self.output(intermediate_output, attention_output, hidden_states) 2025-08-14T21:52:24.0519177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward 2025-08-14T21:52:24.0519354Z layer_output = self.bottleneck(layer_output, residual_tensor_2) 2025-08-14T21:52:24.0519656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward 2025-08-14T21:52:24.0519750Z layer_outputs = self.dense(hidden_states) 2025-08-14T21:52:24.0519754Z 2025-08-14T21:52:24.0519845Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0519927Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0520038Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0520258Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0520349Z return mod(**inputs) 2025-08-14T21:52:24.0520659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0520735Z outputs = self.mobilebert( 2025-08-14T21:52:24.0521035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0521122Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0521447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0521524Z layer_outputs = layer_module( 2025-08-14T21:52:24.0521829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 496, in forward 2025-08-14T21:52:24.0522000Z query_tensor, key_tensor, value_tensor, layer_input = self.bottleneck(hidden_states) 2025-08-14T21:52:24.0522309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 440, in forward 2025-08-14T21:52:24.0522428Z bottlenecked_hidden_states = self.input(hidden_states) 2025-08-14T21:52:24.0522732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 409, in forward 2025-08-14T21:52:24.0522830Z layer_input = self.dense(hidden_states) 2025-08-14T21:52:24.0522834Z 2025-08-14T21:52:24.0522918Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0523005Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0523116Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0523326Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0523405Z return mod(**inputs) 2025-08-14T21:52:24.0523708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0523784Z outputs = self.mobilebert( 2025-08-14T21:52:24.0524092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0524171Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0524480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0524558Z layer_outputs = layer_module( 2025-08-14T21:52:24.0524859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 500, in forward 2025-08-14T21:52:24.0524960Z self_attention_outputs = self.attention( 2025-08-14T21:52:24.0525260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 334, in forward 2025-08-14T21:52:24.0525344Z self_outputs = self.self( 2025-08-14T21:52:24.0525644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 245, in forward 2025-08-14T21:52:24.0525720Z self.query(query_tensor) 2025-08-14T21:52:24.0525748Z 2025-08-14T21:52:24.0525840Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0525940Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526024Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526112Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526196Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526277Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526367Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526449Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526537Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526616Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526695Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526800Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526883Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0526962Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0527050Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0527132Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0527244Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:52:24.0527461Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:52:24.0527554Z return mod(**inputs) 2025-08-14T21:52:24.0527868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward 2025-08-14T21:52:24.0527945Z outputs = self.mobilebert( 2025-08-14T21:52:24.0528244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward 2025-08-14T21:52:24.0528334Z encoder_outputs = self.encoder( 2025-08-14T21:52:24.0528635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward 2025-08-14T21:52:24.0528795Z layer_outputs = layer_module( 2025-08-14T21:52:24.0529114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 515, in forward 2025-08-14T21:52:24.0529219Z attention_output = ffn_module(attention_output) 2025-08-14T21:52:24.0529529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 469, in forward 2025-08-14T21:52:24.0529654Z intermediate_output = self.intermediate(hidden_states) 2025-08-14T21:52:24.0529955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward 2025-08-14T21:52:24.0530053Z hidden_states = self.dense(hidden_states) 2025-08-14T21:52:24.0530057Z 2025-08-14T21:52:24.0530142Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530235Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530317Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530398Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530490Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530573Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530654Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530782Z cudagraph partition due to non gpu ops 2025-08-14T21:52:24.0530919Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:52:24.0531220Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0531297Z     return mod(**inputs)
2025-08-14T21:52:24.0531645Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0531760Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0532247Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0532332Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0532692Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0532771Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0533076Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 518, in forward
2025-08-14T21:52:24.0533207Z     intermediate_output = self.intermediate(attention_output)
2025-08-14T21:52:24.0533508Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 359, in forward
2025-08-14T21:52:24.0533608Z     hidden_states = self.dense(hidden_states)
2025-08-14T21:52:24.0533612Z 
2025-08-14T21:52:24.0533717Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0533802Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0533890Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0533973Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0534095Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0534308Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0534378Z     return mod(**inputs)
2025-08-14T21:52:24.0534688Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1242, in forward
2025-08-14T21:52:24.0534789Z     outputs = self.mobilebert(
2025-08-14T21:52:24.0535094Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 794, in forward
2025-08-14T21:52:24.0535183Z     encoder_outputs = self.encoder(
2025-08-14T21:52:24.0535488Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 557, in forward
2025-08-14T21:52:24.0535574Z     layer_outputs = layer_module(
2025-08-14T21:52:24.0535876Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 519, in forward
2025-08-14T21:52:24.0536051Z     layer_output = self.output(intermediate_output, attention_output, hidden_states)
2025-08-14T21:52:24.0536370Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 398, in forward
2025-08-14T21:52:24.0536504Z     layer_output = self.bottleneck(layer_output, residual_tensor_2)
2025-08-14T21:52:24.0536804Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 372, in forward
2025-08-14T21:52:24.0536891Z     layer_outputs = self.dense(hidden_states)
2025-08-14T21:52:24.0536895Z 
2025-08-14T21:52:24.0536976Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0537066Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0537174Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:52:24.0537380Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0537459Z     return mod(**inputs)
2025-08-14T21:52:24.0537757Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1256, in forward
2025-08-14T21:52:24.0537860Z     logits = self.qa_outputs(sequence_output)
2025-08-14T21:52:24.0537864Z 
2025-08-14T21:52:24.0537944Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0538024Z cudagraph partition due to non gpu ops
2025-08-14T21:52:24.0538138Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0538343Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0538412Z     return mod(**inputs)
2025-08-14T21:52:24.0538720Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1274, in forward
2025-08-14T21:52:24.0538834Z     start_loss = loss_fct(start_logits, start_positions)
2025-08-14T21:52:24.0538875Z 
2025-08-14T21:52:24.0538989Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:52:24.0539213Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:52:24.0539284Z     return mod(**inputs)
2025-08-14T21:52:24.0539590Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1275, in forward
2025-08-14T21:52:24.0539687Z     end_loss = loss_fct(end_logits, end_positions)
2025-08-14T21:52:24.0539690Z 
2025-08-14T21:52:37.7104499Z Compilation time (from dynamo_timed): 52.510539978
2025-08-14T21:52:37.7105047Z pass
2025-08-14T21:52:37.7106066Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:52:37.7106919Z TIMING: _recursive_pre_grad_passes:0.17348 _recursive_joint_graph_passes:1.63203 _recursive_post_grad_passes:0.27059 async_compile.wait:0.40377 code_gen:10.78106 inductor_compile:14.92131 backend_compile:39.31524 gc:0.00088 entire_frame_compile:52.51054 total_wall_time:52.51054
2025-08-14T21:52:37.7108022Z STATS: call_* op count: 1455 | FakeTensorMode.__torch_dispatch__:114670 | FakeTensor.__torch_dispatch__:10830 | ProxyTorchDispatchMode.__torch_dispatch__:31013
2025-08-14T21:52:37.7108650Z Dynamo produced 1 graphs covering 1455 ops with 0 graph breaks (0 unique)
2025-08-14T21:52:44.7239827Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:52:44.7240847Z   from pkg_resources import resource_filename
2025-08-14T21:52:45.3609183Z 
2025-08-14T21:52:47.1867165Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:52:47.1868204Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:52:47.1869009Z cpu eval OPTForCausalLM
2025-08-14T21:52:49.0939226Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:52:49.5472865Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:52:50.0244211Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:01.1345098Z cudagraph partition due to non gpu ops
[... the same "cudagraph partition due to non gpu ops" record repeated 21 more times ...]
2025-08-14T21:53:01.1352766Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:53:01.1353222Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1354037Z     return mod(**inputs)
2025-08-14T21:53:01.1354425Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1354815Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1355216Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1355692Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1356120Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1356506Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1356988Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1357398Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1357775Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1358163Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1358571Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward
2025-08-14T21:53:01.1359065Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:53:01.1359536Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward
2025-08-14T21:53:01.1359974Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:53:01.1360458Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:53:01.1360979Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:53:01.1361180Z 
2025-08-14T21:53:01.1361309Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:53:01.1361715Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1362076Z     return mod(**inputs)
2025-08-14T21:53:01.1362429Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1362854Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1363277Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1363697Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1364085Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1364463Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1364881Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1365298Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1365677Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1366076Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1366484Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward
2025-08-14T21:53:01.1366918Z     hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:53:01.1367356Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward
2025-08-14T21:53:01.1367798Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:53:01.1368282Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:53:01.1369071Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:53:01.1369259Z 
2025-08-14T21:53:01.1369352Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1369630Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1369919Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:53:01.1370314Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1370691Z     return mod(**inputs)
2025-08-14T21:53:01.1371059Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1371447Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1371845Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1372255Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1372654Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1373041Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1373453Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1373882Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1374274Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1374687Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1375098Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward
2025-08-14T21:53:01.1375537Z     hidden_states = self.activation_fn(hidden_states)
2025-08-14T21:53:01.1375705Z 
2025-08-14T21:53:01.1375815Z cudagraph partition due to non gpu ops. Found from :
[... 2025-08-14T21:53:01.1376211Z–21:53:01.1381353Z: an identical traceback, again ending at self.activation_fn (modeling_opt.py, line 286) ...]
2025-08-14T21:53:01.1381521Z 
2025-08-14T21:53:01.1381634Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:53:01.1382033Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1382388Z     return mod(**inputs)
2025-08-14T21:53:01.1382740Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1383127Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1383536Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1383946Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1384314Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1384720Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1385143Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1385541Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1385916Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1386306Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1386712Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward
2025-08-14T21:53:01.1387116Z     hidden_states = self.fc2(hidden_states)
2025-08-14T21:53:01.1387269Z 
2025-08-14T21:53:01.1387401Z cudagraph partition due to non gpu ops. Found from :
[... 2025-08-14T21:53:01.1387794Z–21:53:01.1392885Z: an identical traceback, again ending at self.fc2 (modeling_opt.py, line 288) ...]
2025-08-14T21:53:01.1393040Z 
2025-08-14T21:53:01.1393129Z cudagraph partition due to non gpu ops
[... the same "cudagraph partition due to non gpu ops" record repeated 9 more times ...]
2025-08-14T21:53:01.1395538Z cudagraph partition due to non gpu ops. Found from :
[... 2025-08-14T21:53:01.1395936Z–21:53:01.1403228Z: an identical traceback to the one at 21:53:01.1353222Z above, ending at torch.nn.functional.scaled_dot_product_attention (sdpa_attention.py, line 81) ...]
2025-08-14T21:53:01.1403434Z 
2025-08-14T21:53:01.1403560Z cudagraph partition due to non gpu ops.
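[Note, not part of the job output: "cudagraph partition due to non gpu ops" is an Inductor diagnostic reporting that graph partitioning for CUDA-graph capture had to split the compiled graph around operations that do not run on the GPU; the accompanying "Found from :" traceback points at the model source line whose op triggered the split. On this CPU-only runner every op falls into that category, which is why the message floods the log. The sketch below is a hypothetical, minimal illustration of the pattern, not code from this benchmark: TinyModel and the .cpu() round-trip are invented stand-ins, and actually exercising CUDA graphs would require a CUDA-capable build with mode="reduce-overhead".]

    import torch

    class TinyModel(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = torch.nn.Linear(16, 16)

        def forward(self, x):
            y = self.fc(x)
            # A non-GPU op in the middle of the graph: with cudagraphs enabled,
            # Inductor partitions around it and logs the diagnostic seen above.
            y = y.cpu() * 2
            return self.fc(y.to(x.device))

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = TinyModel().to(device)
    compiled = torch.compile(model, mode="reduce-overhead")
    out = compiled(torch.randn(4, 16, device=device))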
Found from :
[... 2025-08-14T21:53:01.1403953Z–21:53:01.1476239Z: the same tracebacks (scaled_dot_product_attention at sdpa_attention.py line 81, transpose/contiguous at sdpa_attention.py line 91, self.activation_fn at modeling_opt.py line 286, self.fc2 at modeling_opt.py line 288) repeat verbatim for the following OPT decoder layers, each interleaved with further "cudagraph partition due to non gpu ops" records ...]
Found from :
2025-08-14T21:53:01.1476624Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1476972Z     return mod(**inputs)
2025-08-14T21:53:01.1477334Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1477718Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1478124Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1478537Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1478912Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1479316Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1479721Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1480120Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1480500Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1480899Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1481302Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 291, in forward
2025-08-14T21:53:01.1481784Z     hidden_states = (residual + hidden_states).view(hidden_states_shape)
2025-08-14T21:53:01.1481996Z 
2025-08-14T21:53:01.1482083Z cudagraph partition due to non gpu ops
[... the same "cudagraph partition due to non gpu ops" record repeated 9 more times ...]
2025-08-14T21:53:01.1484367Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:53:01.1484765Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1485135Z return mod(**inputs) 2025-08-14T21:53:01.1485495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1485885Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1486288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1486704Z outputs = self.model.decoder( 2025-08-14T21:53:01.1487083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1487469Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1487871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1488282Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1488660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1489163Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1489633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1490078Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1490517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1490950Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1491439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1491966Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1492164Z 2025-08-14T21:53:01.1492302Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1492692Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1493044Z return mod(**inputs) 2025-08-14T21:53:01.1493399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1493774Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1494181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1494649Z outputs = self.model.decoder( 2025-08-14T21:53:01.1495075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1495457Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1495865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1496275Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1496648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1497044Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1497461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1497899Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1498327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1498751Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1499236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1499734Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1499914Z 2025-08-14T21:53:01.1500003Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1500236Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1500491Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1500883Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1501234Z return mod(**inputs) 2025-08-14T21:53:01.1501589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1501976Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1502371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1503003Z outputs = self.model.decoder( 2025-08-14T21:53:01.1503389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1503773Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1504181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1504604Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1505076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1505467Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1505882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1506310Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1506478Z 2025-08-14T21:53:01.1506590Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1506996Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1507365Z return mod(**inputs) 2025-08-14T21:53:01.1507756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1508141Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1508560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1508970Z outputs = self.model.decoder( 2025-08-14T21:53:01.1509334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1509734Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1510125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1510522Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1510884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1511267Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1511682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1512114Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1512282Z 2025-08-14T21:53:01.1512394Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1512785Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1513140Z return mod(**inputs) 2025-08-14T21:53:01.1513489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1513873Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1514282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1514745Z outputs = self.model.decoder( 2025-08-14T21:53:01.1515114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1515498Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1515909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1516324Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1516714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1517120Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1517526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1517934Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1518091Z 2025-08-14T21:53:01.1518203Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1518594Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1518936Z return mod(**inputs) 2025-08-14T21:53:01.1519294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1519680Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1520121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1520520Z outputs = self.model.decoder( 2025-08-14T21:53:01.1520896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1521278Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1521680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1522089Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1522496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1522900Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1523306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1523728Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1523895Z 2025-08-14T21:53:01.1523983Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1524213Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1524535Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1524765Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1524987Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1525200Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1525422Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1525646Z cudagraph partition due to non gpu ops 
2025-08-14T21:53:01.1525857Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1526084Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1526335Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1526721Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1527074Z return mod(**inputs) 2025-08-14T21:53:01.1527433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1527834Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1528236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1528645Z outputs = self.model.decoder( 2025-08-14T21:53:01.1529151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1529534Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1529930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1530336Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1530715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1531103Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1531519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1531954Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1532388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1532815Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1533302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1533813Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1534005Z 2025-08-14T21:53:01.1534123Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1534495Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1534872Z return mod(**inputs) 2025-08-14T21:53:01.1535243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1535631Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1536024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1536414Z outputs = self.model.decoder( 2025-08-14T21:53:01.1536774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1537134Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1537544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1537937Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1538298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1538681Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1539075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1539520Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1539927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1540345Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1540811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1541289Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1541457Z 2025-08-14T21:53:01.1541543Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1541768Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1542021Z cudagraph partition due to non gpu ops. 
2025-08-14T21:53:01.1542392Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1542733Z     return mod(**inputs)
2025-08-14T21:53:01.1543082Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1543447Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1543838Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1544231Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1544593Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1544955Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1545348Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1545787Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1546161Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1546548Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1546947Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward
2025-08-14T21:53:01.1547371Z     hidden_states = self.activation_fn(hidden_states)
2025-08-14T21:53:01.1547528Z 
2025-08-14T21:53:01.1553290Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:53:01.1553677Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1554041Z     return mod(**inputs)
2025-08-14T21:53:01.1554376Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1554747Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1555140Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1555548Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1555910Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1556292Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1556695Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1557095Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1557474Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1557856Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1558253Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward
2025-08-14T21:53:01.1558659Z     hidden_states = self.fc2(hidden_states)
2025-08-14T21:53:01.1558815Z 
2025-08-14T21:53:01.1564667Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:53:01.1565052Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:53:01.1565392Z     return mod(**inputs)
2025-08-14T21:53:01.1565744Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1566128Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1566538Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
2025-08-14T21:53:01.1566961Z     outputs = self.model.decoder(
2025-08-14T21:53:01.1567335Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
2025-08-14T21:53:01.1567722Z     output = func(self, *args, **kwargs)
2025-08-14T21:53:01.1568116Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
2025-08-14T21:53:01.1568527Z     layer_outputs = decoder_layer(
2025-08-14T21:53:01.1569027Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:53:01.1569433Z     return super().__call__(*args, **kwargs)
2025-08-14T21:53:01.1569852Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 291, in forward
2025-08-14T21:53:01.1570333Z     hidden_states = (residual + hidden_states).view(hidden_states_shape)
2025-08-14T21:53:01.1570539Z 
2025-08-14T21:53:01.1570637Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1570861Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1571089Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1571318Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1571537Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1571761Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1571982Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1572207Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1572418Z cudagraph partition due to non gpu ops
2025-08-14T21:53:01.1572637Z cudagraph partition due to non gpu ops
Found from : 2025-08-14T21:53:01.1573280Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1573628Z return mod(**inputs) 2025-08-14T21:53:01.1573989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1574369Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1574836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1575264Z outputs = self.model.decoder( 2025-08-14T21:53:01.1575634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1576013Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1576427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1576844Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1577217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1577612Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1578032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1578475Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1579009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1579460Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1579945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1580470Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1580670Z 2025-08-14T21:53:01.1580783Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1581185Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1581594Z return mod(**inputs) 2025-08-14T21:53:01.1581935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1582308Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1582698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1583095Z outputs = self.model.decoder( 2025-08-14T21:53:01.1583449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1583870Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1584273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1584673Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1585050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1585447Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1585854Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1586267Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1586686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1587103Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1587574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1588085Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1588262Z 2025-08-14T21:53:01.1588349Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1588577Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1588817Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1589197Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1589539Z return mod(**inputs) 2025-08-14T21:53:01.1589884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1590252Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1590644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1591043Z outputs = self.model.decoder( 2025-08-14T21:53:01.1591397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1591770Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1592171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1592584Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1592957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1593351Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1593784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1594337Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1594522Z 2025-08-14T21:53:01.1594632Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1595013Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1595356Z return mod(**inputs) 2025-08-14T21:53:01.1595695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1596070Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1596490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1596890Z outputs = self.model.decoder( 2025-08-14T21:53:01.1597251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1597640Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1598043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1598472Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1598839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1599250Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1599648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1600062Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1600228Z 2025-08-14T21:53:01.1600337Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1600716Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1601053Z return mod(**inputs) 2025-08-14T21:53:01.1601400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1601778Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1602186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1602802Z outputs = self.model.decoder( 2025-08-14T21:53:01.1603195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1603584Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1603982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1604389Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1604766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1605171Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1605572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1605990Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1606139Z 2025-08-14T21:53:01.1606261Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1606649Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1606991Z return mod(**inputs) 2025-08-14T21:53:01.1607342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1607724Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1608118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1608533Z outputs = self.model.decoder( 2025-08-14T21:53:01.1609075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1609464Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1609863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1610275Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1610654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1611047Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1611498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1611919Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1612067Z 2025-08-14T21:53:01.1612160Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1612384Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1612614Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1612843Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1613060Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1613283Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1613544Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1613756Z cudagraph partition due to non gpu ops 
2025-08-14T21:53:01.1613982Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1614204Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1614458Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1614877Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1615240Z return mod(**inputs) 2025-08-14T21:53:01.1615597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1615980Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1616398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1616813Z outputs = self.model.decoder( 2025-08-14T21:53:01.1617193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1617575Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1617984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1618398Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1618774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1619156Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1619551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1619977Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1620390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1620814Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1621282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1621789Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1621980Z 2025-08-14T21:53:01.1622090Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1622472Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1622816Z return mod(**inputs) 2025-08-14T21:53:01.1623151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1623548Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1623957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1624355Z outputs = self.model.decoder( 2025-08-14T21:53:01.1624714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1625083Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1625473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1625864Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1626246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1626635Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1627035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1627461Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1627897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1628362Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1628845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1629342Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1629520Z 2025-08-14T21:53:01.1629606Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1629836Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1630080Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1630462Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1630804Z return mod(**inputs) 2025-08-14T21:53:01.1631157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1631522Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1631917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1632316Z outputs = self.model.decoder( 2025-08-14T21:53:01.1632672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1633045Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1633439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1633839Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1634199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1634591Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1635015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1635431Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1635601Z 2025-08-14T21:53:01.1635712Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1636102Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1636455Z return mod(**inputs) 2025-08-14T21:53:01.1636803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1637190Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1637598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1638007Z outputs = self.model.decoder( 2025-08-14T21:53:01.1639271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1639656Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1640065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1640468Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1640849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1641246Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1641678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1642100Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1642268Z 2025-08-14T21:53:01.1642376Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1642765Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1643110Z return mod(**inputs) 2025-08-14T21:53:01.1643464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1643871Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1644270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1644669Z outputs = self.model.decoder( 2025-08-14T21:53:01.1645095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1645543Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1645938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1646342Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1646725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1647118Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1647515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1647930Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1648077Z 2025-08-14T21:53:01.1648197Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1648601Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1649035Z return mod(**inputs) 2025-08-14T21:53:01.1649403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1649787Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1650188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1650611Z outputs = self.model.decoder( 2025-08-14T21:53:01.1650990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1651376Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1651776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1652188Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1652570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1652962Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1653410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1653811Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1653953Z 2025-08-14T21:53:01.1654107Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1654495Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1654844Z return mod(**inputs) 2025-08-14T21:53:01.1655187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1655555Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1655952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1656350Z outputs = self.model.decoder( 2025-08-14T21:53:01.1656746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1657121Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1657532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1657947Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1658324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1658711Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1659134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 291, in forward 2025-08-14T21:53:01.1659591Z hidden_states = (residual + hidden_states).view(hidden_states_shape) 2025-08-14T21:53:01.1659788Z 2025-08-14T21:53:01.1659873Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1660100Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1660317Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1660533Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1660739Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1660953Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1661165Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1661374Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1661592Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1661804Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1662041Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1662422Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1662763Z return mod(**inputs) 2025-08-14T21:53:01.1663106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1663481Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1663876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1664275Z outputs = self.model.decoder( 2025-08-14T21:53:01.1664631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1665007Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1665404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1665803Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1666170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1666556Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1666952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1667368Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1667792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1668215Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1668715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1669254Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1669452Z 2025-08-14T21:53:01.1669580Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1669957Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1670295Z return mod(**inputs) 2025-08-14T21:53:01.1670632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1671011Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1671427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1671819Z outputs = self.model.decoder( 2025-08-14T21:53:01.1672182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1672555Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1672950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1673359Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1673732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1674140Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1674552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1674983Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1675406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1675832Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1676297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1676786Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1676958Z 2025-08-14T21:53:01.1677057Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1677288Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1677535Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1677918Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1678269Z return mod(**inputs) 2025-08-14T21:53:01.1678616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1679000Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1679422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1679848Z outputs = self.model.decoder( 2025-08-14T21:53:01.1680222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1680611Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1681025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1681438Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1681826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1682232Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1682656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1683097Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1683271Z 2025-08-14T21:53:01.1683411Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1683821Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1684166Z return mod(**inputs) 2025-08-14T21:53:01.1684522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1684906Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1685308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1685707Z outputs = self.model.decoder( 2025-08-14T21:53:01.1686100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1686483Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1686875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1687281Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1687654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1688040Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1688474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1689018Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1689187Z 2025-08-14T21:53:01.1689308Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1689699Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1690070Z return mod(**inputs) 2025-08-14T21:53:01.1690426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1690810Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1691212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1691623Z outputs = self.model.decoder( 2025-08-14T21:53:01.1691997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1692384Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1692780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1693184Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1693561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1693954Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1694359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1694773Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1694919Z 2025-08-14T21:53:01.1695040Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1695423Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1695777Z return mod(**inputs) 2025-08-14T21:53:01.1696129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1696503Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1696916Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1697315Z outputs = self.model.decoder( 2025-08-14T21:53:01.1697676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1698039Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1698495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1698913Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1699289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1699673Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1700081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1700489Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1700647Z 2025-08-14T21:53:01.1700732Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1700975Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1701199Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1701408Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1701629Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1701990Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1702209Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1702420Z cudagraph partition due to non gpu ops 
2025-08-14T21:53:01.1702843Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1703083Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1703399Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1703796Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1704154Z return mod(**inputs) 2025-08-14T21:53:01.1704518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1704883Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1705278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1705679Z outputs = self.model.decoder( 2025-08-14T21:53:01.1706042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1706416Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1706811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1707210Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1707569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1707952Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1708349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1708765Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1709186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1709612Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1710077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1710574Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1710777Z 2025-08-14T21:53:01.1710888Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1711268Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1711610Z return mod(**inputs) 2025-08-14T21:53:01.1711948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1712320Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1712711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1713119Z outputs = self.model.decoder( 2025-08-14T21:53:01.1713578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1713962Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1714362Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1714772Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1715158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1715595Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1716023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1716467Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1716895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1717337Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1717811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1718334Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1718510Z 2025-08-14T21:53:01.1718607Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1718845Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1719094Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1719496Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1719861Z return mod(**inputs) 2025-08-14T21:53:01.1720213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1720596Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1721016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1721435Z outputs = self.model.decoder( 2025-08-14T21:53:01.1721800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1722184Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1722589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1722999Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1723376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1723768Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1724183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1724615Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1724789Z 2025-08-14T21:53:01.1724903Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1725293Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1725642Z return mod(**inputs) 2025-08-14T21:53:01.1725997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1726388Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1726791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1727201Z outputs = self.model.decoder( 2025-08-14T21:53:01.1727575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1727960Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1728355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1728865Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1729247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1729644Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1730042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1730480Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1730643Z 2025-08-14T21:53:01.1730762Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1731172Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1731521Z return mod(**inputs) 2025-08-14T21:53:01.1731877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1732261Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1732660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1733153Z outputs = self.model.decoder( 2025-08-14T21:53:01.1733524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1733605Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1733872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1733953Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1734192Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1734286Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1734557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1734651Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1734656Z 2025-08-14T21:53:01.1734766Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1734978Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1735056Z return mod(**inputs) 2025-08-14T21:53:01.1735286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1735361Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1735608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1735683Z outputs = self.model.decoder( 2025-08-14T21:53:01.1735908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1735989Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1736238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1736323Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1736556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1736640Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1736899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1736984Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1736988Z 2025-08-14T21:53:01.1737106Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1737314Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1737386Z return mod(**inputs) 2025-08-14T21:53:01.1737662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1737745Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1738002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1738081Z outputs = self.model.decoder( 2025-08-14T21:53:01.1738306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1738392Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1738680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1738753Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1738978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1739058Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1739302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 291, in forward 2025-08-14T21:53:01.1739433Z hidden_states = (residual + hidden_states).view(hidden_states_shape) 2025-08-14T21:53:01.1739454Z 2025-08-14T21:53:01.1739547Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1739632Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1739710Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1739789Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1739877Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1739955Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1740043Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1740121Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1740200Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1740287Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1740397Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1740615Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1740695Z return mod(**inputs) 2025-08-14T21:53:01.1740929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1741010Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1741275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1741354Z outputs = self.model.decoder( 2025-08-14T21:53:01.1741597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1741687Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1741940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1742027Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1742261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1742352Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1742604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1742710Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1742967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1743070Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1743374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1743517Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1743543Z 2025-08-14T21:53:01.1743651Z cudagraph partition due to non gpu ops. 
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
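The innermost frames in the two traces above are the shared SDPA path in transformers (`sdpa_attention.py` lines 81 and 91): the `scaled_dot_product_attention` call and the layout fix-up right after it. Stripped of the surrounding model, the pattern those frames point at looks roughly like this; shapes are illustrative only:

    import torch
    import torch.nn.functional as F

    # Illustrative shapes: (batch, num_heads, seq_len, head_dim).
    batch, heads, seq, head_dim = 2, 8, 128, 64
    query = torch.randn(batch, heads, seq, head_dim)
    key = torch.randn(batch, heads, seq, head_dim)
    value = torch.randn(batch, heads, seq, head_dim)

    # The call reported at sdpa_attention.py:81 in the traces above.
    attn_output = F.scaled_dot_product_attention(query, key, value, is_causal=True)

    # The fix-up reported at sdpa_attention.py:91: back to
    # (batch, seq_len, num_heads, head_dim), materialized contiguously.
    attn_output = attn_output.transpose(1, 2).contiguous()
    print(attn_output.shape)  # torch.Size([2, 128, 8, 64])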
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward
    hidden_states = self.activation_fn(hidden_states)

cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward
    hidden_states = self.activation_fn(hidden_states)

cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward
    hidden_states = self.fc2(hidden_states)

cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward
    hidden_states = self.fc2(hidden_states)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
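Messages of this shape repeat for every decoder layer and every compiled variant, so the raw shard log is dominated by them. A small sketch for summarizing a downloaded job log by message count and reported source location (plain Python; the log path is a placeholder):

    import re
    from collections import Counter

    LOG_PATH = "logs/linux-jammy-cpu-py3.9-gcc11-inductor-test.txt"  # hypothetical local copy

    partition_messages = 0
    frame_counter = Counter()

    with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            # Count every partition diagnostic, then tally the File/line
            # frames so the most frequent partition sites float to the top.
            partition_messages += line.count("cudagraph partition due to non gpu ops")
            for path, lineno in re.findall(r'File "([^"]+)", line (\d+)', line):
                frame_counter[f"{path}:{lineno}"] += 1

    print(f"cudagraph partition messages: {partition_messages}")
    for frame, n in frame_counter.most_common(10):
        print(f"{n:5d}  {frame}")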
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:53:01.1761164Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1761244Z return mod(**inputs) 2025-08-14T21:53:01.1761483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1761562Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1761839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1761919Z outputs = self.model.decoder( 2025-08-14T21:53:01.1762162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1762241Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1762558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1762647Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1762890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1762984Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1763246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1763355Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1763642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1763750Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1764067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1764198Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1764202Z 2025-08-14T21:53:01.1764287Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1764381Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1764516Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1764743Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1764824Z return mod(**inputs) 2025-08-14T21:53:01.1765060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1765140Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1765418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1765498Z outputs = self.model.decoder( 2025-08-14T21:53:01.1765744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1765828Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1766089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1766179Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1766418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1766511Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1766773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1766882Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1766886Z 2025-08-14T21:53:01.1767005Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1767220Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1767293Z return mod(**inputs) 2025-08-14T21:53:01.1767536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1767617Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1767886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1767964Z outputs = self.model.decoder( 2025-08-14T21:53:01.1768197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1768285Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1768544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1768622Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1768973Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1769115Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1769385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1769490Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1769494Z 2025-08-14T21:53:01.1769606Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1769834Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1769907Z return mod(**inputs) 2025-08-14T21:53:01.1770167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1770249Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1770507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1770597Z outputs = self.model.decoder( 2025-08-14T21:53:01.1770829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1770911Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1771206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1771284Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1771526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1771614Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1771872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1771969Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1771973Z 2025-08-14T21:53:01.1772083Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1772308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1772379Z return mod(**inputs) 2025-08-14T21:53:01.1772612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1772702Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1772959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1773039Z outputs = self.model.decoder( 2025-08-14T21:53:01.1773281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1773361Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1773627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1773705Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1773943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1774034Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1774292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1774379Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1774383Z 2025-08-14T21:53:01.1774503Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1774714Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1774790Z return mod(**inputs) 2025-08-14T21:53:01.1775022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1775099Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1775364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1775484Z outputs = self.model.decoder( 2025-08-14T21:53:01.1775722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1775807Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1776069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1776153Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1776395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1776496Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1776762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 291, in forward 2025-08-14T21:53:01.1776903Z hidden_states = (residual + hidden_states).view(hidden_states_shape) 2025-08-14T21:53:01.1776908Z 2025-08-14T21:53:01.1777003Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777088Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777170Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777279Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777361Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777443Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777530Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777612Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777693Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777781Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1777893Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1778111Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1778181Z return mod(**inputs) 2025-08-14T21:53:01.1778419Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1778507Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1778764Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1778846Z outputs = self.model.decoder( 2025-08-14T21:53:01.1779086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1779167Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1779435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1779513Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1779749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1779842Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1780104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1780211Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1780478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1780583Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1780902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:01.1781045Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:01.1781051Z 2025-08-14T21:53:01.1781161Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1781391Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1781480Z return mod(**inputs) 2025-08-14T21:53:01.1781734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1781813Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1782061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1782146Z outputs = self.model.decoder( 2025-08-14T21:53:01.1782371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1782448Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1782723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1782802Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1783044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1783131Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1783390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 259, in forward 2025-08-14T21:53:01.1783507Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:01.1783787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 184, in forward 2025-08-14T21:53:01.1783898Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:01.1784207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:01.1784326Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:01.1784330Z 2025-08-14T21:53:01.1784426Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1784509Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1784620Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1784852Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1784920Z return mod(**inputs) 2025-08-14T21:53:01.1785153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1785233Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1785488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1785577Z outputs = self.model.decoder( 2025-08-14T21:53:01.1785809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1785889Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1786154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1786232Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1786478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1786565Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1786825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1786937Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1786940Z 2025-08-14T21:53:01.1787049Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1787280Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1787351Z return mod(**inputs) 2025-08-14T21:53:01.1787578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1787663Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1787941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1788035Z outputs = self.model.decoder( 2025-08-14T21:53:01.1788273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1788353Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1788612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1788687Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1788920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1789027Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1789278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 286, in forward 2025-08-14T21:53:01.1789378Z hidden_states = self.activation_fn(hidden_states) 2025-08-14T21:53:01.1789390Z 2025-08-14T21:53:01.1789497Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:01.1789703Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1789797Z return mod(**inputs) 2025-08-14T21:53:01.1790028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1790106Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1790366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1790442Z outputs = self.model.decoder( 2025-08-14T21:53:01.1790679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1790757Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1791010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1791094Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1791326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1791410Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1791672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1791755Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1791759Z 2025-08-14T21:53:01.1791872Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:01.1792084Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:01.1792151Z return mod(**inputs) 2025-08-14T21:53:01.1792389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1792470Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1792725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 826, in forward 2025-08-14T21:53:01.1792812Z outputs = self.model.decoder( 2025-08-14T21:53:01.1793050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper 2025-08-14T21:53:01.1793136Z output = func(self, *args, **kwargs) 2025-08-14T21:53:01.1793397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 653, in forward 2025-08-14T21:53:01.1793474Z layer_outputs = decoder_layer( 2025-08-14T21:53:01.1793724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:01.1793810Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:01.1794081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 288, in forward 2025-08-14T21:53:01.1794205Z hidden_states = self.fc2(hidden_states) 2025-08-14T21:53:01.1794210Z 2025-08-14T21:53:01.1794296Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1794387Z cudagraph partition due to non gpu ops 2025-08-14T21:53:01.1794501Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/generic.py", line 961, in wrapper
    output = func(self, *args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/opt/modeling_opt.py", line 847, in forward
    loss = self.loss_function(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss
    loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy
    loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction)

2025-08-14T21:53:12.2220683Z Compilation time (from dynamo_timed): 20.05245119
2025-08-14T21:53:12.2785347Z pass
2025-08-14T21:53:12.2786454Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:12.2787381Z TIMING: _recursive_pre_grad_passes:0.04538 _recursive_joint_graph_passes:0.40439 _recursive_post_grad_passes:0.11054 async_compile.wait:1.00502 code_gen:10.3634 inductor_compile:12.0751 backend_compile:17.47458 gc:0.00039 entire_frame_compile:20.05245 total_wall_time:20.05245
2025-08-14T21:53:12.2788514Z STATS: call_* op count: 417 | FakeTensorMode.__torch_dispatch__:26230 | FakeTensor.__torch_dispatch__:3489 | ProxyTorchDispatchMode.__torch_dispatch__:7311
2025-08-14T21:53:12.2789086Z Dynamo produced 1 graphs covering 417 ops with 0 graph breaks (0 unique)
2025-08-14T21:53:18.1424419Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:53:18.1425403Z   from pkg_resources import resource_filename
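The TIMING line is the harness's dynamo_timed breakdown (consistent with the numbers, code_gen sits inside inductor_compile, inductor_compile inside backend_compile, and backend_compile inside entire_frame_compile), while the "Dynamo produced 1 graphs covering 417 ops with 0 graph breaks" line is the graph-capture summary. A rough sketch of how to get comparable numbers for a single function, assuming a recent PyTorch; torch._dynamo.explain and compile_times are internal helpers and may change shape between releases:

    import torch
    from torch._dynamo.utils import compile_times

    def toy_forward(x: torch.Tensor) -> torch.Tensor:
        # Hypothetical stand-in for one benchmark forward pass.
        return torch.nn.functional.gelu(x @ x.T).sum()

    x = torch.randn(32, 32)

    # Graph count, graph-break count and captured op count: the same kind of
    # numbers the log reports as "Dynamo produced N graphs covering M ops".
    print(torch._dynamo.explain(toy_forward)(x))

    # Per-phase compile-time totals, similar in spirit to the TIMING line.
    torch.compile(toy_forward)(x)
    print(compile_times(repr="str"))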
2025-08-14T21:53:18.7605205Z
2025-08-14T21:53:20.2170929Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:53:20.2171768Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:53:20.2172097Z cpu eval PLBartForCausalLM
2025-08-14T21:53:20.9579589Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:21.2451415Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:21.4523641Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:28.1689811Z cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

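The three empty_gpu_cache warnings above are the harness pointing out that there is no accelerator cache to flush on a cpu run; cache clearing only applies to the devices it lists. A device-aware guard of roughly this shape (a hypothetical helper, not the benchmark's actual code) keeps such a call from warning on CPU:

    import torch

    def empty_accelerator_cache(device: str) -> None:
        # Hypothetical helper: only flush caching allocators that exist for
        # the device the benchmark runs on; on "cpu" this is simply a no-op.
        if device.startswith("cuda") and torch.cuda.is_available():
            torch.cuda.empty_cache()
        elif device.startswith("xpu") and hasattr(torch, "xpu") and torch.xpu.is_available():
            torch.xpu.empty_cache()

    empty_accelerator_cache("cpu")  # no-op, no warning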
cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:53:28.1711707Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1712055Z return mod(**inputs) 2025-08-14T21:53:28.1712466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1712919Z outputs = self.model.decoder( 2025-08-14T21:53:28.1713383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1713830Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1714217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1714616Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1715044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1715518Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1715968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1716445Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1716921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:28.1717443Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:28.1717648Z 2025-08-14T21:53:28.1717763Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:28.1718158Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1718509Z return mod(**inputs) 2025-08-14T21:53:28.1718915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1719356Z outputs = self.model.decoder( 2025-08-14T21:53:28.1719768Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1720195Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1720572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1720974Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1721395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1721843Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1722292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1722796Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1723268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:28.1723793Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:28.1723992Z 2025-08-14T21:53:28.1724092Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1724317Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1724552Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1724778Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1724995Z cudagraph partition due 
to non gpu ops 2025-08-14T21:53:28.1725219Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1725444Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1725666Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1725881Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1726128Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1726357Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1726575Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1726799Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1727025Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1727245Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1727468Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1727722Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:28.1728145Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1728495Z return mod(**inputs) 2025-08-14T21:53:28.1729094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1729548Z outputs = self.model.decoder( 2025-08-14T21:53:28.1729966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1730407Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1730786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1731183Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1731608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1732064Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1732517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1732959Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1733439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:28.1733962Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:28.1734159Z 2025-08-14T21:53:28.1734289Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:28.1734661Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1735007Z return mod(**inputs) 2025-08-14T21:53:28.1735396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1735814Z outputs = self.model.decoder( 2025-08-14T21:53:28.1736209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1736622Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1736986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1737362Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1737783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1738219Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1738712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1739139Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1739606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:28.1740085Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:28.1740253Z 2025-08-14T21:53:28.1740344Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1740561Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1740786Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1741057Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1741272Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1741492Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1741712Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1741920Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1742141Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1742360Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1742570Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1742787Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1743030Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1743249Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1743458Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1743675Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1743926Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:53:28.1744305Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1744650Z return mod(**inputs) 2025-08-14T21:53:28.1745043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1745487Z outputs = self.model.decoder( 2025-08-14T21:53:28.1745897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1746312Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1746686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1747074Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1747502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1747954Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1748408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1748837Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1749302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:53:28.1749807Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:53:28.1749996Z 2025-08-14T21:53:28.1750115Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:53:28.1750491Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:53:28.1750837Z return mod(**inputs) 2025-08-14T21:53:28.1751226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward 2025-08-14T21:53:28.1751638Z outputs = self.model.decoder( 2025-08-14T21:53:28.1752045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:53:28.1752455Z layer_outputs = decoder_layer( 2025-08-14T21:53:28.1752817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:53:28.1753243Z return super().__call__(*args, **kwargs) 2025-08-14T21:53:28.1753662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:53:28.1754098Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:53:28.1754523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:53:28.1754957Z attn_output, attn_weights = attention_interface( 2025-08-14T21:53:28.1755420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:53:28.1755920Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:53:28.1756090Z 2025-08-14T21:53:28.1756172Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1756398Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1756622Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1756834Z cudagraph partition due to non gpu ops 2025-08-14T21:53:28.1757050Z cudagraph partition due 
to non gpu ops
2025-08-14T21:53:28.1757268Z cudagraph partition due to non gpu ops
2025-08-14T21:53:28.1759694Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:53:28.1765862Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1678, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:53:28.1772476Z cudagraph partition due to non gpu ops
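Note: the two traces above both end inside transformers' sdpa_attention_forward; the partition points land at the scaled_dot_product_attention call (line 81) and the transpose/contiguous that follows it (line 91). A minimal, self-contained sketch of that call pattern is below; it is an illustration with made-up shapes, not the transformers implementation.

# Hedged sketch of the SDPA call pattern the traces above point at:
# F.scaled_dot_product_attention followed by transpose(1, 2).contiguous().
# Tensor names and shapes are illustrative assumptions.
import torch
import torch.nn.functional as F

def sdpa_block(query, key, value, attn_mask=None):
    # query/key/value: (batch, num_heads, seq_len, head_dim)
    attn_output = F.scaled_dot_product_attention(query, key, value, attn_mask=attn_mask)
    # back to (batch, seq_len, num_heads, head_dim) for the output projection
    return attn_output.transpose(1, 2).contiguous()

q = k = v = torch.randn(1, 12, 128, 64)
print(sdpa_block(q, k, v).shape)  # torch.Size([1, 128, 12, 64])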
2025-08-14T21:53:28.1790515Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1700, in forward
    loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:53:37.2471290Z Compilation time (from dynamo_timed): 14.073241814
2025-08-14T21:53:37.2702164Z pass
2025-08-14T21:53:37.2702770Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:37.2703714Z TIMING: _recursive_pre_grad_passes:0.02359 _recursive_joint_graph_passes:0.28671 _recursive_post_grad_passes:0.0582 async_compile.wait:0.89248 code_gen:8.37254 inductor_compile:9.55261 backend_compile:12.63114 gc:0.00125 entire_frame_compile:14.07324 total_wall_time:14.07324
2025-08-14T21:53:37.2704772Z STATS: call_* op count: 200 | FakeTensorMode.__torch_dispatch__:14578 | FakeTensor.__torch_dispatch__:1828 | ProxyTorchDispatchMode.__torch_dispatch__:3913
2025-08-14T21:53:37.2705334Z Dynamo produced 1 graphs covering 200 ops with 0 graph breaks (0 unique)
2025-08-14T21:53:42.9182139Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:53:42.9183081Z   from pkg_resources import resource_filename
2025-08-14T21:53:46.0276858Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:53:46.0277695Z loading model: 0it [00:02, ?it/s]
2025-08-14T21:53:46.0278362Z cpu eval PLBartForConditionalGeneration
2025-08-14T21:53:47.4500216Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:47.8379553Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:53:48.1756953Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
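Note: the loss_fct frame in the first trace of this block flattens (batch, seq_len, vocab_size) logits before cross-entropy. A hedged, self-contained sketch of that step follows; shapes and data are made up for illustration and are not the benchmark's model.

# Sketch of the flattened cross-entropy call shown in the modeling_plbart.py:1700 frame.
import torch
import torch.nn as nn

vocab_size = 32
logits = torch.randn(2, 5, vocab_size)         # (batch, seq_len, vocab_size), made-up
labels = torch.randint(0, vocab_size, (2, 5))  # (batch, seq_len), made-up
loss_fct = nn.CrossEntropyLoss()
loss = loss_fct(logits.view(-1, vocab_size), labels.view(-1))
print(loss.item())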
2025-08-14T21:54:02.0888864Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1357, in forward
    decoder_input_ids = shift_tokens_right(labels, self.config.pad_token_id)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1084, in shift_tokens_right
    index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1)
2025-08-14T21:54:02.0892836Z cudagraph partition due to non gpu ops
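Note: the index_of_eos expression in that shift_tokens_right frame is plain tensor arithmetic over the token ids. A small self-contained example is below; the token ids and pad_token_id are made-up assumptions.

# Example of the index_of_eos expression from the shift_tokens_right frame above.
import torch

prev_output_tokens = torch.tensor([[5, 7, 9, 2, 1, 1],
                                   [5, 2, 1, 1, 1, 1]])  # 1 = pad (assumed id)
pad_token_id = 1
# index of the last non-pad token in each row
index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1)
print(index_of_eos)  # tensor([[3], [1]])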
2025-08-14T21:54:02.0896625Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1189, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 669, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 496, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:54:02.0904445Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1189, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 669, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 496, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:54:02.0911691Z cudagraph partition due to non gpu ops
2025-08-14T21:54:02.1017272Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:54:02.1024303Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:54:02.1031239Z cudagraph partition due to non gpu ops
2025-08-14T21:54:02.1033868Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 777, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:54:02.1041091Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 777, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:54:02.1048292Z cudagraph partition due to non gpu ops
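Note: the last two traces run through the decoder's cross-attention (encoder_attn at modeling_plbart.py:777) rather than self-attention. A rough, self-contained sketch of that pattern follows; shapes are made-up assumptions (queries from decoder states, keys/values from the encoder output), and this is an illustration, not the transformers code.

# Hedged cross-attention sketch matching the encoder_attn frames above.
import torch
import torch.nn.functional as F

batch, heads, dec_len, enc_len, head_dim = 1, 12, 16, 128, 64
q = torch.randn(batch, heads, dec_len, head_dim)  # projected decoder hidden states (assumed)
k = torch.randn(batch, heads, enc_len, head_dim)  # projected encoder output (assumed)
v = torch.randn(batch, heads, enc_len, head_dim)
attn = F.scaled_dot_product_attention(q, k, v)    # (batch, heads, dec_len, head_dim)
out = attn.transpose(1, 2).contiguous()           # (batch, dec_len, heads, head_dim)
print(out.shape)  # torch.Size([1, 16, 12, 64])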
Found from : 2025-08-14T21:54:02.1052358Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:54:02.1052696Z return mod(**inputs) 2025-08-14T21:54:02.1053110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward 2025-08-14T21:54:02.1053528Z outputs = self.model( 2025-08-14T21:54:02.1053912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward 2025-08-14T21:54:02.1054344Z decoder_outputs = self.decoder( 2025-08-14T21:54:02.1054772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:54:02.1055215Z layer_outputs = decoder_layer( 2025-08-14T21:54:02.1055598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:54:02.1055990Z return super().__call__(*args, **kwargs) 2025-08-14T21:54:02.1056420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:54:02.1056886Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:54:02.1057335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:54:02.1057778Z attn_output, attn_weights = attention_interface( 2025-08-14T21:54:02.1058261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:54:02.1058778Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:54:02.1058975Z 2025-08-14T21:54:02.1059106Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:54:02.1059481Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:54:02.1059834Z return mod(**inputs) 2025-08-14T21:54:02.1060235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward 2025-08-14T21:54:02.1060650Z outputs = self.model( 2025-08-14T21:54:02.1061054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward 2025-08-14T21:54:02.1061482Z decoder_outputs = self.decoder( 2025-08-14T21:54:02.1061902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:54:02.1062322Z layer_outputs = decoder_layer( 2025-08-14T21:54:02.1062708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:54:02.1063107Z return super().__call__(*args, **kwargs) 2025-08-14T21:54:02.1063530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward 2025-08-14T21:54:02.1063985Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:54:02.1064436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:54:02.1064885Z attn_output, attn_weights = attention_interface( 2025-08-14T21:54:02.1065358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:54:02.1065904Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:54:02.1066077Z 2025-08-14T21:54:02.1066171Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1066401Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1066625Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1066849Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1067075Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1067294Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1067518Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1067746Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1067963Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1068206Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1068433Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1068651Z cudagraph partition due to non gpu ops 2025-08-14T21:54:02.1068913Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:54:02.1069318Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:54:02.1069676Z return mod(**inputs) 2025-08-14T21:54:02.1070094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward 2025-08-14T21:54:02.1070543Z outputs = self.model( 2025-08-14T21:54:02.1070945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward 2025-08-14T21:54:02.1071377Z decoder_outputs = self.decoder( 2025-08-14T21:54:02.1071801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward 2025-08-14T21:54:02.1072227Z layer_outputs = decoder_layer( 2025-08-14T21:54:02.1072606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:54:02.1072995Z return super().__call__(*args, **kwargs) 2025-08-14T21:54:02.1073423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 777, in forward 2025-08-14T21:54:02.1073885Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:54:02.1074337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward 2025-08-14T21:54:02.1074784Z attn_output, attn_weights = attention_interface( 2025-08-14T21:54:02.1075271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:54:02.1075786Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:54:02.1075983Z 2025-08-14T21:54:02.1076096Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 777, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

2025-08-14T21:54:02.1083116Z cudagraph partition due to non gpu ops (repeated 16x)
2025-08-14T21:54:02.1086732Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:54:02.1093981Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 760, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

2025-08-14T21:54:02.1101212Z cudagraph partition due to non gpu ops (repeated 12x)
2025-08-14T21:54:02.1104196Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1359, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1207, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1031, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 777, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 438, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
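The partition diagnostics above all resolve to the same two lines of transformers' SDPA attention helper: the call to torch.nn.functional.scaled_dot_product_attention and the transpose(1, 2).contiguous() that follows it. A minimal, self-contained sketch of that call pattern (illustrative shapes; not the benchmark harness or the transformers implementation itself):

```python
import torch
import torch.nn.functional as F

def sdpa_like_attention(query, key, value, attn_mask=None):
    # query/key/value: (batch, num_heads, seq_len, head_dim)
    attn_output = F.scaled_dot_product_attention(
        query, key, value, attn_mask=attn_mask, dropout_p=0.0, is_causal=False
    )
    # the transpose/contiguous the tracebacks point at:
    # back to (batch, seq_len, num_heads, head_dim)
    return attn_output.transpose(1, 2).contiguous()

q = k = v = torch.randn(1, 12, 128, 64)
print(sdpa_like_attention(q, k, v).shape)  # torch.Size([1, 128, 12, 64])
```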
2025-08-14T21:54:02.1225930Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/plbart/modeling_plbart.py", line 1383, in forward
    masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1))

2025-08-14T21:54:13.0529849Z Compilation time (from dynamo_timed): 23.100626148
2025-08-14T21:54:13.0750732Z pass
2025-08-14T21:54:13.0751215Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:13.0752117Z TIMING: _recursive_pre_grad_passes:0.0547 _recursive_joint_graph_passes:0.54009 _recursive_post_grad_passes:0.11764 async_compile.wait:0.87188 code_gen:9.50777 inductor_compile:11.54687 backend_compile:19.23628 gc:0.0014 entire_frame_compile:23.10063 total_wall_time:23.10063
2025-08-14T21:54:13.0753108Z STATS: call_* op count: 519 | FakeTensorMode.__torch_dispatch__:36236 | FakeTensor.__torch_dispatch__:4384 | ProxyTorchDispatchMode.__torch_dispatch__:9868
2025-08-14T21:54:13.0753664Z Dynamo produced 1 graphs covering 519 ops with 0 graph breaks (0 unique)
2025-08-14T21:54:18.8086861Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:54:18.8087950Z   from pkg_resources import resource_filename
2025-08-14T21:54:23.3326514Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:54:23.3327309Z loading model: 0it [00:03, ?it/s]
2025-08-14T21:54:23.3332066Z cpu eval PegasusForCausalLM
2025-08-14T21:54:23.7502980Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:23.9392611Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:24.0938539Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:35.1643433Z cudagraph partition due to non gpu ops (repeated 22x)
2025-08-14T21:54:35.1658331Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1634, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:54:35.1665177Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1634, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

2025-08-14T21:54:35.1671840Z cudagraph partition due to non gpu ops (repeated 16x)
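The "Compilation time (from dynamo_timed)", TIMING/STATS and "Dynamo produced 1 graphs ... 0 graph breaks" lines above summarize a single torch.compile frame. A rough sketch of getting a comparable timing and graph-count summary for a toy module locally (torch._dynamo.explain is assumed to behave as in recent PyTorch releases; the model and shapes are illustrative):

```python
import time
import torch

model = torch.nn.Sequential(torch.nn.Linear(64, 64), torch.nn.ReLU())
example = torch.randn(8, 64)

compiled = torch.compile(model)

t0 = time.perf_counter()
compiled(example)  # first call pays the Dynamo/Inductor compile cost
print(f"first call (incl. compile): {time.perf_counter() - t0:.2f}s")

t0 = time.perf_counter()
compiled(example)  # steady-state call reuses the compiled graph
print(f"second call: {time.perf_counter() - t0:.4f}s")

# Graph / graph-break summary, analogous to the log's
# "Dynamo produced 1 graphs covering 519 ops with 0 graph breaks" line.
print(torch._dynamo.explain(model)(example))
```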
2025-08-14T21:54:35.1706836Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1634, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 442, in forward
    hidden_states = residual + hidden_states

2025-08-14T21:54:35.1711175Z cudagraph partition due to non gpu ops (repeated 10x)
2025-08-14T21:54:35.1713487Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1634, in forward
    outputs = self.model.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

2025-08-14T21:54:35.1719945Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:54:35.1720333Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:54:35.1720692Z return mod(**inputs) 2025-08-14T21:54:35.1721114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1634, in forward 2025-08-14T21:54:35.1721558Z outputs = self.model.decoder( 2025-08-14T21:54:35.1722003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:54:35.1722451Z layer_outputs = decoder_layer( 2025-08-14T21:54:35.1722837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:54:35.1723224Z return super().__call__(*args, **kwargs) 2025-08-14T21:54:35.1723663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:54:35.1724145Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:54:35.1724626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:54:35.1725082Z attn_output, attn_weights = attention_interface( 2025-08-14T21:54:35.1725571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:54:35.1726074Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:54:35.1726275Z 2025-08-14T21:54:35.1726362Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1726615Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1726846Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1727070Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1727291Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1727516Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1727738Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1727956Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1728181Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1728407Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1728623Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1728963Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1729190Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1729407Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1729631Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1729855Z cudagraph partition due to non gpu ops 2025-08-14T21:54:35.1730113Z cudagraph partition due to non gpu ops. 
[From 2025-08-14T21:54:35.1730113Z through 2025-08-14T21:54:35.1876994Z the same "cudagraph partition due to non gpu ops" warnings and the three decoder-side stacks shown above (the residual add at modeling_pegasus.py:442, scaled_dot_product_attention at sdpa_attention.py:81, and the transpose/contiguous at sdpa_attention.py:91) repeat.]
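The stacks above all terminate at one of three eager-mode operations inside the Pegasus decoder layer: the residual add (modeling_pegasus.py:442), the torch.nn.functional.scaled_dot_product_attention call (sdpa_attention.py:81), and the transpose(1, 2).contiguous() that follows it (sdpa_attention.py:91). The sketch below is not the benchmark harness; it is a minimal, standalone illustration with toy shapes and hypothetical variable names of those same tensor operations, i.e. the ops the Inductor cudagraph partitioner is pointing at when it reports "non gpu ops" on this CPU-only run.

    import torch
    import torch.nn.functional as F

    # Toy shapes only (batch, num_heads, seq_len, head_dim); not Pegasus's real config.
    q = torch.randn(1, 4, 16, 32)
    k = torch.randn(1, 4, 16, 32)
    v = torch.randn(1, 4, 16, 32)
    residual = torch.randn(1, 16, 4 * 32)

    # sdpa_attention.py:81 -- the fused attention call named in the stacks above.
    attn_output = F.scaled_dot_product_attention(q, k, v, is_causal=True)

    # sdpa_attention.py:91 -- reorder to (batch, seq_len, num_heads, head_dim) and
    # make the layout contiguous before the output projection.
    attn_output = attn_output.transpose(1, 2).contiguous()

    # modeling_pegasus.py:442 -- the residual connection after the attention block
    # (the projection is omitted here; this is only a shape-level illustration).
    hidden_states = residual + attn_output.reshape(1, 16, -1)

    print(hidden_states.shape)  # torch.Size([1, 16, 128])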
2025-08-14T21:54:35.1877249Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:54:35.1877638Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:54:35.1877993Z     return mod(**inputs)
2025-08-14T21:54:35.1878405Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1656, in forward
2025-08-14T21:54:35.1878936Z     loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:54:35.1879161Z 
2025-08-14T21:54:45.0364268Z Compilation time (from dynamo_timed): 19.577871859
2025-08-14T21:54:45.0384935Z pass
2025-08-14T21:54:45.0385948Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:45.0387028Z TIMING: _recursive_pre_grad_passes:0.04058 _recursive_joint_graph_passes:0.42694 _recursive_post_grad_passes:0.08581 async_compile.wait:1.02047 code_gen:9.87812 inductor_compile:11.51858 backend_compile:17.06603 gc:0.00135 entire_frame_compile:19.57787 total_wall_time:19.57787
2025-08-14T21:54:45.0388318Z STATS: call_* op count: 371 | FakeTensorMode.__torch_dispatch__:27561 | FakeTensor.__torch_dispatch__:3322 | ProxyTorchDispatchMode.__torch_dispatch__:7479
2025-08-14T21:54:45.0388914Z Dynamo produced 1 graphs covering 371 ops with 0 graph breaks (0 unique)
2025-08-14T21:54:50.9574190Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:54:50.9575910Z   from pkg_resources import resource_filename
2025-08-14T21:54:51.5985251Z 
2025-08-14T21:54:58.0517160Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:54:58.0517482Z loading model: 0it [00:06, ?it/s]
2025-08-14T21:54:58.0519446Z cpu eval PegasusForConditionalGeneration
2025-08-14T21:54:58.8213725Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:59.2287344Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:54:59.5095625Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
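The "Compilation time (from dynamo_timed)", TIMING, and "Dynamo produced 1 graphs covering 371 ops with 0 graph breaks" lines above are the benchmark harness's per-model compile summary. To get the same kind of graph/op/graph-break breakdown outside the harness, a minimal sketch is torch._dynamo.explain; the exact explain call signature has varied across PyTorch 2.x releases (the form below matches recent ones), and fn/x are just placeholder names, not anything from this job.

    import torch

    def fn(x):
        # A tiny stand-in workload; any torch.compile-able callable works here.
        y = torch.nn.functional.gelu(x @ x.T)
        return y.sum()

    x = torch.randn(8, 8)

    # ExplainOutput reports graph count, op count, and graph-break reasons --
    # the quantities summarized by the "Dynamo produced N graphs ..." log line.
    explanation = torch._dynamo.explain(fn)(x)
    print(explanation)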
2025-08-14T21:55:25.0355621Z cudagraph partition due to non gpu ops
[This warning repeats through 2025-08-14T21:55:25.0364460Z.]
2025-08-14T21:55:25.0364728Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:55:25.0365155Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:25.0365526Z     return mod(**inputs)
2025-08-14T21:55:25.0365984Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward
2025-08-14T21:55:25.0366434Z     outputs = self.model(
2025-08-14T21:55:25.0366927Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward
2025-08-14T21:55:25.0367394Z     encoder_outputs = self.encoder(
2025-08-14T21:55:25.0367848Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward
2025-08-14T21:55:25.0368309Z     layer_outputs = encoder_layer(
2025-08-14T21:55:25.0368910Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:55:25.0369341Z     return super().__call__(*args, **kwargs)
2025-08-14T21:55:25.0369871Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward
2025-08-14T21:55:25.0370368Z     hidden_states, attn_weights = self.self_attn(
2025-08-14T21:55:25.0370839Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward
2025-08-14T21:55:25.0371328Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:55:25.0371819Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-08-14T21:55:25.0372355Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-08-14T21:55:25.0372567Z 
2025-08-14T21:55:25.0372701Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:55:25.0373108Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:25.0373470Z     return mod(**inputs)
2025-08-14T21:55:25.0373889Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward
2025-08-14T21:55:25.0374339Z     outputs = self.model(
2025-08-14T21:55:25.0374752Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward
2025-08-14T21:55:25.0375192Z     encoder_outputs = self.encoder(
2025-08-14T21:55:25.0375638Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward
2025-08-14T21:55:25.0376078Z     layer_outputs = encoder_layer(
2025-08-14T21:55:25.0376458Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:55:25.0376868Z     return super().__call__(*args, **kwargs)
2025-08-14T21:55:25.0377318Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward
2025-08-14T21:55:25.0377783Z     hidden_states, attn_weights = self.self_attn(
2025-08-14T21:55:25.0378245Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward
2025-08-14T21:55:25.0378718Z     attn_output, attn_weights = attention_interface(
2025-08-14T21:55:25.0379212Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-08-14T21:55:25.0379709Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:55:25.0379911Z 
2025-08-14T21:55:25.0380005Z cudagraph partition due to non gpu ops
[From 2025-08-14T21:55:25.0380252Z through 2025-08-14T21:55:25.0418811Z the warnings and the two encoder self-attention stacks above repeat.]
2025-08-14T21:55:25.0419074Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:55:25.0419476Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:25.0419838Z     return mod(**inputs)
2025-08-14T21:55:25.0420251Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward
2025-08-14T21:55:25.0420691Z     outputs = self.model(
2025-08-14T21:55:25.0421108Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward
2025-08-14T21:55:25.0421555Z     encoder_outputs = self.encoder(
2025-08-14T21:55:25.0421989Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward
2025-08-14T21:55:25.0422433Z     layer_outputs = encoder_layer(
2025-08-14T21:55:25.0422816Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:55:25.0423219Z     return super().__call__(*args, **kwargs)
2025-08-14T21:55:25.0423671Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 327, in forward
2025-08-14T21:55:25.0424111Z     hidden_states = residual + hidden_states
2025-08-14T21:55:25.0424280Z 
2025-08-14T21:55:25.0424375Z cudagraph partition due to non gpu ops
[From 2025-08-14T21:55:25.0424610Z through 2025-08-14T21:55:25.0433763Z further warnings and one more encoder self-attention stack (scaled_dot_product_attention at sdpa_attention.py:81) repeat.]
2025-08-14T21:55:25.0434085Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0434492Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0434853Z return mod(**inputs) 2025-08-14T21:55:25.0435267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0435708Z outputs = self.model( 2025-08-14T21:55:25.0436128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0436585Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0437013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0437462Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0437857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0438270Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0438729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0439203Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0439667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0440140Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0440632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0441143Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0441323Z 2025-08-14T21:55:25.0441423Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0441661Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0441902Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0442137Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0442372Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0442619Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0442863Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0443094Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0443312Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0443539Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0443764Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0443979Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0444205Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0444431Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0444651Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0444876Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0445148Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0445550Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0445913Z return mod(**inputs) 2025-08-14T21:55:25.0446340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0446786Z outputs = self.model( 2025-08-14T21:55:25.0447204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0447675Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0448096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0448525Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0449023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0449440Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0449880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0450327Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0450814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0451267Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0451749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0452256Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0452466Z 2025-08-14T21:55:25.0452579Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0452976Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0453340Z return mod(**inputs) 2025-08-14T21:55:25.0453777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0454212Z outputs = self.model( 2025-08-14T21:55:25.0454624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0455075Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0455501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0455948Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0456327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0456718Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0457178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0457628Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0458075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0458574Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0459060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0459566Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0459740Z 2025-08-14T21:55:25.0459838Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0460067Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0460298Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0460525Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0460767Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0461003Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0461258Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0461649Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0462012Z return mod(**inputs) 2025-08-14T21:55:25.0462427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0462885Z outputs = self.model( 2025-08-14T21:55:25.0463293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0463731Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0464160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0464589Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0464977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0465376Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0465834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 327, in forward 2025-08-14T21:55:25.0466274Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0466448Z 2025-08-14T21:55:25.0466538Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0466776Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0467003Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0467234Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0467462Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0467690Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0467912Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0468141Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0468369Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0468590Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0468851Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0469253Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0469614Z return mod(**inputs) 2025-08-14T21:55:25.0470030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0470465Z outputs = self.model( 2025-08-14T21:55:25.0470895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0471355Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0471818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0472278Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0472660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0473087Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0473594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0474051Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0474534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0474994Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0475473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0475987Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0476203Z 2025-08-14T21:55:25.0476351Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0476744Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0477098Z return mod(**inputs) 2025-08-14T21:55:25.0477498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0477924Z outputs = self.model( 2025-08-14T21:55:25.0478329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0478791Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0479216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0479648Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0480031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0480427Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0480857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0481313Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0481763Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0482215Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0482695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0483189Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0483362Z 2025-08-14T21:55:25.0483459Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0483685Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0483921Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0484149Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0484370Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0484598Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0484826Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0485046Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0485269Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0485494Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0485721Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0485939Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0486164Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0486388Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0486608Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0486835Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0487096Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0487488Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0487847Z return mod(**inputs) 2025-08-14T21:55:25.0488261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0488865Z outputs = self.model( 2025-08-14T21:55:25.0489331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0489769Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0490203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0490651Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0491036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0491456Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0491927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0492395Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0492885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0493343Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0493904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0494414Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0494619Z 2025-08-14T21:55:25.0494732Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0495130Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0495480Z return mod(**inputs) 2025-08-14T21:55:25.0495924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0496370Z outputs = self.model( 2025-08-14T21:55:25.0496781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0497215Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0497645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0498077Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0498456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0498846Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0499283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0499731Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0500173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0500636Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0501111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0501606Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0501778Z 2025-08-14T21:55:25.0501867Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0502102Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0502330Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0502548Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0502945Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0503179Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0503438Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0503825Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0504253Z return mod(**inputs) 2025-08-14T21:55:25.0504693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0505120Z outputs = self.model( 2025-08-14T21:55:25.0505539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0505973Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0506402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0506827Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0507242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0507648Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0508092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 327, in forward 2025-08-14T21:55:25.0508548Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0508709Z 2025-08-14T21:55:25.0508800Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0509067Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0509287Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0509518Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0509746Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0509965Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0510190Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0510414Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0510633Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0510863Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0511120Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0511516Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0511869Z return mod(**inputs) 2025-08-14T21:55:25.0512280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0512710Z outputs = self.model( 2025-08-14T21:55:25.0513146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0513587Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0514012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0514445Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0514821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0515217Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0515661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0516106Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0516576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0517032Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0517509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0518018Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0518220Z 2025-08-14T21:55:25.0518336Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0518732Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0519086Z return mod(**inputs) 2025-08-14T21:55:25.0519548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0519981Z outputs = self.model( 2025-08-14T21:55:25.0520388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0520825Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0521259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0521698Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0522095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0522496Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0522947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0523413Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0523876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0524338Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0524847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0525354Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0525529Z 2025-08-14T21:55:25.0525617Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0525856Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0526089Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0526320Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0526542Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0526767Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0526996Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0527216Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0527444Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0527672Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0527893Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0528121Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0528348Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0528566Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0528954Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0529198Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0529454Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0529855Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0530217Z return mod(**inputs) 2025-08-14T21:55:25.0530633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0531062Z outputs = self.model( 2025-08-14T21:55:25.0531476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0531915Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0532343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0532770Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0533151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0533550Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0534009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0534459Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0534976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0535437Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0535910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0536435Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0536640Z 2025-08-14T21:55:25.0536757Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0537153Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0537564Z return mod(**inputs) 2025-08-14T21:55:25.0538042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0538516Z outputs = self.model( 2025-08-14T21:55:25.0538921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0539343Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0539753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0540200Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0540566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0540956Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0541392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0541839Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0542276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0542726Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0543194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0543672Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0543850Z 2025-08-14T21:55:25.0543936Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0544163Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0544383Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0544625Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0544851Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0545076Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0545324Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0545721Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0546080Z return mod(**inputs) 2025-08-14T21:55:25.0546497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0546920Z outputs = self.model( 2025-08-14T21:55:25.0547324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0547751Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0548163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0548584Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0548956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0549345Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0549768Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 327, in forward 2025-08-14T21:55:25.0550236Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0550384Z 2025-08-14T21:55:25.0550474Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0550689Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0550914Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0551133Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0551618Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0551828Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0552050Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0552274Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0552511Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0552742Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0552997Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0553384Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0553742Z return mod(**inputs) 2025-08-14T21:55:25.0554173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0554615Z outputs = self.model( 2025-08-14T21:55:25.0555007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0555431Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0555845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0556268Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0556644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0557036Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0557462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0557908Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0558351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0558797Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0559265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0559758Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0559957Z 2025-08-14T21:55:25.0560070Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0560457Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0560796Z return mod(**inputs) 2025-08-14T21:55:25.0561219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0561648Z outputs = self.model( 2025-08-14T21:55:25.0562064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0562500Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0562939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0563380Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0563754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0564161Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0564609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0565096Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0565566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0566037Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0566527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0567026Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0567202Z 2025-08-14T21:55:25.0567291Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0567527Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0567775Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0567997Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0568222Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0568449Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0568666Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0568981Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0569214Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0569443Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0569663Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0569947Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0570176Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0570398Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0570627Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0570853Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0571101Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0571510Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0571882Z return mod(**inputs) 2025-08-14T21:55:25.0572302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0572742Z outputs = self.model( 2025-08-14T21:55:25.0573160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0573599Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0574024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0574471Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0574857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0575258Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0575711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0576175Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0576648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0577104Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0577580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0578101Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0578302Z 2025-08-14T21:55:25.0578424Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0578813Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0579171Z return mod(**inputs) 2025-08-14T21:55:25.0579584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0580027Z outputs = self.model( 2025-08-14T21:55:25.0580464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0580915Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0581340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0581772Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0582147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0582540Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0582997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0583449Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0583905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0584363Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0584843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0585364Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0585548Z 2025-08-14T21:55:25.0585636Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0585874Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0586097Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0586327Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0586557Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0586783Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0587032Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0587431Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0587793Z return mod(**inputs) 2025-08-14T21:55:25.0588188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0588606Z outputs = self.model( 2025-08-14T21:55:25.0589006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0589440Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0589860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0590294Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0590666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0591051Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0591490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 327, in forward 2025-08-14T21:55:25.0591936Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0592089Z 2025-08-14T21:55:25.0592183Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0592409Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0592644Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0592869Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0593089Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0593318Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0593546Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0593765Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0593994Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0594221Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0594477Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0594868Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0595256Z return mod(**inputs) 2025-08-14T21:55:25.0595684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0596117Z outputs = self.model( 2025-08-14T21:55:25.0596525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0596963Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0597404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0597841Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0598241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0598648Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0599098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0599567Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0600024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0600506Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0600982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0601502Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0601700Z 2025-08-14T21:55:25.0601823Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0602219Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0602566Z return mod(**inputs) 2025-08-14T21:55:25.0603187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0603635Z outputs = self.model( 2025-08-14T21:55:25.0604044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1279, in forward 2025-08-14T21:55:25.0604490Z encoder_outputs = self.encoder( 2025-08-14T21:55:25.0604924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 841, in forward 2025-08-14T21:55:25.0605363Z layer_outputs = encoder_layer( 2025-08-14T21:55:25.0605742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0606149Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0606593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 312, in forward 2025-08-14T21:55:25.0607045Z hidden_states, attn_weights = self.self_attn( 2025-08-14T21:55:25.0607499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0607959Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0608444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0609006Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0609685Z 2025-08-14T21:55:25.0609781Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0610022Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0610260Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0610489Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0610720Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0610953Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0611175Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0611485Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0611746Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0611972Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0612205Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0612439Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0612661Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0612909Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0613138Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0613364Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0613612Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0614041Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0614402Z return mod(**inputs) 2025-08-14T21:55:25.0614807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0615240Z outputs = self.model( 2025-08-14T21:55:25.0615686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0616121Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0616575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0617025Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0617411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0617799Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0618275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0618760Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0619253Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0619703Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0620186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0620708Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0620909Z 2025-08-14T21:55:25.0621032Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0621422Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0621777Z return mod(**inputs) 2025-08-14T21:55:25.0622213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0622636Z outputs = self.model( 2025-08-14T21:55:25.0623047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0623484Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0623913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0624339Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0624722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0625122Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0625556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0626022Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0626478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0626964Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0627443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0627929Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0628107Z 2025-08-14T21:55:25.0628193Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0628429Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0628657Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0628889Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0629119Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0629342Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0629591Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0629820Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0630038Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0630264Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0630490Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0630719Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0630966Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0631358Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0638585Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 17 times)
Found from : 2025-08-14T21:55:25.0649484Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0656847Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 13 times)
Found from : 2025-08-14T21:55:25.0667741Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0674784Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 7 times)
Found from : 2025-08-14T21:55:25.0683241Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 442, in forward:  hidden_states = residual + hidden_states
  cudagraph partition due to non gpu ops.  (repeated 11 times)
Found from : 2025-08-14T21:55:25.0690922Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0716111Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 13 times)
Found from : 2025-08-14T21:55:25.0726400Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0733795Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 17 times)
Found from : 2025-08-14T21:55:25.0744461Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0751727Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 3 times)
Found from : 2025-08-14T21:55:25.0759143Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 416, in forward:  hidden_states = residual + hidden_states
  cudagraph partition due to non gpu ops.  (repeated 11 times)
Found from : 2025-08-14T21:55:25.0762254Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0765440Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 17 times)
Found from : 2025-08-14T21:55:25.0770103Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0773238Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 13 times)
Found from : 2025-08-14T21:55:25.0777456Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0780607Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 3 times)
Found from : 2025-08-14T21:55:25.0783978Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 433, in forward:  hidden_states = residual + hidden_states
  cudagraph partition due to non gpu ops.  (repeated 15 times)
Found from : 2025-08-14T21:55:25.0787299Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0790515Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 13 times)
Found from : 2025-08-14T21:55:25.0794670Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0797907Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 7 times)
Found from : 2025-08-14T21:55:25.0801557Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 442, in forward:  hidden_states = residual + hidden_states
  cudagraph partition due to non gpu ops.  (repeated 11 times)
Found from : 2025-08-14T21:55:25.0804944Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0808147Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 407, in forward:  hidden_states, self_attn_weights = self.self_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 91, in sdpa_attention_forward:  attn_output = attn_output.transpose(1, 2).contiguous()
  cudagraph partition due to non gpu ops.  (repeated 13 times)
Found from : 2025-08-14T21:55:25.0812418Z (frames 1-5 as in the first stack above), then:
    modeling_pegasus.py, line 424, in forward:  hidden_states, cross_attn_weights = self.encoder_attn(
    modeling_pegasus.py, line 253, in forward:  attn_output, attn_weights = attention_interface(
    sdpa_attention.py, line 81, in sdpa_attention_forward:  attn_output = torch.nn.functional.scaled_dot_product_attention(
  cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:25.0815604Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0815678Z return mod(**inputs) 2025-08-14T21:55:25.0815972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0816047Z outputs = self.model( 2025-08-14T21:55:25.0816339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0816444Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0816718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0816795Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0817037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0817123Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0817412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0817530Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0817820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0817930Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0818237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0818356Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0818360Z 2025-08-14T21:55:25.0818445Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818528Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818618Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818700Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818780Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818869Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0818950Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819033Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819119Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819198Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819287Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819367Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819447Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819534Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819613Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819691Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0819807Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0820017Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0820090Z return mod(**inputs) 2025-08-14T21:55:25.0820374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0820532Z outputs = self.model( 2025-08-14T21:55:25.0820819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0820900Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0821183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0821269Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0821507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0821621Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0821913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0822020Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0822328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0822433Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0822766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0822909Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0822913Z 2025-08-14T21:55:25.0823026Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0823251Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0823329Z return mod(**inputs) 2025-08-14T21:55:25.0823633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0823718Z outputs = self.model( 2025-08-14T21:55:25.0824008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0824096Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0824381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0824462Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0824706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0824793Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0825074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0825189Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0825496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0825611Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0825932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0826047Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0826051Z 2025-08-14T21:55:25.0826142Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0826223Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0826339Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0826548Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0826620Z return mod(**inputs) 2025-08-14T21:55:25.0826902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0826974Z outputs = self.model( 2025-08-14T21:55:25.0827297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0827386Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0827660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0827747Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0827976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0828062Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0828367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 416, in forward 2025-08-14T21:55:25.0828459Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0828463Z 2025-08-14T21:55:25.0828545Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0828639Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0828724Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0828815Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0828897Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0828997Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0829084Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0829166Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0829246Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0829336Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0829445Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0829660Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0829738Z return mod(**inputs) 2025-08-14T21:55:25.0830031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0830109Z outputs = self.model( 2025-08-14T21:55:25.0830386Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0830463Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0830745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0830830Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0831060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0831152Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0831436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0831552Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0831838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0831957Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0832267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0832413Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0832416Z 2025-08-14T21:55:25.0832527Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0832741Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0832823Z return mod(**inputs) 2025-08-14T21:55:25.0833107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0833181Z outputs = self.model( 2025-08-14T21:55:25.0833471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0833596Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0833891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0833974Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0834215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0834313Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0834602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0834748Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0835034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0835139Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0835462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0835578Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0835610Z 2025-08-14T21:55:25.0835698Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0835792Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0835877Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0835967Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836051Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836133Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836226Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836309Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836392Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836482Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836566Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836652Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836743Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836824Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836914Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0836995Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0837108Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0837332Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0837406Z return mod(**inputs) 2025-08-14T21:55:25.0837697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0837779Z outputs = self.model( 2025-08-14T21:55:25.0838064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0838155Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0838441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0838519Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0838769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0838859Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0839142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0839258Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0839545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0839657Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0840009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0840151Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0840156Z 2025-08-14T21:55:25.0840276Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0840491Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0840571Z return mod(**inputs) 2025-08-14T21:55:25.0840860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0840934Z outputs = self.model( 2025-08-14T21:55:25.0841245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0841329Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0841616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0841707Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0841944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0842065Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0842351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0842458Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0842755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0842857Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0843175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0843295Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0843299Z 2025-08-14T21:55:25.0843384Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843477Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843564Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843648Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843738Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843819Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843903Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0843992Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0844074Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0844163Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0844243Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0844324Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0844444Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0844661Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0844733Z return mod(**inputs) 2025-08-14T21:55:25.0845026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0845101Z outputs = self.model( 2025-08-14T21:55:25.0845392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0845470Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0845776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0845862Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0846098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0846206Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0846513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0846634Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0846928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0847031Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0847344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0847507Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0847511Z 2025-08-14T21:55:25.0847626Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0847850Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0847924Z return mod(**inputs) 2025-08-14T21:55:25.0848231Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0848313Z outputs = self.model( 2025-08-14T21:55:25.0848615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0848778Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0849080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0849158Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0849407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0849493Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0849775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0849905Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0850188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0850300Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0850610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0850723Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0850728Z 2025-08-14T21:55:25.0850825Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0850909Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0851024Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0851247Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0851322Z return mod(**inputs) 2025-08-14T21:55:25.0851631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0851707Z outputs = self.model( 2025-08-14T21:55:25.0851990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0852081Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0852384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0852462Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0852708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0852796Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0853091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 433, in forward 2025-08-14T21:55:25.0853225Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0853230Z 2025-08-14T21:55:25.0853315Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853405Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853488Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853577Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853658Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853739Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853838Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0853915Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854039Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854126Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854205Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854285Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854373Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854453Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0854561Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0854775Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0854868Z return mod(**inputs) 2025-08-14T21:55:25.0855154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0855228Z outputs = self.model( 2025-08-14T21:55:25.0855518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0855605Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0855899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0855988Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0856228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0856314Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0856612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0856719Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0857013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0857126Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0857449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0857605Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0857611Z 2025-08-14T21:55:25.0857720Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0857950Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0858028Z return mod(**inputs) 2025-08-14T21:55:25.0858321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0858401Z outputs = self.model( 2025-08-14T21:55:25.0858688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0858765Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0859061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0859142Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0859381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0859520Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0859815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0859931Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0860223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0860327Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0860671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0860790Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0860794Z 2025-08-14T21:55:25.0860887Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0860972Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861057Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861148Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861229Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861309Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861427Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861509Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861599Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861686Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861765Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861851Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0861963Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0862191Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0862267Z return mod(**inputs) 2025-08-14T21:55:25.0862556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0862631Z outputs = self.model( 2025-08-14T21:55:25.0862915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0862994Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0863276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0863351Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0863583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0863677Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0863952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0864066Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0864354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0864457Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0864775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0864913Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0864917Z 2025-08-14T21:55:25.0865030Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0865257Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0865328Z return mod(**inputs) 2025-08-14T21:55:25.0865618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0865728Z outputs = self.model( 2025-08-14T21:55:25.0866021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0866111Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0866392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0866469Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0866706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0866791Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0867099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0867214Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0867487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0867597Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0867896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0868037Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0868041Z 2025-08-14T21:55:25.0868125Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868207Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868298Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868377Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868457Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868542Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0868649Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0868858Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0868937Z return mod(**inputs) 2025-08-14T21:55:25.0869214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0869293Z outputs = self.model( 2025-08-14T21:55:25.0869569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0869646Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0869931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0870006Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0870246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0870331Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0870609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 442, in forward 2025-08-14T21:55:25.0870705Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0870708Z 2025-08-14T21:55:25.0870788Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0870870Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0870959Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871039Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871128Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871207Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871286Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871374Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871454Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871534Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0871651Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0871862Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0871972Z return mod(**inputs) 2025-08-14T21:55:25.0872266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0872343Z outputs = self.model( 2025-08-14T21:55:25.0872633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0872714Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0873017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0873119Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0873359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0873447Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0873754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0873862Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0874157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0874282Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0874588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0874735Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0874739Z 2025-08-14T21:55:25.0874853Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0875074Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0875148Z return mod(**inputs) 2025-08-14T21:55:25.0875444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0875527Z outputs = self.model( 2025-08-14T21:55:25.0875808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0875897Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0876188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0876265Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0876510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0876598Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0876885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0877003Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0877296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0877406Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0877718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0877833Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0877837Z 2025-08-14T21:55:25.0877932Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878019Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878103Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878192Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878274Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878364Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878471Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878569Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878660Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878741Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878825Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0878914Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0879024Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0879253Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0879334Z return mod(**inputs) 2025-08-14T21:55:25.0879657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0879742Z outputs = self.model( 2025-08-14T21:55:25.0880036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0880121Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0880412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0880511Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0880762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0880850Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0881136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0881263Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0881560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0881664Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0881993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0882133Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0882138Z 2025-08-14T21:55:25.0882260Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0882494Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0882566Z return mod(**inputs) 2025-08-14T21:55:25.0882875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0882952Z outputs = self.model( 2025-08-14T21:55:25.0883250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0883331Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0883626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0883709Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0883954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0884043Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0884342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0884458Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0884753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0884855Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0885176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0885347Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0885352Z 2025-08-14T21:55:25.0885439Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885530Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885614Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885695Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885786Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885866Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0885947Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886037Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886134Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886218Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886308Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886388Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886469Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886562Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886644Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886732Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0886845Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0887083Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0887165Z return mod(**inputs) 2025-08-14T21:55:25.0887468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0887543Z outputs = self.model( 2025-08-14T21:55:25.0887857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0887939Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0888229Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0888311Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0888551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0888654Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0889016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0889141Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0889425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0889532Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0889852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0889994Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0890000Z 2025-08-14T21:55:25.0890112Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0890336Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0890411Z return mod(**inputs) 2025-08-14T21:55:25.0890715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0890792Z outputs = self.model( 2025-08-14T21:55:25.0891094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0891187Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0891490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0891579Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0891887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0891978Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0892265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 407, in forward 2025-08-14T21:55:25.0892375Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:55:25.0892658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0892769Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0893097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0893224Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0893228Z 2025-08-14T21:55:25.0893315Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0893402Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0893525Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0893742Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0893837Z return mod(**inputs) 2025-08-14T21:55:25.0894128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0894202Z outputs = self.model( 2025-08-14T21:55:25.0894493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0894575Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0894859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0894946Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0895187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0895282Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0895564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 416, in forward 2025-08-14T21:55:25.0895653Z hidden_states = residual + hidden_states 2025-08-14T21:55:25.0895657Z 2025-08-14T21:55:25.0895749Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0895831Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0895913Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896005Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896086Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896173Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896254Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896335Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896423Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896505Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0896616Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0896840Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0896915Z return mod(**inputs) 2025-08-14T21:55:25.0897199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0897282Z outputs = self.model( 2025-08-14T21:55:25.0897565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0897650Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0897932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0898038Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0898302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0898391Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0898683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0898802Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0899084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0899192Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0899515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-08-14T21:55:25.0899657Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:25.0899669Z 2025-08-14T21:55:25.0899782Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:25.0899998Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:25.0900098Z return mod(**inputs) 2025-08-14T21:55:25.0900383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1471, in forward 2025-08-14T21:55:25.0900457Z outputs = self.model( 2025-08-14T21:55:25.0900748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1297, in forward 2025-08-14T21:55:25.0900829Z decoder_outputs = self.decoder( 2025-08-14T21:55:25.0901121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1115, in forward 2025-08-14T21:55:25.0901200Z layer_outputs = decoder_layer( 2025-08-14T21:55:25.0901435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:25.0901533Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:25.0901816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 424, in forward 2025-08-14T21:55:25.0901935Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-08-14T21:55:25.0902225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 253, in forward 2025-08-14T21:55:25.0902328Z attn_output, attn_weights = attention_interface( 2025-08-14T21:55:25.0902904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-08-14T21:55:25.0903028Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:55:25.0903032Z 2025-08-14T21:55:25.0903118Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903215Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903301Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903384Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903477Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903562Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903652Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903733Z cudagraph partition due to non gpu ops 2025-08-14T21:55:25.0903846Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:55:25.0904075Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:25.0904149Z return mod(**inputs)
2025-08-14T21:55:25.0904433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/pegasus/modeling_pegasus.py", line 1494, in forward
2025-08-14T21:55:25.0904625Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-08-14T21:55:25.0904696Z
2025-08-14T21:55:38.0669640Z Compilation time (from dynamo_timed): 37.161387587
2025-08-14T21:55:38.0677755Z pass
2025-08-14T21:55:38.0678262Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:55:38.0679250Z TIMING: _recursive_pre_grad_passes:0.09664 _recursive_joint_graph_passes:0.89455 _recursive_post_grad_passes:0.17911 async_compile.wait:1.10327 code_gen:12.93248 inductor_compile:16.17859 backend_compile:30.65715 gc:0.00091 entire_frame_compile:37.16139 total_wall_time:37.16139
2025-08-14T21:55:38.0680238Z STATS: call_* op count: 967 | FakeTensorMode.__torch_dispatch__:69867 | FakeTensor.__torch_dispatch__:8151 | ProxyTorchDispatchMode.__torch_dispatch__:19145
2025-08-14T21:55:38.0680862Z Dynamo produced 1 graphs covering 967 ops with 0 graph breaks (0 unique)
2025-08-14T21:55:44.3686040Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:55:44.3687228Z from pkg_resources import resource_filename
2025-08-14T21:55:44.9687067Z
2025-08-14T21:55:44.9810099Z loading model: 0it [00:00, ?it/s]If you want to use `RobertaLMHeadModel` as a standalone, add `is_decoder=True.`
2025-08-14T21:55:44.9813616Z WARNING:transformers.models.roberta.modeling_roberta:If you want to use `RobertaLMHeadModel` as a standalone, add `is_decoder=True.`
2025-08-14T21:55:46.7285800Z We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked.
2025-08-14T21:55:46.7286796Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded.
2025-08-14T21:55:46.7287783Z WARNING:transformers.modeling_utils:We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked.
2025-08-14T21:55:46.7288955Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded.
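The two transformers warnings just above (`is_decoder=True` for a standalone `RobertaLMHeadModel`, and the missing `attention_mask` for possibly padded `input_ids`) are routine. A minimal sketch of how they are usually silenced; the `roberta-base` checkpoint and the example sentences are illustrative and are not what the benchmark harness actually loads:

```python
import torch
from transformers import AutoTokenizer, RobertaConfig, RobertaForCausalLM

# Illustrative checkpoint; the benchmark builds its own model/config.
config = RobertaConfig.from_pretrained("roberta-base", is_decoder=True)  # addresses the is_decoder warning
model = RobertaForCausalLM.from_pretrained("roberta-base", config=config).eval()

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
enc = tokenizer(["a short sentence", "a much longer sentence that forces padding"],
                padding=True, return_tensors="pt")

with torch.no_grad():
    # Passing attention_mask explicitly avoids the "input_ids may be padded" warning,
    # since padded positions are then excluded from attention.
    out = model(input_ids=enc["input_ids"], attention_mask=enc["attention_mask"])
```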
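For reference, the `sdpa_attention_forward` frames that keep appearing in the partition messages (in the Pegasus traces above and again in the RoBERTa traces below) bottom out in PyTorch's fused attention kernel. A self-contained sketch of that call path, with made-up shapes rather than the benchmark's real tensors:

```python
import torch
import torch.nn.functional as F

# Illustrative (batch, num_heads, seq_len, head_dim) shapes, not the benchmark's.
query = torch.randn(2, 16, 128, 64)
key = torch.randn(2, 16, 128, 64)
value = torch.randn(2, 16, 128, 64)

# The scaled_dot_product_attention frame in the trace ...
attn_output = F.scaled_dot_product_attention(query, key, value,
                                             attn_mask=None, dropout_p=0.0, is_causal=False)
# ... followed by the transpose(1, 2).contiguous() frame, which restores the
# (batch, seq_len, num_heads, head_dim) layout before the output projection.
attn_output = attn_output.transpose(1, 2).contiguous()
```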
2025-08-14T21:55:46.8696876Z
2025-08-14T21:55:46.8697741Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:55:46.8698056Z cpu eval RobertaForCausalLM
2025-08-14T21:55:47.4222633Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:55:47.6008512Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:55:47.7762658Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:55:58.8109479Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:55:58.8110636Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:58.8111079Z return mod(**inputs)
2025-08-14T21:55:58.8111560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward
2025-08-14T21:55:58.8112018Z outputs = self.roberta(
2025-08-14T21:55:58.8112447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward
2025-08-14T21:55:58.8112921Z embedding_output = self.embeddings(
2025-08-14T21:55:58.8113410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward
2025-08-14T21:55:58.8113991Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length)
2025-08-14T21:55:58.8114991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1576, in create_position_ids_from_input_ids
2025-08-14T21:55:58.8115595Z mask = input_ids.ne(padding_idx).int()
2025-08-14T21:55:58.8115750Z
2025-08-14T21:55:58.8115854Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8116123Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8116365Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8116614Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8116842Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8117073Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8117298Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8117576Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8117801Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8118024Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8118247Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8118467Z cudagraph partition due to non gpu ops
2025-08-14T21:55:58.8118730Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:55:58.8119141Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8119553Z return mod(**inputs) 2025-08-14T21:55:58.8119984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8120426Z outputs = self.roberta( 2025-08-14T21:55:58.8120849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward 2025-08-14T21:55:58.8121280Z embedding_output = self.embeddings( 2025-08-14T21:55:58.8121730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward 2025-08-14T21:55:58.8122312Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-08-14T21:55:58.8122980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1577, in create_position_ids_from_input_ids 2025-08-14T21:55:58.8123610Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-08-14T21:55:58.8123878Z 2025-08-14T21:55:58.8123998Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:55:58.8124395Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8124746Z return mod(**inputs) 2025-08-14T21:55:58.8125155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8125578Z outputs = self.roberta( 2025-08-14T21:55:58.8125994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward 2025-08-14T21:55:58.8126441Z embedding_output = self.embeddings( 2025-08-14T21:55:58.8126895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward 2025-08-14T21:55:58.8127480Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-08-14T21:55:58.8128114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1577, in create_position_ids_from_input_ids 2025-08-14T21:55:58.8129011Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-08-14T21:55:58.8129297Z 2025-08-14T21:55:58.8129391Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8129632Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8129860Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8130080Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8130359Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8130617Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8130865Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8131122Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8131529Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8131901Z return mod(**inputs) 2025-08-14T21:55:58.8132322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8132750Z outputs = self.roberta( 2025-08-14T21:55:58.8133211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8133649Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8134085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8134510Z layer_outputs = layer_module( 2025-08-14T21:55:58.8134897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8135298Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8135769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8136225Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8136656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8137109Z return func(*args, **kwargs) 2025-08-14T21:55:58.8137531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8137965Z self_outputs = self.self( 2025-08-14T21:55:58.8138369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8138778Z return func(*args, **kwargs) 2025-08-14T21:55:58.8139201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8139708Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8139911Z 2025-08-14T21:55:58.8140015Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8140249Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8140480Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8140711Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8140933Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8141163Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8141404Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8141628Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8141843Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8142068Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8142291Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8142508Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8142737Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8143005Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8143422Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8143781Z return mod(**inputs) 2025-08-14T21:55:58.8144197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8144629Z outputs = self.roberta( 2025-08-14T21:55:58.8145041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8145479Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8145906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8146390Z layer_outputs = layer_module( 2025-08-14T21:55:58.8146776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8147203Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8147652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8148080Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8148499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8148938Z return func(*args, **kwargs) 2025-08-14T21:55:58.8149352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8149775Z self_outputs = self.self( 2025-08-14T21:55:58.8150178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8150581Z return func(*args, **kwargs) 2025-08-14T21:55:58.8150984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8151488Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8151694Z 2025-08-14T21:55:58.8151781Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8152010Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8152229Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8152454Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8152675Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8152894Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8153112Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8153331Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8153546Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8153807Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8154035Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8154245Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8154464Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8154721Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8155117Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8155471Z return mod(**inputs) 2025-08-14T21:55:58.8155884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8156317Z outputs = self.roberta( 2025-08-14T21:55:58.8156723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8157161Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8157587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8158022Z layer_outputs = layer_module( 2025-08-14T21:55:58.8158401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8158797Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8159232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8159675Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8160085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8160489Z return func(*args, **kwargs) 2025-08-14T21:55:58.8160910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8161386Z self_outputs = self.self( 2025-08-14T21:55:58.8161788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8162192Z return func(*args, **kwargs) 2025-08-14T21:55:58.8162611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8163095Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8163300Z 2025-08-14T21:55:58.8163385Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8163621Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8163884Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8164109Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8164331Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8164552Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8164783Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8165007Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8165230Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8165448Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8165704Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8165928Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8166147Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8166403Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8166803Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8167155Z return mod(**inputs) 2025-08-14T21:55:58.8167583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8168017Z outputs = self.roberta( 2025-08-14T21:55:58.8168442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8168963Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8169406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8169853Z layer_outputs = layer_module( 2025-08-14T21:55:58.8170226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8170632Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8171079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8171540Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8171960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8172371Z return func(*args, **kwargs) 2025-08-14T21:55:58.8172840Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8173268Z self_outputs = self.self( 2025-08-14T21:55:58.8173670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8174088Z return func(*args, **kwargs) 2025-08-14T21:55:58.8174500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8174968Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8175171Z 2025-08-14T21:55:58.8175257Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8175483Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8175707Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8175918Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8176171Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8176410Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8176623Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8176841Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8177062Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8177272Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8177490Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8177710Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8177929Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8178179Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8178583Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8178934Z return mod(**inputs) 2025-08-14T21:55:58.8179334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8179832Z outputs = self.roberta( 2025-08-14T21:55:58.8180234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8180653Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8181083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8181501Z layer_outputs = layer_module( 2025-08-14T21:55:58.8181868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8182243Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8182663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8183099Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8183498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8183889Z return func(*args, **kwargs) 2025-08-14T21:55:58.8184290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8184702Z self_outputs = self.self( 2025-08-14T21:55:58.8185078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8185476Z return func(*args, **kwargs) 2025-08-14T21:55:58.8185876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8186363Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8186556Z 2025-08-14T21:55:58.8186638Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8186861Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8187083Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8187293Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8187508Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8187723Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8187941Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8188155Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8188368Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8188587Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8188803Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8189023Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8189252Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8189490Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8189873Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8190215Z return mod(**inputs) 2025-08-14T21:55:58.8190625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8191063Z outputs = self.roberta( 2025-08-14T21:55:58.8191459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8191880Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8192283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8192706Z layer_outputs = layer_module( 2025-08-14T21:55:58.8193077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8193478Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8193895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8194327Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8194730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8195113Z return func(*args, **kwargs) 2025-08-14T21:55:58.8195539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8195953Z self_outputs = self.self( 2025-08-14T21:55:58.8196337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8196723Z return func(*args, **kwargs) 2025-08-14T21:55:58.8197131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8197607Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8197797Z 2025-08-14T21:55:58.8197894Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8198116Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8198345Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8198566Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8198785Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8199009Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8199233Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8199516Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8199740Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8199966Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8200175Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8200386Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8200592Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8200836Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8201206Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8201530Z return mod(**inputs) 2025-08-14T21:55:58.8201922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8202338Z outputs = self.roberta( 2025-08-14T21:55:58.8202948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8203400Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8203845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8204277Z layer_outputs = layer_module( 2025-08-14T21:55:58.8204658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8205047Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8205505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8206100Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8206519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8206928Z return func(*args, **kwargs) 2025-08-14T21:55:58.8207365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8207804Z self_outputs = self.self( 2025-08-14T21:55:58.8208190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8208630Z return func(*args, **kwargs) 2025-08-14T21:55:58.8209124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8209626Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8209837Z 2025-08-14T21:55:58.8209925Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8210158Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8210387Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8211206Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8211434Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8211664Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8211906Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8212140Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8212375Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8212597Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8212830Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8213060Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8213290Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8213545Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8213954Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8214317Z return mod(**inputs) 2025-08-14T21:55:58.8214722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8215159Z outputs = self.roberta( 2025-08-14T21:55:58.8215579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8216011Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8216439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8216874Z layer_outputs = layer_module( 2025-08-14T21:55:58.8217260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8217659Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8218090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8218521Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8218929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8219325Z return func(*args, **kwargs) 2025-08-14T21:55:58.8219737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8220155Z self_outputs = self.self( 2025-08-14T21:55:58.8220538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8220937Z return func(*args, **kwargs) 2025-08-14T21:55:58.8221343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8221895Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8222089Z 2025-08-14T21:55:58.8222172Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8222403Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8222627Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8222840Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8223057Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8223271Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8223485Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8223695Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8223936Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8224161Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8224371Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8224590Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8224808Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8225050Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8225434Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8225811Z return mod(**inputs) 2025-08-14T21:55:58.8226210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8226618Z outputs = self.roberta( 2025-08-14T21:55:58.8227016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8227432Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8227836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8228253Z layer_outputs = layer_module( 2025-08-14T21:55:58.8228619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8229004Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8229415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8229841Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8230248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8230632Z return func(*args, **kwargs) 2025-08-14T21:55:58.8231036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8231553Z self_outputs = self.self( 2025-08-14T21:55:58.8231948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8232328Z return func(*args, **kwargs) 2025-08-14T21:55:58.8232733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8233204Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8233399Z 2025-08-14T21:55:58.8233491Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8233709Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8233931Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8234147Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8234358Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8234573Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8234791Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8234996Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8235216Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8235429Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8235636Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8235881Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8236118Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8236371Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8236748Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8237092Z return mod(**inputs) 2025-08-14T21:55:58.8237485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8237892Z outputs = self.roberta( 2025-08-14T21:55:58.8238307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8238728Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8239142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8239557Z layer_outputs = layer_module( 2025-08-14T21:55:58.8239933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8240324Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8240756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8241184Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8241587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8241978Z return func(*args, **kwargs) 2025-08-14T21:55:58.8242379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8242790Z self_outputs = self.self( 2025-08-14T21:55:58.8243175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8243573Z return func(*args, **kwargs) 2025-08-14T21:55:58.8243966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8244454Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8244652Z 2025-08-14T21:55:58.8244744Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8244965Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8245191Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8245412Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8245628Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8245855Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8246076Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8246297Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8246512Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8246735Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8246958Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8247173Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8247393Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8247646Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8248034Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8248385Z return mod(**inputs) 2025-08-14T21:55:58.8248871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8249326Z outputs = self.roberta( 2025-08-14T21:55:58.8249729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8250170Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8250617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8251058Z layer_outputs = layer_module( 2025-08-14T21:55:58.8251437Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8251836Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8252331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8252782Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8253214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8253622Z return func(*args, **kwargs) 2025-08-14T21:55:58.8254036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8254457Z self_outputs = self.self( 2025-08-14T21:55:58.8254851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8255254Z return func(*args, **kwargs) 2025-08-14T21:55:58.8255663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8256176Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8256373Z 2025-08-14T21:55:58.8256457Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8256679Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8256890Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8257107Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8257322Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8257528Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8257744Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8257960Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8258169Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8258385Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8258599Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8258810Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8259025Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8259268Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:55:58.8259649Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:55:58.8259984Z return mod(**inputs) 2025-08-14T21:55:58.8260374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 999, in forward 2025-08-14T21:55:58.8260783Z outputs = self.roberta( 2025-08-14T21:55:58.8261169Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:55:58.8261584Z encoder_outputs = self.encoder( 2025-08-14T21:55:58.8261990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:55:58.8262403Z layer_outputs = layer_module( 2025-08-14T21:55:58.8262761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:55:58.8263140Z return super().__call__(*args, **kwargs) 2025-08-14T21:55:58.8263556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:55:58.8263980Z self_attention_outputs = self.attention( 2025-08-14T21:55:58.8264377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8264765Z return func(*args, **kwargs) 2025-08-14T21:55:58.8265166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:55:58.8265612Z self_outputs = self.self( 2025-08-14T21:55:58.8265996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:55:58.8266380Z return func(*args, **kwargs) 2025-08-14T21:55:58.8266780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:55:58.8267241Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:55:58.8267440Z 2025-08-14T21:55:58.8267522Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8267760Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8267982Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8268206Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8268430Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8268653Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8268868Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8269094Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8269314Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8269533Z cudagraph partition due to non gpu ops 2025-08-14T21:55:58.8269804Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:55:58.8270182Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:55:58.8270517Z return mod(**inputs)
2025-08-14T21:55:58.8270920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1022, in forward
2025-08-14T21:55:58.8271344Z lm_loss = self.loss_function(
2025-08-14T21:55:58.8271731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss
2025-08-14T21:55:58.8272221Z loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs)
2025-08-14T21:55:58.8272730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy
2025-08-14T21:55:58.8273259Z loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction)
2025-08-14T21:55:58.8273519Z
2025-08-14T21:56:07.8508696Z Compilation time (from dynamo_timed): 18.651484983
2025-08-14T21:56:07.8609394Z pass
2025-08-14T21:56:07.8609838Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:56:07.8610734Z TIMING: _recursive_pre_grad_passes:0.03847 _recursive_joint_graph_passes:0.42171 _recursive_post_grad_passes:0.08487 async_compile.wait:0.8549 code_gen:8.33344 inductor_compile:10.28629 backend_compile:15.68241 gc:0.00015 entire_frame_compile:18.65148 total_wall_time:18.65148
2025-08-14T21:56:07.8611722Z STATS: call_* op count: 305 | FakeTensorMode.__torch_dispatch__:27119 | FakeTensor.__torch_dispatch__:3318 | ProxyTorchDispatchMode.__torch_dispatch__:7255
2025-08-14T21:56:07.8612274Z Dynamo produced 1 graphs covering 305 ops with 0 graph breaks (0 unique)
2025-08-14T21:56:13.5395103Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:56:13.5396077Z from pkg_resources import resource_filename
2025-08-14T21:56:14.1475878Z
2025-08-14T21:56:15.3417292Z loading model: 0it [00:00, ?it/s]We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked.
2025-08-14T21:56:15.3418277Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded.
2025-08-14T21:56:15.3419598Z WARNING:transformers.modeling_utils:We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked.
2025-08-14T21:56:15.3420507Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded.
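The `ForCausalLMLoss` / `fixed_cross_entropy` frames above reduce to a plain token-level cross-entropy over shifted logits and labels. A self-contained sketch with toy shapes; the shift-by-one convention and the `-100` ignore index are the usual transformers defaults, assumed here rather than read from the log:

```python
import torch
import torch.nn as nn

vocab_size = 50265                               # RoBERTa vocab size, used here only for shape
logits = torch.randn(2, 128, vocab_size)         # (batch, seq, vocab) from the LM head
labels = torch.randint(0, vocab_size, (2, 128))  # toy labels

# Causal LM: position t predicts token t+1, so drop the last logit and the first label.
shift_logits = logits[:, :-1, :].contiguous()
shift_labels = labels[:, 1:].contiguous()

# This is essentially the nn.functional.cross_entropy call in the fixed_cross_entropy frame.
loss = nn.functional.cross_entropy(
    shift_logits.view(-1, vocab_size),
    shift_labels.view(-1),
    ignore_index=-100,
    reduction="mean",
)
```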
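Several of the 21:55:58 partition messages above point into `create_position_ids_from_input_ids` (modeling_roberta.py lines 1576-1577), whose index arithmetic is what the partitioner flags as non-GPU work. The two logged statements can be reproduced in isolation; the toy batch below and the final `+ padding_idx` offset are not in the log (the offset is recalled from the transformers source), so treat this as a sketch:

```python
import torch

padding_idx = 1              # RoBERTa's pad token id
past_key_values_length = 0
# Toy batch: two sequences padded with padding_idx.
input_ids = torch.tensor([[0, 31414, 232, 2, 1, 1],
                          [0, 31414, 2, 1, 1, 1]])

# The two statements from the logged frames:
mask = input_ids.ne(padding_idx).int()
incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
# Offset so real tokens start at padding_idx + 1 (not shown in the trace, assumed from the library).
position_ids = incremental_indices.long() + padding_idx
```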
2025-08-14T21:56:15.4447065Z
2025-08-14T21:56:15.4448017Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:56:15.4448365Z cpu eval RobertaForQuestionAnswering
2025-08-14T21:56:15.8673561Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:56:15.9805762Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:56:16.0908578Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:56:27.0808400Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:27.0809582Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:27.0810379Z return mod(**inputs)
2025-08-14T21:56:27.0810835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward
2025-08-14T21:56:27.0811295Z outputs = self.roberta(
2025-08-14T21:56:27.0811707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward
2025-08-14T21:56:27.0812177Z embedding_output = self.embeddings(
2025-08-14T21:56:27.0812615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward
2025-08-14T21:56:27.0813205Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length)
2025-08-14T21:56:27.0813886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1576, in create_position_ids_from_input_ids
2025-08-14T21:56:27.0814416Z mask = input_ids.ne(padding_idx).int()
2025-08-14T21:56:27.0814572Z
2025-08-14T21:56:27.0814680Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0814901Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0815121Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0815339Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0815551Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0815768Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0815985Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0816201Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0816409Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0816627Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0816850Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0817087Z cudagraph partition due to non gpu ops
2025-08-14T21:56:27.0817359Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:27.0817749Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:27.0818090Z return mod(**inputs)
2025-08-14T21:56:27.0818508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1530, in forward
2025-08-14T21:56:27.0818955Z logits = self.qa_outputs(sequence_output)
2025-08-14T21:56:27.0819104Z
2025-08-14T21:56:27.0819225Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:56:27.0819609Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0819962Z return mod(**inputs) 2025-08-14T21:56:27.0820366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0820942Z outputs = self.roberta( 2025-08-14T21:56:27.0821351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward 2025-08-14T21:56:27.0821784Z embedding_output = self.embeddings( 2025-08-14T21:56:27.0822216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward 2025-08-14T21:56:27.0822799Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-08-14T21:56:27.0823480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1577, in create_position_ids_from_input_ids 2025-08-14T21:56:27.0824124Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-08-14T21:56:27.0824388Z 2025-08-14T21:56:27.0824508Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:27.0824904Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0825266Z return mod(**inputs) 2025-08-14T21:56:27.0825683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0826153Z outputs = self.roberta( 2025-08-14T21:56:27.0826611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 826, in forward 2025-08-14T21:56:27.0827050Z embedding_output = self.embeddings( 2025-08-14T21:56:27.0827484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 89, in forward 2025-08-14T21:56:27.0828037Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-08-14T21:56:27.0828673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1577, in create_position_ids_from_input_ids 2025-08-14T21:56:27.0829304Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-08-14T21:56:27.0829565Z 2025-08-14T21:56:27.0829666Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0829892Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0830121Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0830348Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0830572Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0830787Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0831008Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0831270Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0831667Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0832046Z return mod(**inputs) 2025-08-14T21:56:27.0832463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0832975Z outputs = self.roberta( 2025-08-14T21:56:27.0833387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0833819Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0834249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0834687Z layer_outputs = layer_module( 2025-08-14T21:56:27.0835073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0835472Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0835923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0836383Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0836829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0837241Z return func(*args, **kwargs) 2025-08-14T21:56:27.0837658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0838087Z self_outputs = self.self( 2025-08-14T21:56:27.0838484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0838897Z return func(*args, **kwargs) 2025-08-14T21:56:27.0839319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0839821Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0840022Z 2025-08-14T21:56:27.0840118Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0840350Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0840568Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0840791Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0841032Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0841248Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0841467Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0841685Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0841897Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0842118Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0842338Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0842554Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0842775Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0843026Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0843414Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0843759Z return mod(**inputs) 2025-08-14T21:56:27.0844168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0844597Z outputs = self.roberta( 2025-08-14T21:56:27.0844997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0845424Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0845846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0846269Z layer_outputs = layer_module( 2025-08-14T21:56:27.0846641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0847037Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0847467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0847910Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0848318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0848825Z return func(*args, **kwargs) 2025-08-14T21:56:27.0849294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0849717Z self_outputs = self.self( 2025-08-14T21:56:27.0850119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0850517Z return func(*args, **kwargs) 2025-08-14T21:56:27.0850923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0851474Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0851676Z 2025-08-14T21:56:27.0851780Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0852009Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0852220Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0852444Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0852662Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0852870Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0853089Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0853310Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0853525Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0853761Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0853979Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0854192Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0854400Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0854647Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0855030Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0855362Z return mod(**inputs) 2025-08-14T21:56:27.0855784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0857097Z outputs = self.roberta( 2025-08-14T21:56:27.0857516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0857951Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0858388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0858822Z layer_outputs = layer_module( 2025-08-14T21:56:27.0859181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0859565Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0859993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0860429Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0860898Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0861307Z return func(*args, **kwargs) 2025-08-14T21:56:27.0861724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0862147Z self_outputs = self.self( 2025-08-14T21:56:27.0862538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0862941Z return func(*args, **kwargs) 2025-08-14T21:56:27.0863370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0863857Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0864066Z 2025-08-14T21:56:27.0864154Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0864387Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0864615Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0864833Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0865055Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0865278Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0865497Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0865720Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0865943Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0866157Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0866406Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0866630Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0866886Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0867177Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0867569Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0867922Z return mod(**inputs) 2025-08-14T21:56:27.0868326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0868754Z outputs = self.roberta( 2025-08-14T21:56:27.0869163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0869720Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0870144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0870574Z layer_outputs = layer_module( 2025-08-14T21:56:27.0870958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0871347Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0871788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0872291Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0872710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0873103Z return func(*args, **kwargs) 2025-08-14T21:56:27.0873519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0873941Z self_outputs = self.self( 2025-08-14T21:56:27.0874325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0874724Z return func(*args, **kwargs) 2025-08-14T21:56:27.0875143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0875630Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0875827Z 2025-08-14T21:56:27.0875911Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0876138Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0876362Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0876576Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0876798Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0877024Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0877247Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0877460Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0877681Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0877902Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0878118Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0878342Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0878564Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0878809Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0879202Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0879553Z return mod(**inputs) 2025-08-14T21:56:27.0879958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0880390Z outputs = self.roberta( 2025-08-14T21:56:27.0880821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0881251Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0881663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0882125Z layer_outputs = layer_module( 2025-08-14T21:56:27.0882508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0882887Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0883311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0883749Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0884161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0884564Z return func(*args, **kwargs) 2025-08-14T21:56:27.0885016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0885442Z self_outputs = self.self( 2025-08-14T21:56:27.0885838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0886243Z return func(*args, **kwargs) 2025-08-14T21:56:27.0886658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0887165Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0887361Z 2025-08-14T21:56:27.0887454Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0887677Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0887903Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0888129Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0888349Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0888570Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0888934Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0889163Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0889392Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0889622Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0889842Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0890076Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0890293Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0890544Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0890945Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0891291Z return mod(**inputs) 2025-08-14T21:56:27.0891688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0892099Z outputs = self.roberta( 2025-08-14T21:56:27.0892504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0892916Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0893338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0893745Z layer_outputs = layer_module( 2025-08-14T21:56:27.0894107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0894498Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0894921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0895358Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0895789Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0896189Z return func(*args, **kwargs) 2025-08-14T21:56:27.0896588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0897036Z self_outputs = self.self( 2025-08-14T21:56:27.0897443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0897829Z return func(*args, **kwargs) 2025-08-14T21:56:27.0898233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0898707Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0898895Z 2025-08-14T21:56:27.0898985Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0899199Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0899441Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0899663Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0899878Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0900102Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0900321Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0900631Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0900848Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0901067Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0901285Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0901519Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0901741Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0902002Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0902398Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0902982Z return mod(**inputs) 2025-08-14T21:56:27.0903402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0903835Z outputs = self.roberta( 2025-08-14T21:56:27.0904237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0904667Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0905108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0905527Z layer_outputs = layer_module( 2025-08-14T21:56:27.0905903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0906293Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0906737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0907168Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0907580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0907983Z return func(*args, **kwargs) 2025-08-14T21:56:27.0908388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0908813Z self_outputs = self.self( 2025-08-14T21:56:27.0909206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0909608Z return func(*args, **kwargs) 2025-08-14T21:56:27.0910015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0910500Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0910702Z 2025-08-14T21:56:27.0910787Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0911018Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0911242Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0911466Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0911689Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0911994Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0912260Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0912488Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0912703Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0912929Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0913151Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0913368Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0913592Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0913857Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0914244Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0914622Z return mod(**inputs) 2025-08-14T21:56:27.0915035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0915466Z outputs = self.roberta( 2025-08-14T21:56:27.0915876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0916300Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0916711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0917157Z layer_outputs = layer_module( 2025-08-14T21:56:27.0917515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0917898Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0918321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0918755Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0919168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0919576Z return func(*args, **kwargs) 2025-08-14T21:56:27.0920007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0920427Z self_outputs = self.self( 2025-08-14T21:56:27.0920812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0921211Z return func(*args, **kwargs) 2025-08-14T21:56:27.0921612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0922072Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0922271Z 2025-08-14T21:56:27.0922354Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0922575Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0922786Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0923004Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0923218Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0923424Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0923638Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0923854Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0924066Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0924271Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0924485Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0924702Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0924910Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0925155Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0925540Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0925874Z return mod(**inputs) 2025-08-14T21:56:27.0926285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0926770Z outputs = self.roberta( 2025-08-14T21:56:27.0927184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0927622Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0928075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0928518Z layer_outputs = layer_module( 2025-08-14T21:56:27.0928961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0929387Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0929835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0930284Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0930695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0931096Z return func(*args, **kwargs) 2025-08-14T21:56:27.0931535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0931978Z self_outputs = self.self( 2025-08-14T21:56:27.0932361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0932756Z return func(*args, **kwargs) 2025-08-14T21:56:27.0933174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0933717Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0933917Z 2025-08-14T21:56:27.0934000Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0934222Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0934511Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0934732Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0934942Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0935156Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0935377Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0935584Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0935802Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0936019Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0936228Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0936452Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0936677Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0936919Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0937311Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0937669Z return mod(**inputs) 2025-08-14T21:56:27.0938084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0938503Z outputs = self.roberta( 2025-08-14T21:56:27.0938914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0939343Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0939774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0940213Z layer_outputs = layer_module( 2025-08-14T21:56:27.0940590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0940988Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0941433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0941905Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0942343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0942753Z return func(*args, **kwargs) 2025-08-14T21:56:27.0943182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0943611Z self_outputs = self.self( 2025-08-14T21:56:27.0944005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0944410Z return func(*args, **kwargs) 2025-08-14T21:56:27.0944869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0945369Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0945569Z 2025-08-14T21:56:27.0945661Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0945885Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0946114Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0946340Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0946574Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0946794Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0947016Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0947228Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0947448Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0947670Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0947890Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0948104Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0948326Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0948577Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0948957Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0949312Z return mod(**inputs) 2025-08-14T21:56:27.0949726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0950147Z outputs = self.roberta( 2025-08-14T21:56:27.0950553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0950978Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0951400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0951805Z layer_outputs = layer_module( 2025-08-14T21:56:27.0952170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0952545Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0952962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0953378Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0953777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0954169Z return func(*args, **kwargs) 2025-08-14T21:56:27.0954560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0954970Z self_outputs = self.self( 2025-08-14T21:56:27.0955346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0955734Z return func(*args, **kwargs) 2025-08-14T21:56:27.0956123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0956630Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0956840Z 2025-08-14T21:56:27.0956933Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0957152Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0957375Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0957593Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0957808Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0958018Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0958244Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0958467Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0958680Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0958926Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0959153Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0959376Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0959592Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0959841Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0960225Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0960562Z return mod(**inputs) 2025-08-14T21:56:27.0960961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1516, in forward 2025-08-14T21:56:27.0961449Z outputs = self.roberta( 2025-08-14T21:56:27.0961839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 890, in forward 2025-08-14T21:56:27.0962265Z encoder_outputs = self.encoder( 2025-08-14T21:56:27.0962709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 631, in forward 2025-08-14T21:56:27.0963149Z layer_outputs = layer_module( 2025-08-14T21:56:27.0963518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:27.0963917Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:27.0964365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 539, in forward 2025-08-14T21:56:27.0964803Z self_attention_outputs = self.attention( 2025-08-14T21:56:27.0965216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0965618Z return func(*args, **kwargs) 2025-08-14T21:56:27.0966046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 466, in forward 2025-08-14T21:56:27.0966475Z self_outputs = self.self( 2025-08-14T21:56:27.0966873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-08-14T21:56:27.0967286Z return func(*args, **kwargs) 2025-08-14T21:56:27.0967713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 388, in forward 2025-08-14T21:56:27.0968202Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-08-14T21:56:27.0968406Z 2025-08-14T21:56:27.0968491Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0968807Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0969039Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0969265Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0969489Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0969702Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0969925Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0970155Z cudagraph partition due to non gpu ops 2025-08-14T21:56:27.0970400Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:27.0970801Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0971157Z return mod(**inputs) 2025-08-14T21:56:27.0971627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1548, in forward 2025-08-14T21:56:27.0972093Z start_loss = loss_fct(start_logits, start_positions) 2025-08-14T21:56:27.0972274Z 2025-08-14T21:56:27.0972387Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:27.0972775Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:27.0973129Z return mod(**inputs) 2025-08-14T21:56:27.0973536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/roberta/modeling_roberta.py", line 1549, in forward 2025-08-14T21:56:27.0974005Z end_loss = loss_fct(end_logits, end_positions) 2025-08-14T21:56:27.0974167Z 2025-08-14T21:56:35.5592575Z Compilation time (from dynamo_timed): 18.226882639 2025-08-14T21:56:35.5595355Z pass 2025-08-14T21:56:35.5595768Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:56:35.5596785Z TIMING: _recursive_pre_grad_passes:0.03679 _recursive_joint_graph_passes:0.41935 _recursive_post_grad_passes:0.08957 async_compile.wait:0.56833 code_gen:7.94462 inductor_compile:9.90818 backend_compile:15.3002 gc:0.00051 entire_frame_compile:18.22688 total_wall_time:18.22688 2025-08-14T21:56:35.5600717Z STATS: call_* op count: 305 | FakeTensorMode.__torch_dispatch__:26937 | FakeTensor.__torch_dispatch__:3324 | ProxyTorchDispatchMode.__torch_dispatch__:7255 2025-08-14T21:56:35.5601365Z Dynamo produced 1 graphs covering 305 ops with 0 graph breaks (0 unique) 2025-08-14T21:56:41.3355004Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-08-14T21:56:41.3355983Z from pkg_resources import resource_filename 2025-08-14T21:56:41.9312896Z 2025-08-14T21:56:43.1841102Z loading model: 0it [00:00, ?it/s] 2025-08-14T21:56:43.1841488Z loading model: 0it [00:01, ?it/s] 2025-08-14T21:56:43.1841864Z cpu eval T5ForConditionalGeneration 2025-08-14T21:56:44.6500538Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:56:45.0258601Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:56:45.4685477Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T21:56:58.9392710Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9393249Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9393617Z return mod(**inputs) 2025-08-14T21:56:58.9394031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9394476Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9394870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9395289Z layer_outputs = layer_module( 2025-08-14T21:56:58.9395673Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9396086Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9396494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9396909Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9397320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9397720Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9398459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 546, in forward 2025-08-14T21:56:58.9398948Z position_bias = position_bias + causal_mask 2025-08-14T21:56:58.9399106Z 2025-08-14T21:56:58.9399208Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9399439Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9400213Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9401010Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9401276Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9401718Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9402172Z return mod(**inputs) 2025-08-14T21:56:58.9403358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9403837Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9404307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9405356Z layer_outputs = layer_module( 2025-08-14T21:56:58.9405796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9406341Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9406794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9407340Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9407866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9408290Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9408930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9409407Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9409619Z 2025-08-14T21:56:58.9409734Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9410107Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9410405Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9410927Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9411402Z return mod(**inputs) 2025-08-14T21:56:58.9411801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9412211Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9412621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9413028Z layer_outputs = layer_module( 2025-08-14T21:56:58.9413414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9413905Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9414443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9414987Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9415401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9415806Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9416214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9416812Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9416993Z 2025-08-14T21:56:58.9417166Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9417595Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9418027Z return mod(**inputs) 2025-08-14T21:56:58.9418457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9418913Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9419328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9419740Z layer_outputs = layer_module( 2025-08-14T21:56:58.9420120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9420525Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9420965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9421383Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9422048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9422558Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9422985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9423513Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9423689Z 2025-08-14T21:56:58.9423777Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9424010Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9424235Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9424454Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9424681Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9424908Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9425125Z cudagraph partition due to non gpu ops 
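By this point the same "cudagraph partition due to non gpu ops" message has appeared dozens of times across the RobertaForQuestionAnswering and T5ForConditionalGeneration runs, sometimes bare and sometimes followed by a "Found from :" traceback. A rough way to triage a downloaded copy of this log is to tally the bare repeats and group the tracebacks by their innermost frame. The sketch below is a hypothetical helper, not part of the benchmark harness; it assumes the log was saved as plain text to "job.log" (an assumed path) with each line still carrying the runner's ISO timestamp prefix, as in the output above.

# Rough triage helper (hypothetical, not part of the harness): tally the
# "cudagraph partition due to non gpu ops" messages in a saved copy of this log.
import re
from collections import Counter

TS = re.compile(r"^\d{4}-\d{2}-\d{2}T[\d:.]+Z ?")   # strip the runner timestamp prefix
PARTITION = "cudagraph partition due to non gpu ops"

def tally(path: str) -> Counter:
    counts: Counter = Counter()
    in_trace = False      # True while reading the frames after "Found from :"
    last_frame = None     # innermost 'File "..."' line seen so far
    with open(path, encoding="utf-8") as fh:
        for raw in fh:
            line = TS.sub("", raw).rstrip("\n")
            if PARTITION in line:
                if in_trace:                          # close a dangling trace
                    counts[last_frame or "<no frame>"] += 1
                if "Found from" in line:              # a traceback follows
                    in_trace, last_frame = True, None
                else:                                 # bare repeat of the message
                    counts["<no trace attached>"] += 1
                    in_trace = False
            elif in_trace:
                if line.strip().startswith('File "'):
                    last_frame = line.strip()         # keep only the innermost frame
                elif not line.strip():                # blank line ends the trace
                    counts[last_frame or "<no frame>"] += 1
                    in_trace = False
    if in_trace:                                      # trace still open at EOF
        counts[last_frame or "<no frame>"] += 1
    return counts

if __name__ == "__main__":
    for frame, n in tally("job.log").most_common(10):
        print(f"{n:6d}  {frame}")

Run against a full shard log, this surfaces which source lines (e.g. the scaled_dot_product_attention call in modeling_roberta.py or the matmul/softmax lines in modeling_t5.py) account for most of the partition messages.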
2025-08-14T21:56:58.9425352Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9425617Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9426018Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9426536Z return mod(**inputs) 2025-08-14T21:56:58.9426917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:56:58.9427323Z encoder_outputs = self.encoder( 2025-08-14T21:56:58.9427708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9428110Z layer_outputs = layer_module( 2025-08-14T21:56:58.9428493Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9428899Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9429298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9429711Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9430106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9430496Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9430886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9431329Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9431523Z 2025-08-14T21:56:58.9431640Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9432182Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9432544Z return mod(**inputs) 2025-08-14T21:56:58.9432933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:56:58.9433331Z encoder_outputs = self.encoder( 2025-08-14T21:56:58.9433717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9434243Z layer_outputs = layer_module( 2025-08-14T21:56:58.9434607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9434982Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9435374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9435776Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9436326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9436766Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9437163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:56:58.9437648Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:56:58.9437873Z 2025-08-14T21:56:58.9437993Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9438369Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9438746Z return mod(**inputs) 2025-08-14T21:56:58.9439132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:56:58.9439545Z encoder_outputs = self.encoder( 2025-08-14T21:56:58.9439941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9440345Z layer_outputs = layer_module( 2025-08-14T21:56:58.9440789Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9441338Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9441746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9442159Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9442644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9443129Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9443528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9443964Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9444144Z 2025-08-14T21:56:58.9444275Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9444793Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9445176Z return mod(**inputs) 2025-08-14T21:56:58.9445682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:56:58.9446313Z encoder_outputs = self.encoder( 2025-08-14T21:56:58.9446858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9447262Z layer_outputs = layer_module( 2025-08-14T21:56:58.9447640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9448087Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9448554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9449294Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9449709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9450288Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9450827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9451265Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9451452Z 2025-08-14T21:56:58.9451546Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9451784Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9452019Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9452240Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9452462Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9452687Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9452906Z cudagraph partition due to non gpu ops 
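Besides the partition messages, each model prints a one-line compile summary; the RobertaForQuestionAnswering block above ends with a "TIMING:" breakdown whose phases (pre/joint/post grad passes, code_gen, inductor_compile, backend_compile, entire_frame_compile, total_wall_time) are all in seconds. The sketch below splits such a line into per-phase shares of the wall time; the line is hard-coded from the log above purely for illustration, and note that the phases are nested, so their shares overlap rather than summing to 100%.

# Parse a benchmark "TIMING:" summary line (copied from the log above) into
# per-phase seconds and print each phase's share of the total wall time.
import re

timing_line = (
    "TIMING: _recursive_pre_grad_passes:0.03679 _recursive_joint_graph_passes:0.41935 "
    "_recursive_post_grad_passes:0.08957 async_compile.wait:0.56833 code_gen:7.94462 "
    "inductor_compile:9.90818 backend_compile:15.3002 gc:0.00051 "
    "entire_frame_compile:18.22688 total_wall_time:18.22688"
)

# Each entry is '<phase>:<seconds>'; phase names may contain dots and underscores.
phases = {name: float(sec) for name, sec in re.findall(r"([\w.]+):([\d.]+)", timing_line)}

total = phases.pop("total_wall_time")
for name, sec in sorted(phases.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:<32} {sec:8.3f}s  {100 * sec / total:5.1f}%")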
2025-08-14T21:56:58.9454519Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9454926Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9455315Z return mod(**inputs)
2025-08-14T21:56:58.9455786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward
2025-08-14T21:56:58.9456283Z encoder_outputs = self.encoder(
2025-08-14T21:56:58.9456771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9457245Z layer_outputs = layer_module(
2025-08-14T21:56:58.9457623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9458030Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9458439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9458870Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9459269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9459687Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9460097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward
2025-08-14T21:56:58.9460586Z scores = torch.matmul(query_states, key_states.transpose(3, 2))
2025-08-14T21:56:58.9460889Z 
2025-08-14T21:56:58.9461025Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9461432Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9461777Z return mod(**inputs)
2025-08-14T21:56:58.9462135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward
2025-08-14T21:56:58.9462528Z encoder_outputs = self.encoder(
2025-08-14T21:56:58.9462910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9463300Z layer_outputs = layer_module(
2025-08-14T21:56:58.9463660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9464042Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9464434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9464823Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9465213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9465612Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9465998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward
2025-08-14T21:56:58.9466528Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores)
2025-08-14T21:56:58.9466762Z 
2025-08-14T21:56:58.9467326Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9467717Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9468065Z return mod(**inputs)
2025-08-14T21:56:58.9468464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward
2025-08-14T21:56:58.9468854Z encoder_outputs = self.encoder(
2025-08-14T21:56:58.9469238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9469630Z layer_outputs = layer_module(
2025-08-14T21:56:58.9469995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9470370Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9470785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9471177Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9471560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9471967Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9472361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward
2025-08-14T21:56:58.9472782Z attn_output = torch.matmul(attn_weights, value_states)
2025-08-14T21:56:58.9472959Z 
2025-08-14T21:56:58.9473075Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9473467Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9473812Z return mod(**inputs)
2025-08-14T21:56:58.9474170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward
2025-08-14T21:56:58.9474567Z encoder_outputs = self.encoder(
2025-08-14T21:56:58.9474955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9475347Z layer_outputs = layer_module(
2025-08-14T21:56:58.9475709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9476089Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9476481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9476882Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9477267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9477677Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9478068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward
2025-08-14T21:56:58.9478486Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:56:58.9478661Z 
2025-08-14T21:56:58.9531388Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9531763Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9532113Z return mod(**inputs)
2025-08-14T21:56:58.9532482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward
2025-08-14T21:56:58.9532875Z encoder_outputs = self.encoder(
2025-08-14T21:56:58.9533255Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9533646Z layer_outputs = layer_module(
2025-08-14T21:56:58.9534012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9534393Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9534774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9535168Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9535561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 609, in forward
2025-08-14T21:56:58.9535997Z hidden_states = hidden_states + self.dropout(attention_output[0])
2025-08-14T21:56:58.9536199Z 
2025-08-14T21:56:58.9593506Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9593885Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9594223Z return mod(**inputs)
2025-08-14T21:56:58.9594599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9594994Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9595380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9595790Z layer_outputs = layer_module(
2025-08-14T21:56:58.9596174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9596557Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9596945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward
2025-08-14T21:56:58.9597348Z cross_attention_outputs = self.layer[1](
2025-08-14T21:56:58.9597741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward
2025-08-14T21:56:58.9598148Z attention_output = self.EncDecAttention(
2025-08-14T21:56:58.9598548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward
2025-08-14T21:56:58.9598980Z scores = torch.matmul(query_states, key_states.transpose(3, 2))
2025-08-14T21:56:58.9599166Z 
2025-08-14T21:56:58.9599725Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9600102Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9600457Z return mod(**inputs)
2025-08-14T21:56:58.9600815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9601204Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9601583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9601972Z layer_outputs = layer_module(
2025-08-14T21:56:58.9602337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9602926Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9603327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward
2025-08-14T21:56:58.9603718Z cross_attention_outputs = self.layer[1](
2025-08-14T21:56:58.9604116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward
2025-08-14T21:56:58.9604520Z attention_output = self.EncDecAttention(
2025-08-14T21:56:58.9604915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward
2025-08-14T21:56:58.9605374Z attn_output = torch.matmul(attn_weights, value_states)
2025-08-14T21:56:58.9605553Z 
2025-08-14T21:56:58.9605664Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9606055Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9606403Z return mod(**inputs)
2025-08-14T21:56:58.9606782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9607197Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9607579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9607966Z layer_outputs = layer_module(
2025-08-14T21:56:58.9608331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9608712Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9609165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward
2025-08-14T21:56:58.9609578Z cross_attention_outputs = self.layer[1](
2025-08-14T21:56:58.9609982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward
2025-08-14T21:56:58.9610396Z attention_output = self.EncDecAttention(
2025-08-14T21:56:58.9610894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward
2025-08-14T21:56:58.9611320Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:56:58.9611493Z 
2025-08-14T21:56:58.9613832Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9614207Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9614538Z return mod(**inputs)
2025-08-14T21:56:58.9614885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9615297Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9615652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9616018Z layer_outputs = layer_module(
2025-08-14T21:56:58.9616365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9616733Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9617132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9617505Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9617876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9618245Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9618616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward
2025-08-14T21:56:58.9619038Z scores = torch.matmul(query_states, key_states.transpose(3, 2))
2025-08-14T21:56:58.9619215Z 
2025-08-14T21:56:58.9619735Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9620089Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9620402Z return mod(**inputs)
2025-08-14T21:56:58.9620749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9621126Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9621534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9621905Z layer_outputs = layer_module(
2025-08-14T21:56:58.9622246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9622609Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9622970Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9623350Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9623720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9624098Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9624510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward
2025-08-14T21:56:58.9624913Z attn_output = torch.matmul(attn_weights, value_states)
2025-08-14T21:56:58.9625072Z 
2025-08-14T21:56:58.9625186Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9625555Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9625898Z return mod(**inputs)
2025-08-14T21:56:58.9626261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9626662Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9627058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9627454Z layer_outputs = layer_module(
2025-08-14T21:56:58.9627798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9628159Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9628524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9628923Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9629289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
2025-08-14T21:56:58.9629691Z attention_output = self.SelfAttention(
2025-08-14T21:56:58.9630092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward
2025-08-14T21:56:58.9630527Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-08-14T21:56:58.9630694Z 
2025-08-14T21:56:58.9669706Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:56:58.9670080Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9670430Z return mod(**inputs)
2025-08-14T21:56:58.9670796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
2025-08-14T21:56:58.9671183Z decoder_outputs = self.decoder(
2025-08-14T21:56:58.9671580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
2025-08-14T21:56:58.9671969Z layer_outputs = layer_module(
2025-08-14T21:56:58.9672336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:56:58.9672710Z return super().__call__(*args, **kwargs)
2025-08-14T21:56:58.9673082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
2025-08-14T21:56:58.9673459Z self_attention_outputs = self.layer[0](
2025-08-14T21:56:58.9673831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 609, in forward
2025-08-14T21:56:58.9674253Z hidden_states = hidden_states + self.dropout(attention_output[0])
2025-08-14T21:56:58.9674446Z 
Found from : 2025-08-14T21:56:58.9719695Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9720041Z return mod(**inputs) 2025-08-14T21:56:58.9720409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9720795Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9721179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9721570Z layer_outputs = layer_module( 2025-08-14T21:56:58.9721977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9722356Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9722749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9723160Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9723543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9723946Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9724358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9724791Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9724969Z 2025-08-14T21:56:58.9725056Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9725289Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9725520Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9725739Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9725964Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9726188Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9726462Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9726850Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9727215Z return mod(**inputs) 2025-08-14T21:56:58.9727590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9727985Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9728373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9728860Z layer_outputs = layer_module( 2025-08-14T21:56:58.9729254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9729647Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9730053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9730465Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9730868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9731292Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9731689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9732145Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9732337Z 2025-08-14T21:56:58.9732427Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9732652Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9732904Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9733279Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9733622Z return mod(**inputs) 2025-08-14T21:56:58.9734002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9734402Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9734785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9735183Z layer_outputs = layer_module( 2025-08-14T21:56:58.9735550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9735938Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9736325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9736769Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9737182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9737592Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9738009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9738602Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9738784Z 2025-08-14T21:56:58.9738911Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9739322Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9739683Z return mod(**inputs) 2025-08-14T21:56:58.9740105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9740512Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9740917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9741319Z layer_outputs = layer_module( 2025-08-14T21:56:58.9741720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9742103Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9742515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9742933Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9743332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9743752Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9744157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9744598Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9744772Z 2025-08-14T21:56:58.9744860Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9745093Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9745350Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9745742Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9746085Z return mod(**inputs) 2025-08-14T21:56:58.9746460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9746868Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9747250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9747656Z layer_outputs = layer_module( 2025-08-14T21:56:58.9748031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9748420Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9748809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9749214Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9749613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 647, in forward 2025-08-14T21:56:58.9750062Z layer_output = hidden_states + self.dropout(attention_output[0]) 2025-08-14T21:56:58.9750266Z 2025-08-14T21:56:58.9750360Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9750586Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9750805Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9751013Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9751227Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9751466Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9751691Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9751911Z cudagraph partition due to non gpu 
ops 2025-08-14T21:56:58.9752158Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9752532Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9752874Z return mod(**inputs) 2025-08-14T21:56:58.9753247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9753620Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9754021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9754412Z layer_outputs = layer_module( 2025-08-14T21:56:58.9754771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9755143Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9755539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9755933Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9757072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9757442Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9757813Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9758233Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9758410Z 2025-08-14T21:56:58.9758498Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9758701Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9758939Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9759307Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9759644Z return mod(**inputs) 2025-08-14T21:56:58.9760011Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9760405Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9760778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9761173Z layer_outputs = layer_module( 2025-08-14T21:56:58.9761517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9761886Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9762264Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9762663Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9763061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9763460Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9763847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9764280Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9764441Z 2025-08-14T21:56:58.9764552Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9764901Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9765245Z return mod(**inputs) 2025-08-14T21:56:58.9765606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9766004Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9766423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9766814Z layer_outputs = layer_module( 2025-08-14T21:56:58.9767178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9767553Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9767941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9768340Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9768808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9769223Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9769618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9770058Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9770230Z 2025-08-14T21:56:58.9770324Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9770543Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9770770Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9771014Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9771230Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9771447Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9771696Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9772071Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9772415Z return mod(**inputs) 2025-08-14T21:56:58.9772782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9773180Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9773558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9773946Z layer_outputs = layer_module( 2025-08-14T21:56:58.9774312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9774686Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9775073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9775465Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9775855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9776247Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9776640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9777081Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9777273Z 2025-08-14T21:56:58.9777364Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9777580Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9777833Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9778213Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9778548Z return mod(**inputs) 2025-08-14T21:56:58.9778915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9779297Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9779642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9779996Z layer_outputs = layer_module( 2025-08-14T21:56:58.9780332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9780711Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9781095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9781494Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9781886Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9781976Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9782226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9782357Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9782362Z 2025-08-14T21:56:58.9782473Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9782694Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9782768Z return mod(**inputs) 2025-08-14T21:56:58.9783019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9783106Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9783377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9783462Z layer_outputs = layer_module( 2025-08-14T21:56:58.9783697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9783776Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9784016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9784098Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9784335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9784419Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9784650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9784765Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9784769Z 2025-08-14T21:56:58.9784846Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9784923Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9785006Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9785079Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9785160Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9785235Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9785338Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9785547Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9785613Z return mod(**inputs) 2025-08-14T21:56:58.9785850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9785931Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9786166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9786244Z layer_outputs = layer_module( 2025-08-14T21:56:58.9786463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9786540Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9786778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 731, in forward 2025-08-14T21:56:58.9786871Z hidden_states = self.layer[-1](hidden_states) 2025-08-14T21:56:58.9787102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 343, in forward 2025-08-14T21:56:58.9787277Z hidden_states = hidden_states + self.dropout(forwarded_states) 2025-08-14T21:56:58.9787281Z 2025-08-14T21:56:58.9787359Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9787440Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9787517Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9787592Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9787704Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9787906Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9787972Z return mod(**inputs) 2025-08-14T21:56:58.9788233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9788308Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9788546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9788620Z layer_outputs = layer_module( 2025-08-14T21:56:58.9788839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9788924Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9789174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9789253Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9789490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9789571Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9789808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9789933Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9789938Z 2025-08-14T21:56:58.9790014Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9790097Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9790199Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9790395Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9790469Z return mod(**inputs) 2025-08-14T21:56:58.9790703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9790782Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9791016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9791088Z layer_outputs = layer_module( 2025-08-14T21:56:58.9791314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9791391Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9791635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9791722Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9791974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9792068Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9792327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9792439Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9792443Z 2025-08-14T21:56:58.9792559Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9792768Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9792844Z return mod(**inputs) 2025-08-14T21:56:58.9793170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9793253Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9793508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9793585Z layer_outputs = layer_module( 2025-08-14T21:56:58.9793821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9793907Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9794151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:56:58.9794239Z self_attention_outputs = self.layer[0]( 2025-08-14T21:56:58.9794476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:56:58.9794556Z attention_output = self.SelfAttention( 2025-08-14T21:56:58.9794790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9794905Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9794925Z 2025-08-14T21:56:58.9795016Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795098Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795179Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795266Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795346Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795426Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9795547Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9795760Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9795830Z return mod(**inputs) 2025-08-14T21:56:58.9796085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9796167Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9796423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9796501Z layer_outputs = layer_module( 2025-08-14T21:56:58.9796740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9796834Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9797089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9797185Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9797429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9797517Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9797771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:56:58.9797906Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:56:58.9797911Z 2025-08-14T21:56:58.9797992Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9798081Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9798189Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:56:58.9798416Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9798485Z return mod(**inputs) 2025-08-14T21:56:58.9798734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9798818Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9799073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9799170Z layer_outputs = layer_module( 2025-08-14T21:56:58.9799433Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9799518Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9799775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9799859Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9800107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9800203Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9800508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:56:58.9800631Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:56:58.9800637Z 2025-08-14T21:56:58.9800744Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:56:58.9800964Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:56:58.9801042Z return mod(**inputs) 2025-08-14T21:56:58.9801305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:56:58.9801383Z decoder_outputs = self.decoder( 2025-08-14T21:56:58.9801682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:56:58.9801757Z layer_outputs = layer_module( 2025-08-14T21:56:58.9802003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:56:58.9802085Z return super().__call__(*args, **kwargs) 2025-08-14T21:56:58.9802338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:56:58.9802435Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:56:58.9802824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:56:58.9802921Z attention_output = self.EncDecAttention( 2025-08-14T21:56:58.9803197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:56:58.9803310Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:56:58.9803314Z 2025-08-14T21:56:58.9803404Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803486Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803569Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803656Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803735Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803813Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803901Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9803983Z cudagraph partition due to non gpu ops 2025-08-14T21:56:58.9804101Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:56:58.9804308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:56:58.9804378Z return mod(**inputs)
2025-08-14T21:56:58.9804653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1798, in forward
2025-08-14T21:56:58.9804803Z loss = loss_fct(lm_logits.view(-1, lm_logits.size(-1)), labels.view(-1))
2025-08-14T21:56:58.9804807Z
2025-08-14T21:57:09.5881045Z Compilation time (from dynamo_timed): 22.584980588
2025-08-14T21:57:09.6000692Z pass
2025-08-14T21:57:09.6001104Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:09.6006638Z TIMING: _recursive_pre_grad_passes:0.06388 _recursive_joint_graph_passes:0.62725 _recursive_post_grad_passes:0.23136 async_compile.wait:0.8222 code_gen:9.55208 inductor_compile:11.63688 backend_compile:19.00593 gc:0.00019 entire_frame_compile:22.58498 total_wall_time:22.58498
2025-08-14T21:57:09.6007748Z STATS: call_* op count: 824 | FakeTensorMode.__torch_dispatch__:38726 | FakeTensor.__torch_dispatch__:5207 | ProxyTorchDispatchMode.__torch_dispatch__:10688
2025-08-14T21:57:09.6008341Z Dynamo produced 1 graphs covering 824 ops with 0 graph breaks (0 unique)
2025-08-14T21:57:15.2276084Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:57:15.2277092Z from pkg_resources import resource_filename
2025-08-14T21:57:15.8310673Z
2025-08-14T21:57:17.0768104Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:57:17.0768469Z loading model: 0it [00:01, ?it/s]
2025-08-14T21:57:17.0768876Z cpu eval T5Small
2025-08-14T21:57:18.3036474Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:18.6668754Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:19.0486118Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:32.6056317Z cudagraph partition due to non gpu ops.
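[Editor's note] The TIMING entry above is a flat list of phase:seconds pairs. A small, hypothetical helper (not part of the benchmark harness) can turn that line into a dict, which makes it easier to compare phases such as inductor_compile against total_wall_time across runs; the partition traces from the log resume after this sketch.

```python
# Hypothetical helper: parse the "TIMING: key:value key:value ..." log line above
# into a dict of floats. The line literal is copied verbatim from the log.
timing_line = (
    "TIMING: _recursive_pre_grad_passes:0.06388 _recursive_joint_graph_passes:0.62725 "
    "_recursive_post_grad_passes:0.23136 async_compile.wait:0.8222 code_gen:9.55208 "
    "inductor_compile:11.63688 backend_compile:19.00593 gc:0.00019 "
    "entire_frame_compile:22.58498 total_wall_time:22.58498"
)

timings = {}
for token in timing_line.removeprefix("TIMING:").split():
    key, _, value = token.rpartition(":")   # rpartition keeps keys like "async_compile.wait" intact
    timings[key] = float(value)

print(timings["inductor_compile"], timings["total_wall_time"])  # 11.63688 22.58498
```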
Found from : 2025-08-14T21:57:32.6058903Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6059308Z return mod(**inputs) 2025-08-14T21:57:32.6059773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6060189Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6060609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6061015Z layer_outputs = layer_module( 2025-08-14T21:57:32.6061378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6065315Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6065824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6066246Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6066663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6067131Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6067514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 546, in forward 2025-08-14T21:57:32.6067900Z position_bias = position_bias + causal_mask 2025-08-14T21:57:32.6068069Z 2025-08-14T21:57:32.6068158Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6068382Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6068595Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6068819Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6069119Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6069541Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6069985Z return mod(**inputs) 2025-08-14T21:57:32.6070364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6070789Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6071160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6071534Z layer_outputs = layer_module( 2025-08-14T21:57:32.6072236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6072636Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6073027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6073430Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6073829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6074235Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6074647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6075079Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6075259Z 2025-08-14T21:57:32.6075351Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6075574Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6075844Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6076243Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6076643Z return mod(**inputs) 2025-08-14T21:57:32.6077017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6077458Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6077845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6078240Z layer_outputs = layer_module( 2025-08-14T21:57:32.6078601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6078966Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6079358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6079768Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6080167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6080570Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6081004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6081425Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6081594Z 2025-08-14T21:57:32.6081711Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6082069Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6082421Z return mod(**inputs) 2025-08-14T21:57:32.6082793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6083181Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6083574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6083970Z layer_outputs = layer_module( 2025-08-14T21:57:32.6084374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6084756Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6085146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6085542Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6085927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6086319Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6086706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6087183Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6087364Z 2025-08-14T21:57:32.6087455Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6087695Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6087929Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6088163Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6088387Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6088618Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6089174Z cudagraph partition due to non gpu ops 
2025-08-14T21:57:32.6089392Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6089687Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6090089Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6090451Z return mod(**inputs) 2025-08-14T21:57:32.6090841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6091234Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6091624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6092082Z layer_outputs = layer_module( 2025-08-14T21:57:32.6092445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6092824Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6093208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6093604Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6093993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6094397Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6094788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6095239Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6095433Z 2025-08-14T21:57:32.6095550Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6095930Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6096265Z return mod(**inputs) 2025-08-14T21:57:32.6096632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6097024Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6097400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6097789Z layer_outputs = layer_module( 2025-08-14T21:57:32.6098156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6098536Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6098918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6099311Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6099695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6100089Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6100492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6100973Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6101193Z 2025-08-14T21:57:32.6101306Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6101721Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6102065Z return mod(**inputs) 2025-08-14T21:57:32.6102429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6103066Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6103449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6103839Z layer_outputs = layer_module( 2025-08-14T21:57:32.6104205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6104665Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6105067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6105464Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6105863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6106262Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6106687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6107095Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6107265Z 2025-08-14T21:57:32.6107372Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6107753Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6108105Z return mod(**inputs) 2025-08-14T21:57:32.6108469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6108852Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6109242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6109632Z layer_outputs = layer_module( 2025-08-14T21:57:32.6109987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6110376Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6110770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6111161Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6111527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6111904Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6112274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6112679Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6112840Z 2025-08-14T21:57:32.6112922Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6113137Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6113347Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6113551Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6113761Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6113967Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6114162Z cudagraph partition due to non gpu ops 
2025-08-14T21:57:32.6114367Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6114573Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6114791Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6114999Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6115213Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6115460Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6115832Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6116250Z return mod(**inputs) 2025-08-14T21:57:32.6116622Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6116986Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6117352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6117715Z layer_outputs = layer_module( 2025-08-14T21:57:32.6118057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6118428Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6118800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6119175Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6119545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6119922Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6120293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6120735Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6120915Z 2025-08-14T21:57:32.6121019Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6121381Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6121706Z return mod(**inputs) 2025-08-14T21:57:32.6122054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6122418Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6122782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6123156Z layer_outputs = layer_module( 2025-08-14T21:57:32.6123492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6123855Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6124226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6124602Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6124986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6125383Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6125775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6126250Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6126475Z 2025-08-14T21:57:32.6126560Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6126784Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6127032Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6127408Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6127751Z return mod(**inputs) 2025-08-14T21:57:32.6128144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6128555Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6129025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6129429Z layer_outputs = layer_module( 2025-08-14T21:57:32.6129798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6130900Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6131324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6131731Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6132128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6132526Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6132924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6133378Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6133554Z 2025-08-14T21:57:32.6133664Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6134045Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6134391Z return mod(**inputs) 2025-08-14T21:57:32.6134759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6135140Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6135548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6135954Z layer_outputs = layer_module( 2025-08-14T21:57:32.6136312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6136692Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6137087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6137481Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6137865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6138269Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6138658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6139085Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6139257Z 2025-08-14T21:57:32.6139343Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6139569Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6139795Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6140007Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6140225Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6140447Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6140656Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6140877Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6141094Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6141310Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6141554Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6141934Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6142278Z return mod(**inputs) 2025-08-14T21:57:32.6142654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6143051Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6143438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6143828Z layer_outputs = layer_module( 2025-08-14T21:57:32.6144185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6144567Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6144957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6145489Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6145896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6146300Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6146698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6147113Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6147305Z 2025-08-14T21:57:32.6147411Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6147816Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6148137Z return mod(**inputs) 2025-08-14T21:57:32.6148487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6148868Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6149273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6149643Z layer_outputs = layer_module( 2025-08-14T21:57:32.6150048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6150409Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6150771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6151143Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6151513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6151891Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6152254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6152706Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6152921Z 2025-08-14T21:57:32.6153002Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6153217Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6153446Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6153805Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6154133Z return mod(**inputs) 2025-08-14T21:57:32.6154477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6154851Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6155217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6155587Z layer_outputs = layer_module( 2025-08-14T21:57:32.6155926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6156289Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6156662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6157030Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6157404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6157781Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6158155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6158558Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6158733Z 2025-08-14T21:57:32.6158849Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6159251Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6159560Z return mod(**inputs) 2025-08-14T21:57:32.6159899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6160261Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6160614Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6160968Z layer_outputs = layer_module( 2025-08-14T21:57:32.6161320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6161675Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6162032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6162390Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6162751Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6163126Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6163519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6163944Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6164121Z 2025-08-14T21:57:32.6164205Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6164429Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6164644Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6164863Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6165079Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6165285Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6165499Z cudagraph partition due to non gpu ops 
2025-08-14T21:57:32.6165714Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6165924Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6166138Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6166379Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6166757Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6167087Z return mod(**inputs) 2025-08-14T21:57:32.6167455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6170940Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6171369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6171750Z layer_outputs = layer_module( 2025-08-14T21:57:32.6172104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6172510Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6172910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6173309Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6173722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6174115Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6174509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6174990Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6175182Z 2025-08-14T21:57:32.6175296Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6175686Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6176044Z return mod(**inputs) 2025-08-14T21:57:32.6176415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6176788Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6177175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6177573Z layer_outputs = layer_module( 2025-08-14T21:57:32.6177930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6178315Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6178705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6179097Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6179479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6179874Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6180263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6180732Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6180973Z 2025-08-14T21:57:32.6181059Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6181287Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6181538Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6181912Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6182255Z return mod(**inputs) 2025-08-14T21:57:32.6182625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6183021Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6183398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6183781Z layer_outputs = layer_module( 2025-08-14T21:57:32.6184149Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6184523Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6184912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6185305Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6185762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6186151Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6186539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6186965Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6187134Z 2025-08-14T21:57:32.6187243Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6187621Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6187991Z return mod(**inputs) 2025-08-14T21:57:32.6188355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6188737Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6189124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6189515Z layer_outputs = layer_module( 2025-08-14T21:57:32.6189874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6190257Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6190690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6191199Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6191578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6192020Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6192397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6192800Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6192964Z 2025-08-14T21:57:32.6193044Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6193258Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6193493Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6193843Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6194169Z return mod(**inputs) 2025-08-14T21:57:32.6194522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6194895Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6195281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6195679Z layer_outputs = layer_module( 2025-08-14T21:57:32.6196048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6196424Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6196829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6197198Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6197566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 609, in forward 2025-08-14T21:57:32.6197986Z hidden_states = hidden_states + self.dropout(attention_output[0]) 2025-08-14T21:57:32.6198178Z 2025-08-14T21:57:32.6198257Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6198469Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6198679Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6198881Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6199081Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6199282Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6199475Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6199712Z cudagraph partition due to non gpu ops 
2025-08-14T21:57:32.6199946Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6200288Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6200604Z return mod(**inputs) 2025-08-14T21:57:32.6200945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6201306Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6201665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6202032Z layer_outputs = layer_module( 2025-08-14T21:57:32.6202378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6202926Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6203306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6203701Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6204089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6204487Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6204987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6205430Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6205622Z 2025-08-14T21:57:32.6205733Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6206127Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6206502Z return mod(**inputs) 2025-08-14T21:57:32.6206888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6207293Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6207694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6208108Z layer_outputs = layer_module( 2025-08-14T21:57:32.6208470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6208916Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6209341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6209809Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6210193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6210594Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6210990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6211462Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6211681Z 2025-08-14T21:57:32.6211767Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6211995Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6212260Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6212638Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6212978Z return mod(**inputs) 2025-08-14T21:57:32.6213508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6213899Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6214277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6214704Z layer_outputs = layer_module( 2025-08-14T21:57:32.6215074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6215450Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6215848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6216248Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6216637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6217028Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6217418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6217845Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6218013Z 2025-08-14T21:57:32.6218131Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6218506Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6218855Z return mod(**inputs) 2025-08-14T21:57:32.6219219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6219621Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6220021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6220409Z layer_outputs = layer_module( 2025-08-14T21:57:32.6220788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6221141Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6221508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6221883Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6222245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6222625Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6222990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6223391Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6223551Z 2025-08-14T21:57:32.6223631Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6223860Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6224070Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6224267Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6224469Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6224675Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6224874Z cudagraph partition due to non gpu ops 
2025-08-14T21:57:32.6225078Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6225282Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6225487Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6225711Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6226070Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6226398Z return mod(**inputs) 2025-08-14T21:57:32.6226741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6227118Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6227478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6227855Z layer_outputs = layer_module( 2025-08-14T21:57:32.6228215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6228576Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6228945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6229315Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6229685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6230060Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6230429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6230841Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6231028Z 2025-08-14T21:57:32.6231135Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6231519Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6231865Z return mod(**inputs) 2025-08-14T21:57:32.6232226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6232623Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6233014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6233416Z layer_outputs = layer_module( 2025-08-14T21:57:32.6233767Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6234128Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6234498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6234865Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6235236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6235612Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6235995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 558, in forward 2025-08-14T21:57:32.6236438Z attn_weights = nn.functional.softmax(scores.float(), dim=-1).type_as(scores) 2025-08-14T21:57:32.6236653Z 2025-08-14T21:57:32.6236733Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6236945Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6237172Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6237553Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6237876Z return mod(**inputs) 2025-08-14T21:57:32.6238216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6238589Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6238954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6239329Z layer_outputs = layer_module( 2025-08-14T21:57:32.6239658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6240015Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6240377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6240732Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6241101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6241499Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6241892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6242336Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6242517Z 2025-08-14T21:57:32.6242627Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6243014Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6243358Z return mod(**inputs) 2025-08-14T21:57:32.6243719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6244114Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6244501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6244887Z layer_outputs = layer_module( 2025-08-14T21:57:32.6245255Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6245639Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6246030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6246418Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6246807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6247228Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6247631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6248059Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6248239Z 2025-08-14T21:57:32.6248324Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6248549Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6248882Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6249284Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6249629Z return mod(**inputs) 2025-08-14T21:57:32.6249994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1725, in forward 2025-08-14T21:57:32.6250359Z encoder_outputs = self.encoder( 2025-08-14T21:57:32.6250712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6251075Z layer_outputs = layer_module( 2025-08-14T21:57:32.6251431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6251843Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6252235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6252680Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6253052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 609, in forward 2025-08-14T21:57:32.6253477Z hidden_states = hidden_states + self.dropout(attention_output[0]) 2025-08-14T21:57:32.6253657Z 2025-08-14T21:57:32.6253748Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6253962Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6254183Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6254401Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6254608Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6254824Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6255070Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6255450Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6255785Z return mod(**inputs) 2025-08-14T21:57:32.6256155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6256583Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6256964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6257354Z layer_outputs = layer_module( 2025-08-14T21:57:32.6257719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6258104Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6258491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6258894Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6259289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6259682Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6260079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6260524Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6260713Z 2025-08-14T21:57:32.6260804Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6261021Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6261272Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6261700Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6262043Z return mod(**inputs) 2025-08-14T21:57:32.6262403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6262797Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6263186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6263547Z layer_outputs = layer_module( 2025-08-14T21:57:32.6263900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6264261Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6264631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6265008Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6265369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6265743Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6266110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6266664Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6266839Z 2025-08-14T21:57:32.6266949Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6267296Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6267599Z return mod(**inputs) 2025-08-14T21:57:32.6267930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6268288Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6268633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6269000Z layer_outputs = layer_module( 2025-08-14T21:57:32.6269338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6269692Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6270053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6270463Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6270826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6271199Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6271562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6271965Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6272125Z 2025-08-14T21:57:32.6272212Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6272419Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6272630Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6272837Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6273036Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6273242Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6273447Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6273654Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6273853Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6274060Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6274294Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6274646Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6275000Z return mod(**inputs) 2025-08-14T21:57:32.6275378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6275730Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6276080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6276432Z layer_outputs = layer_module( 2025-08-14T21:57:32.6276778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6277125Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6277488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6277858Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6278220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6278592Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6278944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6279356Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6279525Z 2025-08-14T21:57:32.6279600Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6279812Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6280049Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6280412Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6280726Z return mod(**inputs) 2025-08-14T21:57:32.6281076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6281451Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6281807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6282166Z layer_outputs = layer_module( 2025-08-14T21:57:32.6282501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6282853Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6283203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6283577Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6283963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6284337Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6284724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6285151Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6285320Z 2025-08-14T21:57:32.6285437Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6285849Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6286215Z return mod(**inputs) 2025-08-14T21:57:32.6286601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6287009Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6287401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6287796Z layer_outputs = layer_module( 2025-08-14T21:57:32.6288162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6288552Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6289092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6289516Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6289930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6290324Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6290714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6291111Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6291268Z 2025-08-14T21:57:32.6291348Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6291560Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6291769Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6291990Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6292200Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6292419Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6292663Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6293040Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6293411Z return mod(**inputs) 2025-08-14T21:57:32.6293779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6294170Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6294550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6294942Z layer_outputs = layer_module( 2025-08-14T21:57:32.6295310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6295695Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6296073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6296452Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6296826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6297224Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6297620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6298064Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6298276Z 2025-08-14T21:57:32.6298362Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6298587Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6298835Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6299215Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6299548Z return mod(**inputs) 2025-08-14T21:57:32.6299913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6300306Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6300686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6301077Z layer_outputs = layer_module( 2025-08-14T21:57:32.6301442Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6301819Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6302202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6302596Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6303179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6303657Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6304052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6304475Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6304641Z 2025-08-14T21:57:32.6304757Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6305132Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6305476Z return mod(**inputs) 2025-08-14T21:57:32.6305843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6306236Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6306612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6307003Z layer_outputs = layer_module( 2025-08-14T21:57:32.6307367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6307736Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6308167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 705, in forward 2025-08-14T21:57:32.6308618Z cross_attention_outputs = self.layer[1]( 2025-08-14T21:57:32.6309005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 635, in forward 2025-08-14T21:57:32.6309396Z attention_output = self.EncDecAttention( 2025-08-14T21:57:32.6309788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward 2025-08-14T21:57:32.6310210Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-08-14T21:57:32.6310378Z 2025-08-14T21:57:32.6310462Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6310690Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6310915Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6311130Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6311339Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6311556Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6311760Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6311958Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6312163Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6312365Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6312619Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:57:32.6312982Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6313308Z return mod(**inputs) 2025-08-14T21:57:32.6313655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6314021Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6314386Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6314756Z layer_outputs = layer_module( 2025-08-14T21:57:32.6315094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6315453Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6315829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6316205Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6316570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6316946Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6317318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 526, in forward 2025-08-14T21:57:32.6317774Z scores = torch.matmul(query_states, key_states.transpose(3, 2)) 2025-08-14T21:57:32.6317960Z 2025-08-14T21:57:32.6318036Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6318246Z cudagraph partition due to non gpu ops 2025-08-14T21:57:32.6318479Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:57:32.6318830Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:57:32.6319153Z return mod(**inputs) 2025-08-14T21:57:32.6319502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward 2025-08-14T21:57:32.6319864Z decoder_outputs = self.decoder( 2025-08-14T21:57:32.6320227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward 2025-08-14T21:57:32.6320595Z layer_outputs = layer_module( 2025-08-14T21:57:32.6320941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:57:32.6321293Z return super().__call__(*args, **kwargs) 2025-08-14T21:57:32.6321676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward 2025-08-14T21:57:32.6322051Z self_attention_outputs = self.layer[0]( 2025-08-14T21:57:32.6322422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward 2025-08-14T21:57:32.6322793Z attention_output = self.SelfAttention( 2025-08-14T21:57:32.6323168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 565, in forward 2025-08-14T21:57:32.6323594Z attn_output = torch.matmul(attn_weights, value_states) 2025-08-14T21:57:32.6323765Z 2025-08-14T21:57:32.6323873Z cudagraph partition due to non gpu ops. 
Found from:
    File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
      return mod(**inputs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1762, in forward
      decoder_outputs = self.decoder(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1092, in forward
      layer_outputs = layer_module(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
      return super().__call__(*args, **kwargs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 681, in forward
      self_attention_outputs = self.layer[0](
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 599, in forward
      attention_output = self.SelfAttention(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 567, in forward
      attn_output = attn_output.transpose(1, 2).contiguous()

The same diagnostic repeats at 2025-08-14T21:57:32 with tracebacks that share the decoder stack above down to modeling_layers.py:94 and differ only in the innermost modeling_t5.py frames:
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:681 (self_attention_outputs = self.layer[0]() -> 609 (hidden_states = hidden_states + self.dropout(attention_output[0]))
  cudagraph partition due to non gpu ops (x5). Found from: modeling_t5.py:705 (cross_attention_outputs = self.layer[1]() -> 635 (attention_output = self.EncDecAttention() -> 526 (scores = torch.matmul(query_states, key_states.transpose(3, 2)))
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:705 -> 635 -> 565 (attn_output = torch.matmul(attn_weights, value_states))
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:705 -> 635 -> 567 (attn_output = attn_output.transpose(1, 2).contiguous())
  cudagraph partition due to non gpu ops (x11). Found from: modeling_t5.py:681 -> 599 (attention_output = self.SelfAttention() -> 526
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:681 -> 599 -> 565
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:681 -> 599 -> 567
  cudagraph partition due to non gpu ops (x7). Found from: modeling_t5.py:705 -> 635 -> 526
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:705 -> 635 -> 565
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:705 -> 635 -> 567
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:705 -> 647 (layer_output = hidden_states + self.dropout(attention_output[0]))
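For orientation, the innermost frames in these tracebacks are the core T5 attention ops (modeling_t5.py lines 526, 565 and 567). The following is a shape-level sketch of what those three lines compute; the sizes are hypothetical and not the benchmark's actual configuration:

import torch

# Hypothetical sizes, for illustration only.
batch, n_heads, seq_len, head_dim = 2, 8, 128, 64

query_states = torch.randn(batch, n_heads, seq_len, head_dim)
key_states = torch.randn(batch, n_heads, seq_len, head_dim)
value_states = torch.randn(batch, n_heads, seq_len, head_dim)

# line 526: raw attention scores, shape (batch, n_heads, seq_len, seq_len)
scores = torch.matmul(query_states, key_states.transpose(3, 2))
attn_weights = torch.softmax(scores.float(), dim=-1).type_as(scores)

# line 565: weighted sum over values, shape (batch, n_heads, seq_len, head_dim)
attn_output = torch.matmul(attn_weights, value_states)

# line 567: back to (batch, seq_len, n_heads, head_dim) with a dense layout
attn_output = attn_output.transpose(1, 2).contiguous()
print(attn_output.shape)  # torch.Size([2, 128, 8, 64])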
The partition diagnostics continue in the same pattern, again sharing the decoder stack down to modeling_layers.py:94:
  cudagraph partition due to non gpu ops (x9). Found from: modeling_t5.py:681 -> 599 -> 526 (scores = torch.matmul(query_states, key_states.transpose(3, 2)))
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:681 -> 599 -> 565 (attn_output = torch.matmul(attn_weights, value_states))
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:681 -> 599 -> 567 (attn_output = attn_output.transpose(1, 2).contiguous())
  cudagraph partition due to non gpu ops (x7). Found from: modeling_t5.py:705 -> 635 -> 526
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:705 -> 635 -> 565
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:705 -> 635 -> 567
  cudagraph partition due to non gpu ops (x7). Found from: modeling_t5.py:731 (hidden_states = self.layer[-1](hidden_states)) -> 343 (hidden_states = hidden_states + self.dropout(forwarded_states))
  cudagraph partition due to non gpu ops (x5). Found from: modeling_t5.py:681 -> 599 -> 526
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:681 -> 599 -> 565
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:681 -> 599 -> 567
  cudagraph partition due to non gpu ops (x7). Found from: modeling_t5.py:705 -> 635 -> 526
  cudagraph partition due to non gpu ops (x3). Found from: modeling_t5.py:705 -> 635 -> 565
  cudagraph partition due to non gpu ops (x1). Found from: modeling_t5.py:705 -> 635 -> 567
  cudagraph partition due to non gpu ops (x9). Found from:
    File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
      return mod(**inputs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/t5/modeling_t5.py", line 1798, in forward
      loss = loss_fct(lm_logits.view(-1, lm_logits.size(-1)), labels.view(-1))
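The "cudagraph partition due to non gpu ops" lines come from Inductor's cudagraph handling: CUDA graph capture can only record GPU work, so the compiled graph is split around regions it cannot capture. In this job the models run on cpu (see the "cpu eval ..." lines and the "device: cpu" warnings below), so presumably every traced op counts as "non gpu" to the partitioner, which is why the message fires for essentially every attention op. A minimal sketch, not the benchmark harness, of the kind of mixed-device graph that can trigger partitioning on a CUDA build:

import torch

def mixed_device_fn(x: torch.Tensor) -> torch.Tensor:
    y = x @ x                  # GPU matmul, capturable in a CUDA graph
    z = y.cpu().sum()          # non-GPU op: likely forces a partition boundary
    return y * z.to(x.device)  # GPU work resumes in a separate partition

if torch.cuda.is_available():
    # mode="reduce-overhead" asks Inductor to use cudagraphs where it can.
    compiled = torch.compile(mixed_device_fn, mode="reduce-overhead")
    out = compiled(torch.randn(64, 64, device="cuda"))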
2025-08-14T21:57:41.9005309Z Compilation time (from dynamo_timed): 21.301821799
2025-08-14T21:57:41.9124383Z pass
2025-08-14T21:57:41.9126033Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:41.9126939Z TIMING: _recursive_pre_grad_passes:0.06096 _recursive_joint_graph_passes:0.65372 _recursive_post_grad_passes:0.23298 async_compile.wait:0.00532 code_gen:8.13968 inductor_compile:10.21864 backend_compile:17.72374 gc:0.0004 entire_frame_compile:21.30182 total_wall_time:21.30182
2025-08-14T21:57:41.9128001Z STATS: call_* op count: 824 | FakeTensorMode.__torch_dispatch__:38726 | FakeTensor.__torch_dispatch__:5207 | ProxyTorchDispatchMode.__torch_dispatch__:10688
2025-08-14T21:57:41.9128567Z Dynamo produced 1 graphs covering 824 ops with 0 graph breaks (0 unique)
2025-08-14T21:57:47.5045617Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:57:47.5046644Z   from pkg_resources import resource_filename
2025-08-14T21:57:50.8377492Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:57:50.8377825Z loading model: 0it [00:02, ?it/s]
2025-08-14T21:57:50.8378079Z cpu eval TrOCRForCausalLM
2025-08-14T21:57:50.9802829Z WARNING:common:fp64 golden ref were not generated for TrOCRForCausalLM. Setting accuracy check to cosine
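The "Compilation time (from dynamo_timed)" and TIMING lines above are Dynamo's per-phase compile timings. A sketch of how similar numbers can be pulled programmatically after one compiled run; torch._dynamo.utils.compile_times is a private helper, so its exact signature and output format may differ across versions:

import torch
import torch._dynamo.utils as dynamo_utils

def f(x):
    return torch.nn.functional.gelu(x @ x)

compiled = torch.compile(f)
compiled(torch.randn(32, 32))  # first call triggers compilation

# Prints a per-phase timing report roughly comparable to the TIMING line.
print(dynamo_utils.compile_times(repr="str", aggregate=True))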
2025-08-14T21:57:51.0124772Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:51.2571612Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:57:51.4255423Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:58:02.6822780Z cudagraph partition due to non gpu ops (message repeated ~195 times, 2025-08-14T21:58:02.6822780Z through 2025-08-14T21:58:02.6892857Z). Found from:
    File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
      return mod(**inputs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/trocr/modeling_trocr.py", line 844, in forward
      loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
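The repeated empty_gpu_cache warnings mean the harness asked to clear an accelerator allocator cache while running on cpu, where there is nothing to release. A hedged sketch of the kind of device guard that avoids the warning; empty_device_cache is a hypothetical helper, not the harness's code:

import torch

def empty_device_cache(device: str) -> None:
    # Only call the allocator-cache clears that exist for the current backend.
    if device == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()
    # cpu and other devices: no-op, nothing to release

empty_device_cache("cpu")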
2025-08-14T21:58:11.6764464Z Compilation time (from dynamo_timed): 18.897820821
2025-08-14T21:58:11.6765003Z pass
2025-08-14T21:58:11.6765640Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:58:11.6766661Z TIMING: _recursive_pre_grad_passes:0.04352 _recursive_joint_graph_passes:0.55021 _recursive_post_grad_passes:0.08622 async_compile.wait:0.86149 code_gen:8.88901 inductor_compile:10.49657 backend_compile:16.36376 gc:0.00116 entire_frame_compile:18.89782 total_wall_time:18.89782
2025-08-14T21:58:11.6767717Z STATS: call_* op count: 445 | FakeTensorMode.__torch_dispatch__:28877 | FakeTensor.__torch_dispatch__:3278 | ProxyTorchDispatchMode.__torch_dispatch__:8203
2025-08-14T21:58:11.6768690Z Dynamo produced 1 graphs covering 445 ops with 0 graph breaks (0 unique)
2025-08-14T21:58:17.4320603Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:58:17.4321623Z   from pkg_resources import resource_filename
2025-08-14T21:58:24.9471133Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:58:24.9471447Z loading model: 0it [00:06, ?it/s]
2025-08-14T21:58:24.9472456Z cpu eval XGLMForCausalLM
2025-08-14T21:58:25.3631218Z WARNING:common:fp64 golden ref were not generated for XGLMForCausalLM. Setting accuracy check to cosine
2025-08-14T21:58:25.4561924Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:58:26.0520688Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:58:26.5965119Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:58:48.6123084Z cudagraph partition due to non gpu ops (x5). Found from:
    File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
      return mod(**inputs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward
      outputs = self.model(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward
      layer_outputs = decoder_layer(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
      return super().__call__(*args, **kwargs)
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward
      hidden_states, self_attn_weights = self.self_attn(
    File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward
      query_states = self.q_proj(hidden_states) * self.scaling
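When an fp64 golden reference is not available, the harness falls back to the "Setting accuracy check to cosine" path noted above, i.e. it compares eager and compiled outputs by cosine similarity instead of elementwise tolerances. An illustrative sketch of that kind of check; the threshold and reduction here are assumptions, not the harness's exact values:

import torch
import torch.nn.functional as F

def cosine_close(ref: torch.Tensor, res: torch.Tensor, threshold: float = 0.99) -> bool:
    # Flatten both tensors and compare their overall direction.
    sim = F.cosine_similarity(ref.flatten().float(), res.flatten().float(), dim=0)
    return bool(sim > threshold)

eager_logits = torch.randn(4, 256)
compiled_logits = eager_logits + 1e-4 * torch.randn(4, 256)
print(cosine_close(eager_logits, compiled_logits))  # True for tiny perturbations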
Found from : 2025-08-14T21:58:48.6125126Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6125505Z return mod(**inputs) 2025-08-14T21:58:48.6125950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6126391Z outputs = self.model( 2025-08-14T21:58:48.6126815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6127247Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6127692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6128091Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6128532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6129335Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6129872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6130337Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6130531Z 2025-08-14T21:58:48.6130656Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6131064Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6131424Z return mod(**inputs) 2025-08-14T21:58:48.6131832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6132263Z outputs = self.model( 2025-08-14T21:58:48.6132676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6133111Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6133516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6133924Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6134354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6134818Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6135330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6135834Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6136040Z 2025-08-14T21:58:48.6136131Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6136364Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6136626Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:58:48.6137024Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:58:48.6137398Z return mod(**inputs)
2025-08-14T21:58:48.6137799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward
2025-08-14T21:58:48.6138214Z outputs = self.model(
2025-08-14T21:58:48.6138605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward
2025-08-14T21:58:48.6139026Z layer_outputs = decoder_layer(
2025-08-14T21:58:48.6139405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:58:48.6139790Z return super().__call__(*args, **kwargs)
2025-08-14T21:58:48.6140217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward
2025-08-14T21:58:48.6140698Z hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:58:48.6141148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward
2025-08-14T21:58:48.6141596Z attn_output = torch.bmm(attn_probs, value_states)
2025-08-14T21:58:48.6141771Z
2025-08-14T21:58:48.6141885Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:58:48.6142289Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:58:48.6142657Z return mod(**inputs)
2025-08-14T21:58:48.6143055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward
2025-08-14T21:58:48.6143472Z outputs = self.model(
2025-08-14T21:58:48.6143870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward
2025-08-14T21:58:48.6144290Z layer_outputs = decoder_layer(
2025-08-14T21:58:48.6144681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:58:48.6145085Z return super().__call__(*args, **kwargs)
2025-08-14T21:58:48.6145524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward
2025-08-14T21:58:48.6145954Z hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:58:48.6146398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward
2025-08-14T21:58:48.6146862Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim)
2025-08-14T21:58:48.6147057Z
2025-08-14T21:58:48.6147149Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6147374Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6147601Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6147825Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6148039Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6148260Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6148478Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6148692Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6148912Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6149130Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6149371Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:58:48.6196602Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:58:48.6196953Z return mod(**inputs)
2025-08-14T21:58:48.6197345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward
2025-08-14T21:58:48.6197757Z outputs = self.model(
2025-08-14T21:58:48.6198144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward
2025-08-14T21:58:48.6198558Z layer_outputs = decoder_layer(
2025-08-14T21:58:48.6198938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:58:48.6199331Z return super().__call__(*args, **kwargs)
2025-08-14T21:58:48.6199738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward
2025-08-14T21:58:48.6200165Z hidden_states = residual + hidden_states
2025-08-14T21:58:48.6200342Z
2025-08-14T21:58:48.6200427Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6200675Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6200894Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6201120Z cudagraph partition due to non gpu ops
2025-08-14T21:58:48.6201375Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:58:48.6201761Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:58:48.6202117Z return mod(**inputs)
2025-08-14T21:58:48.6202506Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward
2025-08-14T21:58:48.6203126Z outputs = self.model(
2025-08-14T21:58:48.6203509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward
2025-08-14T21:58:48.6203930Z layer_outputs = decoder_layer(
2025-08-14T21:58:48.6204312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T21:58:48.6204704Z return super().__call__(*args, **kwargs)
2025-08-14T21:58:48.6205125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward
2025-08-14T21:58:48.6205659Z hidden_states, self_attn_weights = self.self_attn(
2025-08-14T21:58:48.6206104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward
2025-08-14T21:58:48.6206558Z query_states = self.q_proj(hidden_states) * self.scaling
2025-08-14T21:58:48.6207255Z
2025-08-14T21:58:48.6207864Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:58:48.6406882Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6407250Z return mod(**inputs) 2025-08-14T21:58:48.6407643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6408058Z outputs = self.model( 2025-08-14T21:58:48.6408450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6408982Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6409390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6409791Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6410208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6410647Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6411088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6411603Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6411784Z 2025-08-14T21:58:48.6411912Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6412308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6412672Z return mod(**inputs) 2025-08-14T21:58:48.6413074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6413484Z outputs = self.model( 2025-08-14T21:58:48.6413881Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6414308Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6414701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6415100Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6415546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6416014Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6416486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6416990Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6417196Z 2025-08-14T21:58:48.6417283Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6417511Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6417755Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6418144Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6418502Z return mod(**inputs) 2025-08-14T21:58:48.6418890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6419305Z outputs = self.model( 2025-08-14T21:58:48.6419703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6420117Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6420492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6420890Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6421317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6421755Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6422268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6422694Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6422862Z 2025-08-14T21:58:48.6422983Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6423375Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6423729Z return mod(**inputs) 2025-08-14T21:58:48.6424114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6424527Z outputs = self.model( 2025-08-14T21:58:48.6424914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6425332Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6425713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6426109Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6426517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6426982Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6427424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6427877Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6428076Z 2025-08-14T21:58:48.6428161Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6428386Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6428612Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6428822Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6429046Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6429273Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6429523Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6429912Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6430270Z return mod(**inputs) 2025-08-14T21:58:48.6430652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6431065Z outputs = self.model( 2025-08-14T21:58:48.6431471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6431990Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6432371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6432760Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6433174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6433593Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6433751Z 2025-08-14T21:58:48.6433871Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6434117Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6434347Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6434563Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6434820Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6435214Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6435563Z return mod(**inputs) 2025-08-14T21:58:48.6435965Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6436365Z outputs = self.model( 2025-08-14T21:58:48.6436759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6437273Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6437665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6438071Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6438482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6438939Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6439387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6439846Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6440026Z 2025-08-14T21:58:48.6440138Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6440541Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6440904Z return mod(**inputs) 2025-08-14T21:58:48.6441297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6441740Z outputs = self.model( 2025-08-14T21:58:48.6442133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6442558Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6442930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6443333Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6443761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6444211Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6444648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6445130Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6445331Z 2025-08-14T21:58:48.6445426Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6445648Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6445906Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6446294Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6446644Z return mod(**inputs) 2025-08-14T21:58:48.6447043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6447453Z outputs = self.model( 2025-08-14T21:58:48.6447838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6448252Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6448631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6449140Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6449565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6450000Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6450435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6450886Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6451049Z 2025-08-14T21:58:48.6451169Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6451547Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6451900Z return mod(**inputs) 2025-08-14T21:58:48.6452351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6452756Z outputs = self.model( 2025-08-14T21:58:48.6453144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6453566Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6453956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6454333Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6454743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6455171Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6455581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6456055Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6456253Z 2025-08-14T21:58:48.6456341Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6456575Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6456822Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6457048Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6457272Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6457486Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6457705Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6457929Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6458346Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6458563Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6458818Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6459213Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6459549Z return mod(**inputs) 2025-08-14T21:58:48.6459928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6460333Z outputs = self.model( 2025-08-14T21:58:48.6460702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6461108Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6461477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6461882Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6462290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6462748Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6463179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6463640Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6463818Z 2025-08-14T21:58:48.6463931Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6464324Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6464667Z return mod(**inputs) 2025-08-14T21:58:48.6465033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6465432Z outputs = self.model( 2025-08-14T21:58:48.6465803Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6466202Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6466559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6466971Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6467397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6467834Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6468267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6468746Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6468941Z 2025-08-14T21:58:48.6469035Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6469250Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6469497Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6469882Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6470227Z return mod(**inputs) 2025-08-14T21:58:48.6470618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6471026Z outputs = self.model( 2025-08-14T21:58:48.6471412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6471843Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6472221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6472611Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6473019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6473458Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6473894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6474332Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6474495Z 2025-08-14T21:58:48.6474607Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6474998Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6475359Z return mod(**inputs) 2025-08-14T21:58:48.6475741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6476151Z outputs = self.model( 2025-08-14T21:58:48.6476551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6476967Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6477332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6477728Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6478151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6478594Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6479025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6479506Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6479697Z 2025-08-14T21:58:48.6479793Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6480019Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6480241Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6480464Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6480690Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6480907Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6481160Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6481548Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6481933Z return mod(**inputs) 2025-08-14T21:58:48.6482334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6482747Z outputs = self.model( 2025-08-14T21:58:48.6483135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6483540Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6483923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6484313Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6484722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6485152Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6485306Z 2025-08-14T21:58:48.6485391Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6485621Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6485838Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6486058Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6486331Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6486714Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6487072Z return mod(**inputs) 2025-08-14T21:58:48.6487465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6487875Z outputs = self.model( 2025-08-14T21:58:48.6488250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6488664Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6489204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6489605Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6490024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6490480Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6490917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6491373Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6491613Z 2025-08-14T21:58:48.6491728Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6492124Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6492470Z return mod(**inputs) 2025-08-14T21:58:48.6492861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6493284Z outputs = self.model( 2025-08-14T21:58:48.6493668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6494076Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6494455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6494854Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6495263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6495705Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6496144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6496634Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6496862Z 2025-08-14T21:58:48.6496968Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6497201Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6497459Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6497846Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6498191Z return mod(**inputs) 2025-08-14T21:58:48.6498576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6498995Z outputs = self.model( 2025-08-14T21:58:48.6499375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6499787Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6500166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6500558Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6500964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6501402Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6501859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6502286Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6502456Z 2025-08-14T21:58:48.6502566Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6503151Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6503508Z return mod(**inputs) 2025-08-14T21:58:48.6503889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6504304Z outputs = self.model( 2025-08-14T21:58:48.6504694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6505112Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6505488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6505886Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6506304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6506805Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6507243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6507709Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6507900Z 2025-08-14T21:58:48.6507994Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6508221Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6508452Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6508679Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6508897Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6509125Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6509352Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6509568Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6509793Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6510016Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6510271Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6510651Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6511001Z return mod(**inputs) 2025-08-14T21:58:48.6511385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6511833Z outputs = self.model( 2025-08-14T21:58:48.6512251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6512672Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6513056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6513442Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6513860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6514320Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6514755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6515225Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6515409Z 2025-08-14T21:58:48.6515524Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6515915Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6516263Z return mod(**inputs) 2025-08-14T21:58:48.6516687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6517106Z outputs = self.model( 2025-08-14T21:58:48.6517479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6517950Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6518323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6518717Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6519120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6519562Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6519992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6520466Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6520664Z 2025-08-14T21:58:48.6520750Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6520988Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6521234Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6521646Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6521991Z return mod(**inputs) 2025-08-14T21:58:48.6522367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6522765Z outputs = self.model( 2025-08-14T21:58:48.6523144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6523560Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6523939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6524329Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6524749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6525191Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6525628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6526060Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6526232Z 2025-08-14T21:58:48.6526344Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6526754Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6527129Z return mod(**inputs) 2025-08-14T21:58:48.6527515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6527922Z outputs = self.model( 2025-08-14T21:58:48.6528307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6528713Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6529232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6529636Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6530055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6530499Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6530936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6531396Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6531615Z 2025-08-14T21:58:48.6531699Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6531925Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6532146Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6532362Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6532570Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6532664Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6532774Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6532981Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6533059Z return mod(**inputs) 2025-08-14T21:58:48.6533319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6533398Z outputs = self.model( 2025-08-14T21:58:48.6533664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6533749Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6533978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6534069Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6534340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6534427Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6534431Z 2025-08-14T21:58:48.6534520Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6534600Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6534684Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6534763Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6534870Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6535082Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6535154Z return mod(**inputs) 2025-08-14T21:58:48.6535404Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6535483Z outputs = self.model( 2025-08-14T21:58:48.6535737Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6535821Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6536050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6536132Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6536411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6536530Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6536785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6536915Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6536919Z 2025-08-14T21:58:48.6537025Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6537241Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6537311Z return mod(**inputs) 2025-08-14T21:58:48.6537566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6537647Z outputs = self.model( 2025-08-14T21:58:48.6537900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6537979Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6538217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6538320Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6538584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6538685Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6538941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6539091Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6539094Z 2025-08-14T21:58:48.6539173Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6539262Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6539373Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6539585Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6539659Z return mod(**inputs) 2025-08-14T21:58:48.6539927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6539996Z outputs = self.model( 2025-08-14T21:58:48.6540246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6540334Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6540561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6540639Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6540880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6540986Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6541238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6541337Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6541348Z 2025-08-14T21:58:48.6541453Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6541660Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6541733Z return mod(**inputs) 2025-08-14T21:58:48.6541992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6542062Z outputs = self.model( 2025-08-14T21:58:48.6542325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6542419Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6542676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6542762Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6543018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6543126Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6543380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6543513Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6543524Z 2025-08-14T21:58:48.6543608Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6543691Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6543778Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6543857Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6543938Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6544027Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6544106Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6544185Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6545187Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6545269Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6545376Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6545589Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6545661Z return mod(**inputs) 2025-08-14T21:58:48.6545926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6545997Z outputs = self.model( 2025-08-14T21:58:48.6546252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6546340Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6546574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6546668Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6546926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6547028Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6547312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6547432Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6547436Z 2025-08-14T21:58:48.6547542Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6547760Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6547832Z return mod(**inputs) 2025-08-14T21:58:48.6548101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6548171Z outputs = self.model( 2025-08-14T21:58:48.6548431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6548517Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6548748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6548835Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6549102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6549203Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6549467Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6549672Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6549677Z 2025-08-14T21:58:48.6549761Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6549853Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6549959Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6550174Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6550242Z return mod(**inputs) 2025-08-14T21:58:48.6550500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6550582Z outputs = self.model( 2025-08-14T21:58:48.6550839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6550913Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6551156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6551241Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6551503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6551622Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6551882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6551986Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6551990Z 2025-08-14T21:58:48.6552092Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6552298Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6552365Z return mod(**inputs) 2025-08-14T21:58:48.6552607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6552685Z outputs = self.model( 2025-08-14T21:58:48.6552926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6552999Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6553235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6553318Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6553594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6553695Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6553949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6554087Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6554093Z 2025-08-14T21:58:48.6554177Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554258Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554348Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554428Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554516Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554594Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6554701Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6554918Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6554985Z return mod(**inputs) 2025-08-14T21:58:48.6555241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6555318Z outputs = self.model( 2025-08-14T21:58:48.6555575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6555695Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6555927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6556013Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6556273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6556358Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6556362Z 2025-08-14T21:58:48.6556442Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6556529Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6556607Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6556690Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6556795Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6557001Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6557078Z return mod(**inputs) 2025-08-14T21:58:48.6557331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6557421Z outputs = self.model( 2025-08-14T21:58:48.6557687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6557762Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6558002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6558086Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6558341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6558452Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6558712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6558829Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6558840Z 2025-08-14T21:58:48.6558948Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6559157Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6559232Z return mod(**inputs) 2025-08-14T21:58:48.6559504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6559575Z outputs = self.model( 2025-08-14T21:58:48.6559839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6559916Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6560153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6560242Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6560496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6560607Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6560863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6561001Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6561013Z 2025-08-14T21:58:48.6561094Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6561175Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6561289Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6561496Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6561582Z return mod(**inputs) 2025-08-14T21:58:48.6561902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6561975Z outputs = self.model( 2025-08-14T21:58:48.6562232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6562316Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6562545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6562637Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6562889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6562990Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6563254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6563356Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6563360Z 2025-08-14T21:58:48.6563473Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6563696Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6563763Z return mod(**inputs) 2025-08-14T21:58:48.6564037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6564108Z outputs = self.model( 2025-08-14T21:58:48.6564380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6564475Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6564703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6564797Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6565053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6565154Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6565417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6565547Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6565551Z 2025-08-14T21:58:48.6565640Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6565736Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6565819Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6565906Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6565986Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566064Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566152Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566234Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566314Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566401Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6566511Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6566734Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6566805Z return mod(**inputs) 2025-08-14T21:58:48.6567075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6567156Z outputs = self.model( 2025-08-14T21:58:48.6567416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6567494Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6567739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6567881Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6568163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6568267Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6568537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6568663Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6568666Z 2025-08-14T21:58:48.6568780Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6569097Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6569176Z return mod(**inputs) 2025-08-14T21:58:48.6569439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6569522Z outputs = self.model( 2025-08-14T21:58:48.6569794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6569882Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6570142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6570226Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6570541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6570646Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6570904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6571051Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6571055Z 2025-08-14T21:58:48.6571138Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6571223Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6571339Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6571550Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6571628Z return mod(**inputs) 2025-08-14T21:58:48.6571887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6571958Z outputs = self.model( 2025-08-14T21:58:48.6572239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6572316Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6572546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6572637Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6572893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6573002Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6573259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6573357Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6573360Z 2025-08-14T21:58:48.6573478Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6573685Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6573761Z return mod(**inputs) 2025-08-14T21:58:48.6574015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6574085Z outputs = self.model( 2025-08-14T21:58:48.6574378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6574475Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6574704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6574798Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6575055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6575163Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6575420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6575552Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6575556Z 2025-08-14T21:58:48.6575648Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6575728Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6575814Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6575894Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6575973Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6576056Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6576229Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6576518Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6576618Z return mod(**inputs) 2025-08-14T21:58:48.6577014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6577089Z outputs = self.model( 2025-08-14T21:58:48.6577354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6577430Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6577666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6577755Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6578013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6578107Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6578111Z 2025-08-14T21:58:48.6578193Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6578280Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6578359Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6578460Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6578576Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6578784Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6578851Z return mod(**inputs) 2025-08-14T21:58:48.6579116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6579190Z outputs = self.model( 2025-08-14T21:58:48.6579445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6579531Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6579761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6579851Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6580108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6580210Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6580473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6580590Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6580613Z 2025-08-14T21:58:48.6580747Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6580957Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6581034Z return mod(**inputs) 2025-08-14T21:58:48.6581288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6581356Z outputs = self.model( 2025-08-14T21:58:48.6581600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6581684Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6581914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6582005Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6582262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6582365Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6582631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6582790Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6582794Z 2025-08-14T21:58:48.6582881Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6582960Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6583067Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6583283Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6583350Z return mod(**inputs) 2025-08-14T21:58:48.6583608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6583689Z outputs = self.model( 2025-08-14T21:58:48.6583952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6584037Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6584276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6584361Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6584627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6584747Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6585009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6585117Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6585121Z 2025-08-14T21:58:48.6585231Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6585462Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6585529Z return mod(**inputs) 2025-08-14T21:58:48.6585783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6585863Z outputs = self.model( 2025-08-14T21:58:48.6586118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6586199Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6586418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6586497Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6586743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6586857Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6587111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6587245Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6587250Z 2025-08-14T21:58:48.6587328Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587411Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587486Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587560Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587644Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587718Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587793Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587873Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6587947Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6588022Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6588134Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6588331Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6588403Z return mod(**inputs) 2025-08-14T21:58:48.6588662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6588728Z outputs = self.model( 2025-08-14T21:58:48.6588977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6589049Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6589265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6589351Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6589590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6589696Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6589937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6590048Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6590052Z 2025-08-14T21:58:48.6590162Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6590359Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6590447Z return mod(**inputs) 2025-08-14T21:58:48.6590688Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6590755Z outputs = self.model( 2025-08-14T21:58:48.6591012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6591085Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6591301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6591386Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6591628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6591729Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6591968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6592102Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6592105Z 2025-08-14T21:58:48.6592193Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6592274Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6592385Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6592640Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6592711Z return mod(**inputs) 2025-08-14T21:58:48.6592978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6593050Z outputs = self.model( 2025-08-14T21:58:48.6593305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6593390Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6593625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6593720Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6593982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6594094Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6594360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6594457Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6594478Z 2025-08-14T21:58:48.6594584Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6594805Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6594872Z return mod(**inputs) 2025-08-14T21:58:48.6595139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6595208Z outputs = self.model( 2025-08-14T21:58:48.6595465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6595550Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6595786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6595878Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6596139Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6596240Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6596505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6596658Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6596662Z 2025-08-14T21:58:48.6596745Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6596833Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6596912Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6596997Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6597076Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6597156Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6597272Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6597480Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6597550Z return mod(**inputs) 2025-08-14T21:58:48.6597818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6597887Z outputs = self.model( 2025-08-14T21:58:48.6598152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6598228Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6598457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6598546Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6598834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6598920Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6598924Z 2025-08-14T21:58:48.6599009Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6599090Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6599174Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6599251Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6599357Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6599572Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6599641Z return mod(**inputs) 2025-08-14T21:58:48.6599896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6599973Z outputs = self.model( 2025-08-14T21:58:48.6600228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6600314Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6600547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6600651Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6600912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6601013Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6601266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6601388Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6601392Z 2025-08-14T21:58:48.6601496Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6601712Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6601783Z return mod(**inputs) 2025-08-14T21:58:48.6602039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6602120Z outputs = self.model( 2025-08-14T21:58:48.6602375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6602457Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6602888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6602979Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6603244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6603350Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6603617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6603767Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6603771Z 2025-08-14T21:58:48.6603857Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6603948Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6604058Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6604270Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6604351Z return mod(**inputs) 2025-08-14T21:58:48.6604615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6604688Z outputs = self.model( 2025-08-14T21:58:48.6604960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6605078Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6605354Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6605442Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6605707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6605818Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6606083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6606193Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6606197Z 2025-08-14T21:58:48.6606305Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6606517Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6606598Z return mod(**inputs) 2025-08-14T21:58:48.6606860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6606930Z outputs = self.model( 2025-08-14T21:58:48.6607200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6607308Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6607553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6607641Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6607905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6608016Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6608280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6608418Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6608430Z 2025-08-14T21:58:48.6608515Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6608599Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6608692Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6608774Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6608912Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609006Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609087Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609189Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609282Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609363Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6609481Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6609698Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6609770Z return mod(**inputs) 2025-08-14T21:58:48.6610045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6610129Z outputs = self.model( 2025-08-14T21:58:48.6610385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6610471Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6610704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6610798Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6611055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6611157Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6611421Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6611574Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6611578Z 2025-08-14T21:58:48.6611687Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6611906Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6611974Z return mod(**inputs) 2025-08-14T21:58:48.6612248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6612322Z outputs = self.model( 2025-08-14T21:58:48.6612590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6612674Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6612912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6613006Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6613274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6613414Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6613675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6613813Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6613817Z 2025-08-14T21:58:48.6613899Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6613988Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6614093Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6614308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6614375Z return mod(**inputs) 2025-08-14T21:58:48.6614643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6614723Z outputs = self.model( 2025-08-14T21:58:48.6614989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6615068Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6615313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6615398Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6615738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6615846Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6616107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6616219Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6616225Z 2025-08-14T21:58:48.6616333Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6616552Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6616624Z return mod(**inputs) 2025-08-14T21:58:48.6616885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6616965Z outputs = self.model( 2025-08-14T21:58:48.6617228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6617305Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6617549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6617635Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6617944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6618050Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6618315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6618458Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6618462Z 2025-08-14T21:58:48.6618553Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6618645Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6618741Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6618826Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6618918Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6619002Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6619115Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6619340Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6619417Z return mod(**inputs) 2025-08-14T21:58:48.6619687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6619788Z outputs = self.model( 2025-08-14T21:58:48.6620049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6620136Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6620377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6620461Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6620735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 366, in forward 2025-08-14T21:58:48.6620822Z hidden_states = residual + hidden_states 2025-08-14T21:58:48.6620828Z 2025-08-14T21:58:48.6620909Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6621000Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6621079Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6621165Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6621276Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6621492Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6621572Z return mod(**inputs) 2025-08-14T21:58:48.6621856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6621932Z outputs = self.model( 2025-08-14T21:58:48.6622200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6622279Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6622523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6622610Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6622869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6622984Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6623244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 156, in forward 2025-08-14T21:58:48.6623368Z query_states = self.q_proj(hidden_states) * self.scaling 2025-08-14T21:58:48.6623374Z 2025-08-14T21:58:48.6623483Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6623693Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6623772Z return mod(**inputs) 2025-08-14T21:58:48.6624034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6624144Z outputs = self.model( 2025-08-14T21:58:48.6624417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6624498Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6624740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6624826Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6625087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6625199Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6625463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 197, in forward 2025-08-14T21:58:48.6625605Z attn_weights = torch.bmm(query_states, key_states.transpose(1, 2)) 2025-08-14T21:58:48.6625618Z 2025-08-14T21:58:48.6625702Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6625785Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6625899Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:58:48.6626157Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6626227Z return mod(**inputs) 2025-08-14T21:58:48.6626505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6626578Z outputs = self.model( 2025-08-14T21:58:48.6626858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6626937Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6627177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6627272Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6627541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6627657Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6627926Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 243, in forward 2025-08-14T21:58:48.6628024Z attn_output = torch.bmm(attn_probs, value_states) 2025-08-14T21:58:48.6628027Z 2025-08-14T21:58:48.6628156Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:58:48.6628367Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:58:48.6628436Z return mod(**inputs) 2025-08-14T21:58:48.6628701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 664, in forward 2025-08-14T21:58:48.6628771Z outputs = self.model( 2025-08-14T21:58:48.6629030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 552, in forward 2025-08-14T21:58:48.6629113Z layer_outputs = decoder_layer( 2025-08-14T21:58:48.6629347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T21:58:48.6629439Z return super().__call__(*args, **kwargs) 2025-08-14T21:58:48.6629695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 330, in forward 2025-08-14T21:58:48.6629798Z hidden_states, self_attn_weights = self.self_attn( 2025-08-14T21:58:48.6630065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 256, in forward 2025-08-14T21:58:48.6630195Z attn_output = attn_output.reshape(bsz, tgt_len, self.embed_dim) 2025-08-14T21:58:48.6630199Z 2025-08-14T21:58:48.6630289Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630391Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630489Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630577Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630656Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630736Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630823Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6630900Z cudagraph partition due to non gpu ops 2025-08-14T21:58:48.6631007Z cudagraph partition due to non gpu ops. 
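The partition messages above keep pointing at the same few dense tensor ops inside the XGLM self-attention block (modeling_xglm.py lines 197, 243 and 256, as quoted in the tracebacks). The stand-alone sketch below reruns those three ops with made-up shapes to show what they compute; the sizes, the softmax step and the head-merging view/transpose are illustrative assumptions, not values taken from this benchmark run.

import torch

# Sketch of the ops quoted in the partition tracebacks above
# (modeling_xglm.py lines 197, 243, 256). All sizes are illustrative assumptions;
# the softmax and the view/transpose paraphrase the surrounding attention code.
bsz, num_heads, tgt_len, head_dim = 2, 4, 8, 16
embed_dim = num_heads * head_dim

query_states = torch.randn(bsz * num_heads, tgt_len, head_dim)
key_states = torch.randn(bsz * num_heads, tgt_len, head_dim)
value_states = torch.randn(bsz * num_heads, tgt_len, head_dim)

# line 197: raw attention scores, one (tgt_len, tgt_len) matrix per head
attn_weights = torch.bmm(query_states, key_states.transpose(1, 2))
attn_probs = attn_weights.softmax(dim=-1)

# line 243: probability-weighted sum over the value vectors
attn_output = torch.bmm(attn_probs, value_states)

# line 256: fold the heads back into the embedding dimension
attn_output = attn_output.view(bsz, num_heads, tgt_len, head_dim).transpose(1, 2)
attn_output = attn_output.reshape(bsz, tgt_len, embed_dim)
print(attn_output.shape)  # torch.Size([2, 8, 64])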
2025-08-14T21:58:48.6631007Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xglm/modeling_xglm.py", line 685, in forward
    loss = self.loss_function(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 67, in ForCausalLMLoss
    loss = fixed_cross_entropy(logits, shift_labels, num_items_in_batch, ignore_index, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/loss/loss_utils.py", line 36, in fixed_cross_entropy
    loss = nn.functional.cross_entropy(source, target, ignore_index=ignore_index, reduction=reduction)
2025-08-14T21:59:01.7789767Z Compilation time (from dynamo_timed): 33.562098438
2025-08-14T21:59:01.7831184Z pass
2025-08-14T21:59:01.7831962Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:59:01.7832860Z TIMING: _recursive_pre_grad_passes:0.08447 _recursive_joint_graph_passes:1.28685 _recursive_post_grad_passes:0.31564 async_compile.wait:0.93381 code_gen:12.64297 inductor_compile:16.23855 backend_compile:28.43571 gc:0.00014 entire_frame_compile:33.5621 total_wall_time:33.5621
2025-08-14T21:59:01.7833886Z STATS: call_* op count: 923 | FakeTensorMode.__torch_dispatch__:62628 | FakeTensor.__torch_dispatch__:8045 | ProxyTorchDispatchMode.__torch_dispatch__:16180
2025-08-14T21:59:01.7834422Z Dynamo produced 1 graphs covering 923 ops with 0 graph breaks (0 unique)
2025-08-14T21:59:08.0083934Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/llvmlite/binding/ffi.py:175: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-08-14T21:59:08.0085048Z   from pkg_resources import resource_filename
2025-08-14T21:59:08.6279839Z
2025-08-14T21:59:11.9114424Z loading model: 0it [00:00, ?it/s]
2025-08-14T21:59:11.9115544Z loading model: 0it [00:03, ?it/s]
2025-08-14T21:59:11.9116250Z cpu eval XLNetLMHeadModel
2025-08-14T21:59:14.4721141Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:59:15.0909741Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:59:15.6729849Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
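The "Compilation time (from dynamo_timed)" and TIMING/STATS lines above report how long torch.compile spent on this model (entire_frame_compile of roughly 33.6 s, dominated by Inductor code generation). A minimal sketch of measuring the same kind of cold-start versus steady-state cost on a toy module follows; the module, sizes and backend choice are assumptions for illustration, not the benchmark harness itself.

import time
import torch

# Toy module and input; everything here is an illustrative assumption.
mod = torch.nn.Sequential(torch.nn.Linear(64, 64), torch.nn.ReLU(), torch.nn.Linear(64, 8))
x = torch.randn(4, 64)

compiled = torch.compile(mod, backend="inductor")

t0 = time.perf_counter()
compiled(x)  # first call triggers Dynamo tracing and Inductor code generation
print(f"cold-start compile+run: {time.perf_counter() - t0:.2f}s")

t0 = time.perf_counter()
compiled(x)  # later calls reuse the compiled artifact
print(f"steady-state run: {time.perf_counter() - t0:.4f}s")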
2025-08-14T21:59:45.2910040Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1334, in forward
    pos_emb = self.relative_positional_encoding(qlen, klen, bsz=bsz)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1157, in relative_positional_encoding
    pos_emb = self.positional_embedding(fwd_pos_seq, inv_freq, bsz)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1115, in positional_embedding
    pos_emb = torch.cat([torch.sin(sinusoid_inp), torch.cos(sinusoid_inp)], dim=-1)
2025-08-14T21:59:45.2921040Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1307, in forward
    word_emb_k = self.word_embedding(input_ids)
2025-08-14T21:59:45.2923955Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward
    q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q)
2025-08-14T21:59:45.2928441Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward
    k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k)
2025-08-14T21:59:45.2934258Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
    attn_vec = self.rel_attn_core(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core
    ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h)
2025-08-14T21:59:45.2944402Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward
    k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r)
2025-08-14T21:59:45.2953339Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
    attn_vec = self.rel_attn_core(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core
    bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r)
2025-08-14T21:59:45.2958633Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward
    v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v)
2025-08-14T21:59:45.2963018Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
    attn_vec = self.rel_attn_core(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core
    attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h)
2025-08-14T21:59:45.2968304Z cudagraph partition due to non gpu ops. Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
    transformer_outputs = self.transformer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
    outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
    outputs = self.rel_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward
    output_h = self.post_attention(h, attn_vec)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention
    attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o)
2025-08-14T21:59:45.2973714Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:59:45.2974093Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.2974428Z return mod(**inputs) 2025-08-14T21:59:45.2974812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.2975231Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.2975643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.2976091Z outputs = layer_module( 2025-08-14T21:59:45.2976503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.2976911Z outputs = self.rel_attn( 2025-08-14T21:59:45.2977296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.2977719Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.2978166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.2978641Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.2978814Z 2025-08-14T21:59:45.2978902Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.2979133Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.2979358Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.2979578Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.2979819Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.2980199Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.2980563Z return mod(**inputs) 2025-08-14T21:59:45.2980938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.2981367Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.2981780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.2982180Z outputs = layer_module( 2025-08-14T21:59:45.2982554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.2982961Z outputs = self.rel_attn( 2025-08-14T21:59:45.2983349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.2983771Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.2983941Z 2025-08-14T21:59:45.2984050Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.2984428Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.2984770Z return mod(**inputs) 2025-08-14T21:59:45.2985167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.2985590Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.2986002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.2986407Z outputs = layer_module( 2025-08-14T21:59:45.2986793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.2987203Z outputs = self.rel_attn( 2025-08-14T21:59:45.2987587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.2988027Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.2988194Z 2025-08-14T21:59:45.2988302Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.2988678Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.2989019Z return mod(**inputs) 2025-08-14T21:59:45.2989393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.2989813Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.2990224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.2990673Z outputs = layer_module( 2025-08-14T21:59:45.2991058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.2991464Z outputs = self.rel_attn( 2025-08-14T21:59:45.2991856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.2992264Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.2992689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core 2025-08-14T21:59:45.2993179Z ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h) 2025-08-14T21:59:45.2993376Z 2025-08-14T21:59:45.2993494Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3187676Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3188017Z return mod(**inputs) 2025-08-14T21:59:45.3188406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3188817Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3189230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3189633Z outputs = layer_module( 2025-08-14T21:59:45.3190013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3190406Z outputs = self.rel_attn( 2025-08-14T21:59:45.3190817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3191219Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3191628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3192070Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3192241Z 2025-08-14T21:59:45.3192346Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3192702Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3193020Z return mod(**inputs) 2025-08-14T21:59:45.3193378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3193771Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3194157Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3194542Z outputs = layer_module( 2025-08-14T21:59:45.3194932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3195359Z outputs = self.rel_attn( 2025-08-14T21:59:45.3195734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3196206Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3196644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3197088Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3197254Z 2025-08-14T21:59:45.3197333Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3197548Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3197767Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3197979Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3198224Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3198605Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3198949Z return mod(**inputs) 2025-08-14T21:59:45.3199331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3199748Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3200148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3200553Z outputs = layer_module( 2025-08-14T21:59:45.3200917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3201300Z outputs = self.rel_attn( 2025-08-14T21:59:45.3201668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.3202071Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.3202229Z 2025-08-14T21:59:45.3202332Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3202866Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3203224Z return mod(**inputs) 2025-08-14T21:59:45.3203616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3204045Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3204461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3204866Z outputs = layer_module( 2025-08-14T21:59:45.3205315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3205731Z outputs = self.rel_attn( 2025-08-14T21:59:45.3206129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.3206564Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.3206739Z 2025-08-14T21:59:45.3206851Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3207237Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3207592Z return mod(**inputs) 2025-08-14T21:59:45.3207997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3208429Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3208911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3209344Z outputs = layer_module( 2025-08-14T21:59:45.3209741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3210159Z outputs = self.rel_attn( 2025-08-14T21:59:45.3210553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3211030Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3211461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core 2025-08-14T21:59:45.3211957Z ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h) 2025-08-14T21:59:45.3212154Z 2025-08-14T21:59:45.3212265Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3212654Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3213003Z return mod(**inputs) 2025-08-14T21:59:45.3213392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3213822Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3214243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3214657Z outputs = layer_module( 2025-08-14T21:59:45.3215042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3215484Z outputs = self.rel_attn( 2025-08-14T21:59:45.3215875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward 2025-08-14T21:59:45.3216355Z k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r) 2025-08-14T21:59:45.3216557Z 2025-08-14T21:59:45.3216668Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3217067Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3217405Z return mod(**inputs) 2025-08-14T21:59:45.3217790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3218194Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3218589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3218970Z outputs = layer_module( 2025-08-14T21:59:45.3219327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3219710Z outputs = self.rel_attn( 2025-08-14T21:59:45.3220099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3220532Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3220945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core 2025-08-14T21:59:45.3221427Z bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r) 2025-08-14T21:59:45.3221621Z 2025-08-14T21:59:45.3221739Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3222124Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3222465Z return mod(**inputs) 2025-08-14T21:59:45.3222831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3223228Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3223613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3223995Z outputs = layer_module( 2025-08-14T21:59:45.3224370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3224776Z outputs = self.rel_attn( 2025-08-14T21:59:45.3225156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3225649Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3225814Z 2025-08-14T21:59:45.3225931Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3226303Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3226643Z return mod(**inputs) 2025-08-14T21:59:45.3227026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3227444Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3227864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3228287Z outputs = layer_module( 2025-08-14T21:59:45.3228671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3229077Z outputs = self.rel_attn( 2025-08-14T21:59:45.3229462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3229866Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3230306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core 2025-08-14T21:59:45.3230771Z attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h) 2025-08-14T21:59:45.3230963Z 2025-08-14T21:59:45.3231074Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3231457Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3231798Z return mod(**inputs) 2025-08-14T21:59:45.3232176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3232603Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3233021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3233418Z outputs = layer_module( 2025-08-14T21:59:45.3233809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3234222Z outputs = self.rel_attn( 2025-08-14T21:59:45.3234608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3235054Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3235501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3235963Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3236135Z 2025-08-14T21:59:45.3236250Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3236636Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3236978Z return mod(**inputs) 2025-08-14T21:59:45.3237364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3237844Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3238260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3238674Z outputs = layer_module( 2025-08-14T21:59:45.3239057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3239455Z outputs = self.rel_attn( 2025-08-14T21:59:45.3239842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3240291Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3240743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3241211Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3241394Z 2025-08-14T21:59:45.3241481Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3241709Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3241923Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3242141Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3242392Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3242768Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3243111Z return mod(**inputs) 2025-08-14T21:59:45.3243497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3243931Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3244353Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3244790Z outputs = layer_module( 2025-08-14T21:59:45.3245184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3245597Z outputs = self.rel_attn( 2025-08-14T21:59:45.3245998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.3246444Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.3246608Z 2025-08-14T21:59:45.3246728Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3247114Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3247474Z return mod(**inputs) 2025-08-14T21:59:45.3247870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3248301Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3248720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3249222Z outputs = layer_module( 2025-08-14T21:59:45.3249623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3250060Z outputs = self.rel_attn( 2025-08-14T21:59:45.3250461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.3250917Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.3251087Z 2025-08-14T21:59:45.3251209Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3251603Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3251957Z return mod(**inputs) 2025-08-14T21:59:45.3252356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3252784Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3253212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3253631Z outputs = layer_module( 2025-08-14T21:59:45.3254031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3254441Z outputs = self.rel_attn( 2025-08-14T21:59:45.3254844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3255304Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3255758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core 2025-08-14T21:59:45.3256255Z ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h) 2025-08-14T21:59:45.3256468Z 2025-08-14T21:59:45.3256585Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3256987Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3257341Z return mod(**inputs) 2025-08-14T21:59:45.3257747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3258193Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3258628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3259050Z outputs = layer_module( 2025-08-14T21:59:45.3259456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3259881Z outputs = self.rel_attn( 2025-08-14T21:59:45.3260279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward 2025-08-14T21:59:45.3260802Z k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r) 2025-08-14T21:59:45.3261012Z 2025-08-14T21:59:45.3261124Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3261526Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3261859Z return mod(**inputs) 2025-08-14T21:59:45.3262247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3262680Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3263096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3263490Z outputs = layer_module( 2025-08-14T21:59:45.3263873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3264292Z outputs = self.rel_attn( 2025-08-14T21:59:45.3264684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3265103Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3265555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core 2025-08-14T21:59:45.3266054Z bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r) 2025-08-14T21:59:45.3266248Z 2025-08-14T21:59:45.3266359Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3266756Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3267097Z return mod(**inputs) 2025-08-14T21:59:45.3267494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3267920Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3268345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3268757Z outputs = layer_module( 2025-08-14T21:59:45.3269146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3269562Z outputs = self.rel_attn( 2025-08-14T21:59:45.3269955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3270400Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3270585Z 2025-08-14T21:59:45.3270711Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3271100Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3271450Z return mod(**inputs) 2025-08-14T21:59:45.3271834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3272264Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3272695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3273108Z outputs = layer_module( 2025-08-14T21:59:45.3273490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3273902Z outputs = self.rel_attn( 2025-08-14T21:59:45.3274302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3274716Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3275149Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core 2025-08-14T21:59:45.3275677Z attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h) 2025-08-14T21:59:45.3275861Z 2025-08-14T21:59:45.3275979Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3276355Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3276696Z return mod(**inputs) 2025-08-14T21:59:45.3277085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3277515Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3277921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3278326Z outputs = layer_module( 2025-08-14T21:59:45.3278714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3279160Z outputs = self.rel_attn( 2025-08-14T21:59:45.3279549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3279977Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3280445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3280908Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3281090Z 2025-08-14T21:59:45.3281198Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3281573Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3281916Z return mod(**inputs) 2025-08-14T21:59:45.3282292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3282715Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3283130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3283536Z outputs = layer_module( 2025-08-14T21:59:45.3283919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3284322Z outputs = self.rel_attn( 2025-08-14T21:59:45.3284708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3285134Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3285637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3286116Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3286295Z 2025-08-14T21:59:45.3286388Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3286613Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3286838Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3287061Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3287304Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3287695Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3288044Z return mod(**inputs) 2025-08-14T21:59:45.3288425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3288934Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3289383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3289803Z outputs = layer_module( 2025-08-14T21:59:45.3290195Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3290635Z outputs = self.rel_attn( 2025-08-14T21:59:45.3291038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.3291489Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.3291653Z 2025-08-14T21:59:45.3291766Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3292158Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3292512Z return mod(**inputs) 2025-08-14T21:59:45.3292901Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3293334Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3293759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3294176Z outputs = layer_module( 2025-08-14T21:59:45.3294566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3294982Z outputs = self.rel_attn( 2025-08-14T21:59:45.3295403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.3295847Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.3296020Z 2025-08-14T21:59:45.3296134Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3296527Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3296882Z return mod(**inputs) 2025-08-14T21:59:45.3297271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3297707Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3298136Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3298548Z outputs = layer_module( 2025-08-14T21:59:45.3298948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3299353Z outputs = self.rel_attn( 2025-08-14T21:59:45.3299740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3300140Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3300644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core 2025-08-14T21:59:45.3301128Z ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h) 2025-08-14T21:59:45.3301321Z 2025-08-14T21:59:45.3301439Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3301808Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3302147Z return mod(**inputs) 2025-08-14T21:59:45.3302531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3303159Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3303582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3303996Z outputs = layer_module( 2025-08-14T21:59:45.3304383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3304793Z outputs = self.rel_attn( 2025-08-14T21:59:45.3305188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward 2025-08-14T21:59:45.3305715Z k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r) 2025-08-14T21:59:45.3305911Z 2025-08-14T21:59:45.3306030Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3306401Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3306739Z return mod(**inputs) 2025-08-14T21:59:45.3307120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3307528Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3307949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3308373Z outputs = layer_module( 2025-08-14T21:59:45.3308766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3309181Z outputs = self.rel_attn( 2025-08-14T21:59:45.3309565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3309971Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3310412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core 2025-08-14T21:59:45.3310893Z bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r) 2025-08-14T21:59:45.3311093Z 2025-08-14T21:59:45.3311204Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3311586Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3311920Z return mod(**inputs) 2025-08-14T21:59:45.3312313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3312736Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3313153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3313552Z outputs = layer_module( 2025-08-14T21:59:45.3313940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3314347Z outputs = self.rel_attn( 2025-08-14T21:59:45.3314727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3315166Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3315170Z 2025-08-14T21:59:45.3315318Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3315553Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3315626Z return mod(**inputs) 2025-08-14T21:59:45.3315905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3315991Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3316260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3316341Z outputs = layer_module( 2025-08-14T21:59:45.3316608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3316690Z outputs = self.rel_attn( 2025-08-14T21:59:45.3316952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3317030Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3317318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core 2025-08-14T21:59:45.3317468Z attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h) 2025-08-14T21:59:45.3317472Z 2025-08-14T21:59:45.3317586Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3317796Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3317866Z return mod(**inputs) 2025-08-14T21:59:45.3318143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3318232Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3318497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3318578Z outputs = layer_module( 2025-08-14T21:59:45.3318847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3318926Z outputs = self.rel_attn( 2025-08-14T21:59:45.3319194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3319289Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3319606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3319726Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3319730Z 2025-08-14T21:59:45.3319843Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3320050Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3320120Z return mod(**inputs) 2025-08-14T21:59:45.3320394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3320480Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3320748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3320827Z outputs = layer_module( 2025-08-14T21:59:45.3321091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3321173Z outputs = self.rel_attn( 2025-08-14T21:59:45.3321435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3321530Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3321824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3321984Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3321997Z 2025-08-14T21:59:45.3322091Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3322975Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3323073Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3323165Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3323277Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3323497Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3323578Z return mod(**inputs) 2025-08-14T21:59:45.3323853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3323943Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3324221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3324297Z outputs = layer_module( 2025-08-14T21:59:45.3324575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3324686Z outputs = self.rel_attn( 2025-08-14T21:59:45.3324959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.3325078Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.3325083Z 2025-08-14T21:59:45.3325534Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3325789Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3325888Z return mod(**inputs) 2025-08-14T21:59:45.3326189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3326293Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3326567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3326646Z outputs = layer_module( 2025-08-14T21:59:45.3326920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3326993Z outputs = self.rel_attn( 2025-08-14T21:59:45.3327323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.3327707Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.3327716Z 2025-08-14T21:59:45.3327835Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3328059Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3328134Z return mod(**inputs) 2025-08-14T21:59:45.3328415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3328505Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3328780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3328935Z outputs = layer_module( 2025-08-14T21:59:45.3329216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3329296Z outputs = self.rel_attn( 2025-08-14T21:59:45.3329577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3329658Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3329957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core 2025-08-14T21:59:45.3330145Z ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h) 2025-08-14T21:59:45.3330150Z 2025-08-14T21:59:45.3330262Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3330484Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3330554Z return mod(**inputs) 2025-08-14T21:59:45.3330833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3330924Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3331196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3331278Z outputs = layer_module( 2025-08-14T21:59:45.3331555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3331632Z outputs = self.rel_attn( 2025-08-14T21:59:45.3331911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward 2025-08-14T21:59:45.3332053Z k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r) 2025-08-14T21:59:45.3332076Z 2025-08-14T21:59:45.3332197Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3332411Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3332480Z return mod(**inputs) 2025-08-14T21:59:45.3332765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3332854Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3333135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3333209Z outputs = layer_module( 2025-08-14T21:59:45.3333482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3333564Z outputs = self.rel_attn( 2025-08-14T21:59:45.3333838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3333927Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3334226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core 2025-08-14T21:59:45.3334382Z bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r) 2025-08-14T21:59:45.3334386Z 2025-08-14T21:59:45.3334505Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3334719Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3334789Z return mod(**inputs) 2025-08-14T21:59:45.3335072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3335160Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3335440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3335515Z outputs = layer_module( 2025-08-14T21:59:45.3335784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3335866Z outputs = self.rel_attn( 2025-08-14T21:59:45.3336138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3336244Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3336255Z 2025-08-14T21:59:45.3336365Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3336617Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3336698Z return mod(**inputs) 2025-08-14T21:59:45.3336969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3337059Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3337340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3337412Z outputs = layer_module( 2025-08-14T21:59:45.3337691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3337766Z outputs = self.rel_attn( 2025-08-14T21:59:45.3338034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3338119Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3338409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core 2025-08-14T21:59:45.3338540Z attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h) 2025-08-14T21:59:45.3338562Z 2025-08-14T21:59:45.3338683Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3338899Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3338978Z return mod(**inputs) 2025-08-14T21:59:45.3339256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3339345Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3339631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3339704Z outputs = layer_module( 2025-08-14T21:59:45.3339986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3340065Z outputs = self.rel_attn( 2025-08-14T21:59:45.3340331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3340433Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3340729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3340864Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3340869Z 2025-08-14T21:59:45.3340988Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3341202Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3341278Z return mod(**inputs) 2025-08-14T21:59:45.3341552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3341641Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3341920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3341993Z outputs = layer_module( 2025-08-14T21:59:45.3342265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3342345Z outputs = self.rel_attn( 2025-08-14T21:59:45.3342617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward 2025-08-14T21:59:45.3342719Z output_h = self.post_attention(h, attn_vec) 2025-08-14T21:59:45.3343013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention 2025-08-14T21:59:45.3343163Z attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o) 2025-08-14T21:59:45.3343188Z 2025-08-14T21:59:45.3343278Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3343360Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3343452Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3343531Z cudagraph partition due to non gpu ops 2025-08-14T21:59:45.3343639Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3343852Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3343920Z return mod(**inputs) 2025-08-14T21:59:45.3344183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3344276Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3344540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3344619Z outputs = layer_module( 2025-08-14T21:59:45.3344881Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3344952Z outputs = self.rel_attn( 2025-08-14T21:59:45.3345244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 416, in forward 2025-08-14T21:59:45.3345348Z q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q) 2025-08-14T21:59:45.3345352Z 2025-08-14T21:59:45.3345459Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3345676Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3345745Z return mod(**inputs) 2025-08-14T21:59:45.3346017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3346102Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3346369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3346448Z outputs = layer_module( 2025-08-14T21:59:45.3346712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3346786Z outputs = self.rel_attn( 2025-08-14T21:59:45.3347056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 417, in forward 2025-08-14T21:59:45.3347182Z k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k) 2025-08-14T21:59:45.3347187Z 2025-08-14T21:59:45.3347302Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T21:59:45.3347510Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3347579Z     return mod(**inputs)
2025-08-14T21:59:45.3347850Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3347938Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3348215Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3348287Z     outputs = layer_module(
2025-08-14T21:59:45.3348559Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3348641Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3348916Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
2025-08-14T21:59:45.3348991Z     attn_vec = self.rel_attn_core(
2025-08-14T21:59:45.3349278Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 263, in rel_attn_core
2025-08-14T21:59:45.3349414Z     ac = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_w_bias, k_head_h)
2025-08-14T21:59:45.3349440Z 
2025-08-14T21:59:45.3349575Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:59:45.3349782Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3349852Z     return mod(**inputs)
2025-08-14T21:59:45.3350125Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3350210Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3350478Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3350548Z     outputs = layer_module(
2025-08-14T21:59:45.3350808Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3350886Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3351151Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 422, in forward
2025-08-14T21:59:45.3351288Z     k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r)
2025-08-14T21:59:45.3351326Z 
2025-08-14T21:59:45.3351432Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:59:45.3351639Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3351714Z     return mod(**inputs)
2025-08-14T21:59:45.3351981Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3352067Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3352340Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3352409Z     outputs = layer_module(
2025-08-14T21:59:45.3352680Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3352753Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3353015Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
2025-08-14T21:59:45.3353101Z     attn_vec = self.rel_attn_core(
2025-08-14T21:59:45.3353391Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core
2025-08-14T21:59:45.3353529Z     bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r)
2025-08-14T21:59:45.3353558Z 
2025-08-14T21:59:45.3353668Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:59:45.3353883Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3353961Z     return mod(**inputs)
2025-08-14T21:59:45.3354232Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3354323Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3354604Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3354679Z     outputs = layer_module(
2025-08-14T21:59:45.3354956Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3355028Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3355310Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward
2025-08-14T21:59:45.3355422Z     v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v)
2025-08-14T21:59:45.3355426Z 
2025-08-14T21:59:45.3355530Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T21:59:45.3355737Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3355833Z     return mod(**inputs)
2025-08-14T21:59:45.3356119Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3356215Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3356490Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3356560Z     outputs = layer_module(
2025-08-14T21:59:45.3356831Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3356903Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3357165Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward
2025-08-14T21:59:45.3357250Z     attn_vec = self.rel_attn_core(
2025-08-14T21:59:45.3357530Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core
2025-08-14T21:59:45.3357664Z     attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h)
2025-08-14T21:59:45.3357668Z 
2025-08-14T21:59:45.3357776Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:59:45.3357999Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3358075Z     return mod(**inputs)
2025-08-14T21:59:45.3358337Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward
2025-08-14T21:59:45.3358432Z     transformer_outputs = self.transformer(
2025-08-14T21:59:45.3358695Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward
2025-08-14T21:59:45.3358767Z     outputs = layer_module(
2025-08-14T21:59:45.3359037Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward
2025-08-14T21:59:45.3359112Z     outputs = self.rel_attn(
2025-08-14T21:59:45.3359372Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 440, in forward
2025-08-14T21:59:45.3359473Z     output_h = self.post_attention(h, attn_vec)
2025-08-14T21:59:45.3359759Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 304, in post_attention
2025-08-14T21:59:45.3359882Z     attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o)
2025-08-14T21:59:45.3359886Z 
2025-08-14T21:59:45.3360010Z cudagraph partition due to non gpu ops.
Found from : 2025-08-14T21:59:45.3449620Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3449700Z return mod(**inputs) 2025-08-14T21:59:45.3450003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3450088Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3450345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3450412Z outputs = layer_module( 2025-08-14T21:59:45.3450681Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3450757Z outputs = self.rel_attn( 2025-08-14T21:59:45.3451026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3451144Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3451148Z 2025-08-14T21:59:45.3451257Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3451475Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3451547Z return mod(**inputs) 2025-08-14T21:59:45.3451816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3451911Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3452179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3452289Z outputs = layer_module( 2025-08-14T21:59:45.3452566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3452642Z outputs = self.rel_attn( 2025-08-14T21:59:45.3452927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3453006Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3453303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 294, in rel_attn_core 2025-08-14T21:59:45.3453441Z attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h) 2025-08-14T21:59:45.3453445Z 2025-08-14T21:59:45.3453555Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T21:59:45.3562735Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3562805Z return mod(**inputs) 2025-08-14T21:59:45.3563079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3563166Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3563431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3563512Z outputs = layer_module( 2025-08-14T21:59:45.3563776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3563876Z outputs = self.rel_attn( 2025-08-14T21:59:45.3564149Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 425, in forward 2025-08-14T21:59:45.3564231Z attn_vec = self.rel_attn_core( 2025-08-14T21:59:45.3564535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 266, in rel_attn_core 2025-08-14T21:59:45.3564674Z bd = torch.einsum("ibnd,jbnd->bnij", q_head + self.r_r_bias, k_head_r) 2025-08-14T21:59:45.3564678Z 2025-08-14T21:59:45.3564796Z cudagraph partition due to non gpu ops. Found from : 2025-08-14T21:59:45.3565015Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T21:59:45.3565090Z return mod(**inputs) 2025-08-14T21:59:45.3565373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1607, in forward 2025-08-14T21:59:45.3565465Z transformer_outputs = self.transformer( 2025-08-14T21:59:45.3565740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1368, in forward 2025-08-14T21:59:45.3565821Z outputs = layer_module( 2025-08-14T21:59:45.3566114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 494, in forward 2025-08-14T21:59:45.3566199Z outputs = self.rel_attn( 2025-08-14T21:59:45.3566475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 418, in forward 2025-08-14T21:59:45.3566586Z v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v) 2025-08-14T21:59:45.3566592Z 2025-08-14T21:59:45.3566713Z cudagraph partition due to non gpu ops. 
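Every traceback in the block above bottoms out in one of the torch.einsum call sites of XLNet's relative attention: the q/k/v/r head projections (modeling_xlnet.py lines 416, 417, 418, 422), the content and position scores ac/bd (lines 263, 266), the attention-weighted sum (line 294), and the output projection (line 304). Each "cudagraph partition due to non gpu ops" message points at one of those call sites. The following is a minimal, self-contained sketch of that einsum chain compiled with torch.compile on CPU, as in this job; it is not the transformers implementation (the relative-shift, masking, and memory-concatenation logic is omitted) and the shapes are illustrative assumptions.

# Sketch of the einsum chain named in the traces above (XLNetRelativeAttention-style).
# Assumed illustrative shapes: seq i, batch b, model dim h, heads n, head dim d.
import torch
import torch.nn as nn

class RelAttnSketch(nn.Module):
    def __init__(self, d_model=8, n_head=2, d_head=4):
        super().__init__()
        shape = (d_model, n_head, d_head)
        self.q = nn.Parameter(torch.randn(shape))
        self.k = nn.Parameter(torch.randn(shape))
        self.v = nn.Parameter(torch.randn(shape))
        self.o = nn.Parameter(torch.randn(shape))
        self.r = nn.Parameter(torch.randn(shape))
        self.r_w_bias = nn.Parameter(torch.zeros(n_head, d_head))
        self.r_r_bias = nn.Parameter(torch.zeros(n_head, d_head))

    def forward(self, h, r):
        cat = h  # the real model concatenates cached memory states here
        q_head_h = torch.einsum("ibh,hnd->ibnd", h, self.q)
        k_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.k)
        v_head_h = torch.einsum("ibh,hnd->ibnd", cat, self.v)
        k_head_r = torch.einsum("ibh,hnd->ibnd", r.type(self.r.dtype), self.r)
        # content-based and position-based attention scores (relative shift omitted)
        ac = torch.einsum("ibnd,jbnd->bnij", q_head_h + self.r_w_bias, k_head_h)
        bd = torch.einsum("ibnd,jbnd->bnij", q_head_h + self.r_r_bias, k_head_r)
        attn_prob = torch.softmax(ac + bd, dim=3)
        attn_vec = torch.einsum("bnij,jbnd->ibnd", attn_prob, v_head_h)
        attn_out = torch.einsum("ibnd,hnd->ibh", attn_vec, self.o)
        return attn_out

seq_len, batch, d_model = 4, 2, 8
h = torch.randn(seq_len, batch, d_model)
r = torch.randn(seq_len, batch, d_model)  # positional encodings; same length for simplicity
compiled = torch.compile(RelAttnSketch())
print(compiled(h, r).shape)  # torch.Size([4, 2, 8])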
2025-08-14T21:59:45.3631439Z cudagraph partition due to non gpu ops. Found from :
2025-08-14T21:59:45.3631640Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T21:59:45.3631705Z     return mod(**inputs)
2025-08-14T21:59:45.3631964Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1630, in forward
2025-08-14T21:59:45.3632094Z     loss = loss_fct(logits.view(-1, logits.size(-1)), labels.view(-1))
2025-08-14T21:59:45.3632097Z 
2025-08-14T21:59:59.1314754Z Compilation time (from dynamo_timed): 41.814896747
2025-08-14T21:59:59.1351137Z pass
2025-08-14T21:59:59.1351894Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-08-14T21:59:59.1353416Z TIMING: _recursive_pre_grad_passes:0.0801 _recursive_joint_graph_passes:1.53881 _recursive_post_grad_passes:0.29741 async_compile.wait:0.88753 code_gen:12.19346 inductor_compile:17.18699 backend_compile:34.28951 gc:0.00023 entire_frame_compile:41.8149 total_wall_time:41.8149
2025-08-14T21:59:59.1354469Z STATS: call_* op count: 820 | FakeTensorMode.__torch_dispatch__:99702 | FakeTensor.__torch_dispatch__:14466 | ProxyTorchDispatchMode.__torch_dispatch__:21786
2025-08-14T21:59:59.1354990Z Dynamo produced 1 graphs covering 820 ops with 0 graph breaks (0 unique)
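The TIMING line above attributes the 41.8 s of entire_frame_compile/total_wall_time to the individual Dynamo/Inductor stages (joint-graph passes, code_gen, inductor_compile, backend_compile). As a rough external illustration only, and not the dynamo_timed instrumentation this harness uses, the same first-call compile cost of a torch.compile'd callable can be observed by timing the first (compile-triggering) call against a warm call:

# Hypothetical illustration, not the benchmark harness: the first call of a
# torch.compile'd function pays the compile cost reported above; later calls
# reuse the compiled artifact.
import time
import torch

def f(x):
    return torch.relu(x @ x.T).sum()

compiled = torch.compile(f)
x = torch.randn(256, 256)  # CPU tensor, matching the cpu configs in this job

t0 = time.perf_counter()
compiled(x)                # triggers compilation
t1 = time.perf_counter()
compiled(x)                # runs the cached compiled code
t2 = time.perf_counter()

print(f"first call (includes compile): {t1 - t0:.3f}s")
print(f"steady state:                  {t2 - t1:.3f}s")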
Found from : 2025-08-14T22:00:28.8802933Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8803626Z return mod(**inputs) 2025-08-14T22:00:28.8804088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8804549Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8805006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8805470Z hidden_states = self.encoder( 2025-08-14T22:00:28.8805906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8806350Z layer_outputs = layer_module( 2025-08-14T22:00:28.8806742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8807169Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8807629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8808109Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8808555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8809204Z self_outputs = self.self( 2025-08-14T22:00:28.8809765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8810345Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8810867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8811297Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8811439Z 2025-08-14T22:00:28.8811555Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8811951Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8812310Z return mod(**inputs) 2025-08-14T22:00:28.8812751Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8813189Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8813638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8814087Z hidden_states = self.encoder( 2025-08-14T22:00:28.8814500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8814966Z layer_outputs = layer_module( 2025-08-14T22:00:28.8815346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8815734Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8816150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8816580Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8817006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8817424Z self_outputs = self.self( 2025-08-14T22:00:28.8817828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8818341Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8818868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8819305Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8819446Z 2025-08-14T22:00:28.8819590Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8819988Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8820368Z return mod(**inputs) 2025-08-14T22:00:28.8820777Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8821217Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8821666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8822103Z hidden_states = self.encoder( 2025-08-14T22:00:28.8822525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8822959Z layer_outputs = layer_module( 2025-08-14T22:00:28.8823339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8823728Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8824164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8824609Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8825072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8825520Z self_outputs = self.self( 2025-08-14T22:00:28.8825943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8826466Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8826989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.8827416Z x = self.pointwise(x) 2025-08-14T22:00:28.8827543Z 2025-08-14T22:00:28.8827657Z cudagraph partition due to non gpu ops. 
Found from :
2025-08-14T22:00:28.8828049Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T22:00:28.8828416Z return mod(**inputs)
2025-08-14T22:00:28.8828826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
2025-08-14T22:00:28.8829273Z generator_hidden_states = self.convbert(
2025-08-14T22:00:28.8829715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
2025-08-14T22:00:28.8830168Z hidden_states = self.encoder(
2025-08-14T22:00:28.8830605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
2025-08-14T22:00:28.8831040Z layer_outputs = layer_module(
2025-08-14T22:00:28.8831413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T22:00:28.8831796Z return super().__call__(*args, **kwargs)
2025-08-14T22:00:28.8832232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
2025-08-14T22:00:28.8832677Z self_attention_outputs = self.attention(
2025-08-14T22:00:28.8833107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
2025-08-14T22:00:28.8833534Z self_outputs = self.self(
2025-08-14T22:00:28.8833953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward
2025-08-14T22:00:28.8834439Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer)
2025-08-14T22:00:28.8834624Z
2025-08-14T22:00:28.8834736Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T22:00:28.8835150Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T22:00:28.8835504Z return mod(**inputs)
2025-08-14T22:00:28.8835914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
2025-08-14T22:00:28.8836351Z generator_hidden_states = self.convbert(
2025-08-14T22:00:28.8836780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
2025-08-14T22:00:28.8837196Z hidden_states = self.encoder(
2025-08-14T22:00:28.8837602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
2025-08-14T22:00:28.8838017Z layer_outputs = layer_module(
2025-08-14T22:00:28.8838381Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T22:00:28.8838759Z return super().__call__(*args, **kwargs)
2025-08-14T22:00:28.8839175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
2025-08-14T22:00:28.8839615Z self_attention_outputs = self.attention(
2025-08-14T22:00:28.8840049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
2025-08-14T22:00:28.8840529Z self_outputs = self.self(
2025-08-14T22:00:28.8840940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward
2025-08-14T22:00:28.8841431Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer)
2025-08-14T22:00:28.8841622Z
2025-08-14T22:00:28.8841744Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T22:00:28.8842140Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T22:00:28.8842509Z return mod(**inputs)
2025-08-14T22:00:28.8842917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
2025-08-14T22:00:28.8843358Z generator_hidden_states = self.convbert(
2025-08-14T22:00:28.8843784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
2025-08-14T22:00:28.8844226Z hidden_states = self.encoder(
2025-08-14T22:00:28.8844661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
2025-08-14T22:00:28.8845109Z layer_outputs = layer_module(
2025-08-14T22:00:28.8845481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T22:00:28.8845869Z return super().__call__(*args, **kwargs)
2025-08-14T22:00:28.8846305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
2025-08-14T22:00:28.8846738Z self_attention_outputs = self.attention(
2025-08-14T22:00:28.8847177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
2025-08-14T22:00:28.8847606Z self_outputs = self.self(
2025-08-14T22:00:28.8848025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward
2025-08-14T22:00:28.8848510Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer)
2025-08-14T22:00:28.8848715Z
2025-08-14T22:00:28.8848808Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8849142Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8849401Z cudagraph partition due to non gpu ops.
Found from :
2025-08-14T22:00:28.8849796Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
2025-08-14T22:00:28.8850176Z return mod(**inputs)
2025-08-14T22:00:28.8850592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
2025-08-14T22:00:28.8851029Z generator_hidden_states = self.convbert(
2025-08-14T22:00:28.8851486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
2025-08-14T22:00:28.8851913Z hidden_states = self.encoder(
2025-08-14T22:00:28.8852318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
2025-08-14T22:00:28.8852742Z layer_outputs = layer_module(
2025-08-14T22:00:28.8853114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-08-14T22:00:28.8853500Z return super().__call__(*args, **kwargs)
2025-08-14T22:00:28.8853928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
2025-08-14T22:00:28.8854371Z self_attention_outputs = self.attention(
2025-08-14T22:00:28.8854805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
2025-08-14T22:00:28.8855219Z self_outputs = self.self(
2025-08-14T22:00:28.8855687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward
2025-08-14T22:00:28.8856164Z context_layer = torch.cat([context_layer, conv_out], 2)
2025-08-14T22:00:28.8856341Z
2025-08-14T22:00:28.8856439Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8856665Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8856895Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8857119Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8857333Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8857559Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8857783Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8858004Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8858218Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8858440Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8858659Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8858874Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8859096Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8859317Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8859530Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8859777Z cudagraph partition due to non gpu ops
2025-08-14T22:00:28.8860029Z cudagraph partition due to non gpu ops.
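Every record in this block reports the same diagnostic: the cudagraph partitioning pass split the compiled graph because it encountered ops that do not run on the GPU, and it printed the user stack it attributed the split to. The snippet below is a minimal sketch of how such a mixed graph can arise, assuming a CUDA-capable PyTorch build; the .cpu() round-trip is an artificial stand-in for whatever leaves the device in the benchmarked model, and whether this prints partition messages like the ones above or simply skips CUDA graphs depends on the PyTorch version and on how inductor logging is configured.

import torch


# Minimal, illustrative way to end up with "non gpu ops" inside a compiled
# region. Everything here is an assumption for demonstration, not a
# reproduction of this CI job.
def mixed_device_step(x: torch.Tensor) -> torch.Tensor:
    y = torch.relu(x @ x)           # GPU work before the partition point
    stats = y.mean().cpu()          # non-GPU op inside the compiled region
    return y * stats.to(x.device)   # GPU work after the partition point


if __name__ == "__main__" and torch.cuda.is_available():
    # "reduce-overhead" asks inductor to use CUDA graphs where it can.
    compiled = torch.compile(mixed_device_step, mode="reduce-overhead")
    x = torch.randn(256, 256, device="cuda")
    for _ in range(3):              # warm up so graph capture is attempted
        compiled(x)
    torch.cuda.synchronize()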
Found from : 2025-08-14T22:00:28.8860412Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8860764Z return mod(**inputs) 2025-08-14T22:00:28.8861176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8861622Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8862061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8862499Z hidden_states = self.encoder( 2025-08-14T22:00:28.8862930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8863365Z layer_outputs = layer_module( 2025-08-14T22:00:28.8863733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8864124Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8864561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8865017Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8865463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8865898Z self_outputs = self.self( 2025-08-14T22:00:28.8866326Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8866853Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8867379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8867818Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8867958Z 2025-08-14T22:00:28.8868079Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8868465Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8868823Z return mod(**inputs) 2025-08-14T22:00:28.8869234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8869671Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8870117Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8870623Z hidden_states = self.encoder( 2025-08-14T22:00:28.8871047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8871473Z layer_outputs = layer_module( 2025-08-14T22:00:28.8871837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8872227Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8872659Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8873270Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8873727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8874150Z self_outputs = self.self( 2025-08-14T22:00:28.8874555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8875070Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8875624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8876072Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8876218Z 2025-08-14T22:00:28.8876338Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8876742Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8877109Z return mod(**inputs) 2025-08-14T22:00:28.8877506Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8877956Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8878405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8878843Z hidden_states = self.encoder( 2025-08-14T22:00:28.8879262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8879702Z layer_outputs = layer_module( 2025-08-14T22:00:28.8880083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8880503Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8880932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8881375Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8881812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8882237Z self_outputs = self.self( 2025-08-14T22:00:28.8882651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8883174Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8883691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.8884160Z x = self.pointwise(x) 2025-08-14T22:00:28.8884289Z 2025-08-14T22:00:28.8884404Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8885081Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8885430Z return mod(**inputs) 2025-08-14T22:00:28.8885830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8886322Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8886766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8887191Z hidden_states = self.encoder( 2025-08-14T22:00:28.8887612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8888040Z layer_outputs = layer_module( 2025-08-14T22:00:28.8888413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8888795Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8889301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8889745Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8890184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8890607Z self_outputs = self.self( 2025-08-14T22:00:28.8891020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.8891525Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.8891710Z 2025-08-14T22:00:28.8891823Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8892219Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8892572Z return mod(**inputs) 2025-08-14T22:00:28.8892979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8893419Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8893862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8894300Z hidden_states = self.encoder( 2025-08-14T22:00:28.8894721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8895147Z layer_outputs = layer_module( 2025-08-14T22:00:28.8895526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8895924Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8896373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8896819Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8897262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8897695Z self_outputs = self.self( 2025-08-14T22:00:28.8898109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.8898599Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.8898788Z 2025-08-14T22:00:28.8898908Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8899309Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8899666Z return mod(**inputs) 2025-08-14T22:00:28.8900079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8900524Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8900957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8901414Z hidden_states = self.encoder( 2025-08-14T22:00:28.8901861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8902299Z layer_outputs = layer_module( 2025-08-14T22:00:28.8902913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8903319Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8903763Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8904205Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8904651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8905090Z self_outputs = self.self( 2025-08-14T22:00:28.8905509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.8905998Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.8906203Z 2025-08-14T22:00:28.8906291Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8906603Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8906848Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8907234Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8907584Z return mod(**inputs) 2025-08-14T22:00:28.8907991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8908425Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8908862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8909293Z hidden_states = self.encoder( 2025-08-14T22:00:28.8909718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8910140Z layer_outputs = layer_module( 2025-08-14T22:00:28.8910515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8910911Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8911335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8911815Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8912256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8912684Z self_outputs = self.self( 2025-08-14T22:00:28.8913093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.8913572Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.8913750Z 2025-08-14T22:00:28.8913844Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8914069Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8914296Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8914519Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8914742Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8914958Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8915182Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8915403Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8915615Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8915835Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8916056Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8916268Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8916550Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8916802Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8917014Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8917235Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8917492Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8917887Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8918233Z return mod(**inputs) 2025-08-14T22:00:28.8918647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8919094Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8919525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8919959Z hidden_states = self.encoder( 2025-08-14T22:00:28.8920382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8920811Z layer_outputs = layer_module( 2025-08-14T22:00:28.8921173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8921591Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8922025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8922474Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8922909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8923341Z self_outputs = self.self( 2025-08-14T22:00:28.8923761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8924285Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8924817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8925260Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8925402Z 2025-08-14T22:00:28.8925523Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8925912Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8926263Z return mod(**inputs) 2025-08-14T22:00:28.8926697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8927144Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8927575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8928009Z hidden_states = self.encoder( 2025-08-14T22:00:28.8928436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8928944Z layer_outputs = layer_module( 2025-08-14T22:00:28.8929337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8929735Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8930174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8930695Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8931138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8931570Z self_outputs = self.self( 2025-08-14T22:00:28.8931981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8932562Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8933086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8933524Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8933663Z 2025-08-14T22:00:28.8933777Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8934170Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8934522Z return mod(**inputs) 2025-08-14T22:00:28.8934932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8935366Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8935809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8936247Z hidden_states = self.encoder( 2025-08-14T22:00:28.8936665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8937119Z layer_outputs = layer_module( 2025-08-14T22:00:28.8937503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8937887Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8938305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8938734Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8939161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8939586Z self_outputs = self.self( 2025-08-14T22:00:28.8939981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8940499Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8941021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.8941433Z x = self.pointwise(x) 2025-08-14T22:00:28.8941559Z 2025-08-14T22:00:28.8941669Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8942070Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8942412Z return mod(**inputs) 2025-08-14T22:00:28.8942801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8943228Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8943655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8944074Z hidden_states = self.encoder( 2025-08-14T22:00:28.8944477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8944894Z layer_outputs = layer_module( 2025-08-14T22:00:28.8945255Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8945628Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8946049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8946481Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8946905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8947369Z self_outputs = self.self( 2025-08-14T22:00:28.8947790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.8948278Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.8948465Z 2025-08-14T22:00:28.8948596Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8948970Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8949316Z return mod(**inputs) 2025-08-14T22:00:28.8949714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8950140Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8950571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8950999Z hidden_states = self.encoder( 2025-08-14T22:00:28.8951412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8951857Z layer_outputs = layer_module( 2025-08-14T22:00:28.8952225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8952621Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8953045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8953481Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8953915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8954336Z self_outputs = self.self( 2025-08-14T22:00:28.8954742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.8955211Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.8955392Z 2025-08-14T22:00:28.8955512Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8955897Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8956234Z return mod(**inputs) 2025-08-14T22:00:28.8956658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8957092Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8957516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8957950Z hidden_states = self.encoder( 2025-08-14T22:00:28.8958382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8958824Z layer_outputs = layer_module( 2025-08-14T22:00:28.8959187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8959570Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8959997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8960423Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8960857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8961275Z self_outputs = self.self( 2025-08-14T22:00:28.8961680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.8962172Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.8962385Z 2025-08-14T22:00:28.8962472Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8962701Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8962954Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8963334Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8963675Z return mod(**inputs) 2025-08-14T22:00:28.8964077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8964506Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8964937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8965360Z hidden_states = self.encoder( 2025-08-14T22:00:28.8965775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8966193Z layer_outputs = layer_module( 2025-08-14T22:00:28.8966559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8966967Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8967397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8967842Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8968284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8968713Z self_outputs = self.self( 2025-08-14T22:00:28.8969219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.8969702Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.8969883Z 2025-08-14T22:00:28.8969980Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8970211Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8970432Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8970656Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8970886Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8971100Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8971320Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8971541Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8971799Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8972027Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8972250Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8972464Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8972687Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8972911Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8973134Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8973350Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.8973606Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8974001Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8974349Z return mod(**inputs) 2025-08-14T22:00:28.8974763Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8975213Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8975646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8976081Z hidden_states = self.encoder( 2025-08-14T22:00:28.8976502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8976962Z layer_outputs = layer_module( 2025-08-14T22:00:28.8977362Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8977761Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8978201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8978638Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8979070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8979498Z self_outputs = self.self( 2025-08-14T22:00:28.8979917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8980428Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8980960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8981396Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8981555Z 2025-08-14T22:00:28.8981675Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8982067Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8982430Z return mod(**inputs) 2025-08-14T22:00:28.8982847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8983299Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8983717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8984146Z hidden_states = self.encoder( 2025-08-14T22:00:28.8984563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8984972Z layer_outputs = layer_module( 2025-08-14T22:00:28.8985336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8985717Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8986137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8986557Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8987001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8987422Z self_outputs = self.self( 2025-08-14T22:00:28.8987817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8988326Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8988836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.8989281Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.8989420Z 2025-08-14T22:00:28.8989531Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8989923Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8990277Z return mod(**inputs) 2025-08-14T22:00:28.8990686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8991121Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8991561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.8992023Z hidden_states = self.encoder( 2025-08-14T22:00:28.8992458Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.8992892Z layer_outputs = layer_module( 2025-08-14T22:00:28.8993262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.8993667Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.8994096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.8994548Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.8994987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.8995425Z self_outputs = self.self( 2025-08-14T22:00:28.8995829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.8996352Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.8996871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.8997322Z x = self.pointwise(x) 2025-08-14T22:00:28.8997449Z 2025-08-14T22:00:28.8997560Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.8997950Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.8998300Z return mod(**inputs) 2025-08-14T22:00:28.8998697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.8999139Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.8999589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9000012Z hidden_states = self.encoder( 2025-08-14T22:00:28.9000415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9000834Z layer_outputs = layer_module( 2025-08-14T22:00:28.9001196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9001570Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9002018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9002450Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9003078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9003501Z self_outputs = self.self( 2025-08-14T22:00:28.9003908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9004375Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9004560Z 2025-08-14T22:00:28.9004679Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9005053Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9005396Z return mod(**inputs) 2025-08-14T22:00:28.9005796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9006220Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9006648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9007131Z hidden_states = self.encoder( 2025-08-14T22:00:28.9007572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9007984Z layer_outputs = layer_module( 2025-08-14T22:00:28.9008346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9008730Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9009211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9009661Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9010101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9010531Z self_outputs = self.self( 2025-08-14T22:00:28.9010940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9011409Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9011589Z 2025-08-14T22:00:28.9011708Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9012129Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9012465Z return mod(**inputs) 2025-08-14T22:00:28.9012859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9013290Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9013708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9014128Z hidden_states = self.encoder( 2025-08-14T22:00:28.9014534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9014952Z layer_outputs = layer_module( 2025-08-14T22:00:28.9015304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9015681Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9016101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9016526Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9016980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9017401Z self_outputs = self.self( 2025-08-14T22:00:28.9017804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.9018282Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.9018484Z 2025-08-14T22:00:28.9018573Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9018809Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9019058Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9019431Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9019775Z return mod(**inputs) 2025-08-14T22:00:28.9020181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9020638Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9021071Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9021514Z hidden_states = self.encoder( 2025-08-14T22:00:28.9021937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9022409Z layer_outputs = layer_module( 2025-08-14T22:00:28.9022783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9023184Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9023613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9024060Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9024513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9024956Z self_outputs = self.self( 2025-08-14T22:00:28.9025376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.9025863Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.9026042Z 2025-08-14T22:00:28.9026137Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9026370Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9026607Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9026833Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9027161Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9027376Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9027601Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9027825Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9028041Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9028270Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9028497Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9028713Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9028942Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9029168Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9029392Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9029612Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9029870Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9030267Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9030619Z return mod(**inputs) 2025-08-14T22:00:28.9031043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9031503Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9032762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9033213Z hidden_states = self.encoder( 2025-08-14T22:00:28.9033653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9034095Z layer_outputs = layer_module( 2025-08-14T22:00:28.9034464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9034857Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9035308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9035751Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9036186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9036626Z self_outputs = self.self( 2025-08-14T22:00:28.9037041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9037593Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9038158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9038620Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9038759Z 2025-08-14T22:00:28.9038880Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9039266Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9039625Z return mod(**inputs) 2025-08-14T22:00:28.9040025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9040461Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9040894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9041330Z hidden_states = self.encoder( 2025-08-14T22:00:28.9041756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9042185Z layer_outputs = layer_module( 2025-08-14T22:00:28.9042562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9042983Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9043426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9043850Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9044279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9044705Z self_outputs = self.self( 2025-08-14T22:00:28.9045124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9045638Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9046166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9046604Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9046745Z 2025-08-14T22:00:28.9046859Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
    generator_hidden_states = self.convbert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
    hidden_states = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward
    mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2))
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward
    x = self.pointwise(x)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
    generator_hidden_states = self.convbert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
    hidden_states = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward
    conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer)

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
    generator_hidden_states = self.convbert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
    hidden_states = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward
    conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward
    generator_hidden_states = self.convbert(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward
    hidden_states = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward
    self_attention_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward
    context_layer = torch.cat([context_layer, conv_out], 2)

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
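For orientation, the frames at modeling_convbert.py lines 282-283 point into a depthwise-separable 1-D convolution: a depthwise Conv1d followed by a pointwise (1x1) Conv1d, called on hidden_states.transpose(1, 2) as the line-347 frame shows. The sketch below is illustrative only; the class and sizes are assumptions chosen to mirror the frame names, not the transformers source.

# Illustrative sketch only: a SeparableConv1D-style block assumed to match the
# self.depthwise / self.pointwise frames above; not the transformers implementation.
import torch
import torch.nn as nn

class SeparableConv1d(nn.Module):
    def __init__(self, in_channels: int, out_channels: int, kernel_size: int):
        super().__init__()
        # groups=in_channels makes the first conv depthwise (one filter per channel)
        self.depthwise = nn.Conv1d(in_channels, in_channels, kernel_size,
                                   padding=kernel_size // 2, groups=in_channels, bias=False)
        # the 1x1 conv then mixes channels ("pointwise")
        self.pointwise = nn.Conv1d(in_channels, out_channels, kernel_size=1, bias=False)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        x = self.depthwise(hidden_states)  # analogue of the line-282 frame
        x = self.pointwise(x)              # analogue of the line-283 frame
        return x

# Usage with a (batch, channels, seq_len) layout, as after transpose(1, 2);
# the channel count and kernel size here are arbitrary, not ConvBERT's.
sep = SeparableConv1d(in_channels=768, out_channels=768, kernel_size=9)
out = sep(torch.randn(2, 768, 128))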
Found from : 2025-08-14T22:00:28.9237029Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9237111Z return mod(**inputs) 2025-08-14T22:00:28.9237402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9237489Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9237786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9237864Z hidden_states = self.encoder( 2025-08-14T22:00:28.9238153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9238237Z layer_outputs = layer_module( 2025-08-14T22:00:28.9238480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9238572Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9238860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9238971Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9239267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9239344Z self_outputs = self.self( 2025-08-14T22:00:28.9239641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9239771Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9239775Z 2025-08-14T22:00:28.9239886Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9240117Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9240188Z return mod(**inputs) 2025-08-14T22:00:28.9240480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9240578Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9240876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9240984Z hidden_states = self.encoder( 2025-08-14T22:00:28.9241273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9241351Z layer_outputs = layer_module( 2025-08-14T22:00:28.9241594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9241680Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9241972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9242060Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9242345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9242428Z self_outputs = self.self( 2025-08-14T22:00:28.9242732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9242857Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9242861Z 2025-08-14T22:00:28.9242981Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9243193Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9243296Z return mod(**inputs) 2025-08-14T22:00:28.9243614Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9243705Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9244003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9244083Z hidden_states = self.encoder( 2025-08-14T22:00:28.9244390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9244468Z layer_outputs = layer_module( 2025-08-14T22:00:28.9244703Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9244799Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9245103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9245193Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9245486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9245586Z self_outputs = self.self( 2025-08-14T22:00:28.9245885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.9246024Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.9246030Z 2025-08-14T22:00:28.9246116Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9246211Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9246323Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9246537Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9246618Z return mod(**inputs) 2025-08-14T22:00:28.9246913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9247008Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9247298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9247375Z hidden_states = self.encoder( 2025-08-14T22:00:28.9247693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9247772Z layer_outputs = layer_module( 2025-08-14T22:00:28.9248023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9248107Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9248413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9248510Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9248798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9248960Z self_outputs = self.self( 2025-08-14T22:00:28.9249270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.9249391Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.9249399Z 2025-08-14T22:00:28.9249502Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9249588Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9249673Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9249764Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9249846Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9249957Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250078Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250164Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250246Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250337Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250420Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250511Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250592Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250673Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250765Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250847Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9250958Z cudagraph partition due to non gpu ops. 
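[editor's note] The frames these partition messages keep landing on are ConvBert's convolution-based attention path: a separable convolution (self.depthwise followed by self.pointwise) applied to the transposed hidden states via self.key_conv_attn_layer, then conv_kernel_layer, a torch.matmul, and the final torch.cat. A minimal sketch of the separable-conv piece only, assuming plain nn.Conv1d modules; the attribute names mirror the traceback frames, but this is an illustration, not the transformers implementation.

    import torch
    import torch.nn as nn

    class SeparableConv1d(nn.Module):
        """Depthwise conv followed by a 1x1 pointwise conv, as referenced by the
        key_conv_attn_layer frames above (illustrative sketch only)."""
        def __init__(self, channels, out_channels, kernel_size=9):
            super().__init__()
            self.depthwise = nn.Conv1d(channels, channels, kernel_size,
                                       groups=channels, padding=kernel_size // 2,
                                       bias=False)
            self.pointwise = nn.Conv1d(channels, out_channels, kernel_size=1,
                                       bias=False)

        def forward(self, hidden_states):         # (batch, channels, seq_len)
            x = self.depthwise(hidden_states)     # cf. modeling_convbert.py line 282
            x = self.pointwise(x)                 # cf. modeling_convbert.py line 283
            return x

    hidden_states = torch.randn(2, 64, 128)       # (batch, seq_len, hidden_size), assumed shapes
    layer = SeparableConv1d(128, 128)
    # Mirrors: mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2))
    mixed = layer(hidden_states.transpose(1, 2))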
Found from : 2025-08-14T22:00:28.9251191Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9251260Z return mod(**inputs) 2025-08-14T22:00:28.9251553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9251638Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9251920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9252029Z hidden_states = self.encoder( 2025-08-14T22:00:28.9252309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9252384Z layer_outputs = layer_module( 2025-08-14T22:00:28.9252629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9252711Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9252997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9253085Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9253368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9253452Z self_outputs = self.self( 2025-08-14T22:00:28.9253732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9253894Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9254208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9254292Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9254296Z 2025-08-14T22:00:28.9254414Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9254622Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9254694Z return mod(**inputs) 2025-08-14T22:00:28.9254985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9255070Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9255361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9255436Z hidden_states = self.encoder( 2025-08-14T22:00:28.9255717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9255799Z layer_outputs = layer_module( 2025-08-14T22:00:28.9256032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9256113Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9256420Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9256526Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9256810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9256886Z self_outputs = self.self( 2025-08-14T22:00:28.9257170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9257346Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9257640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9257727Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9257730Z 2025-08-14T22:00:28.9257838Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9258049Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9258127Z return mod(**inputs) 2025-08-14T22:00:28.9258406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9258515Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9258809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9258885Z hidden_states = self.encoder( 2025-08-14T22:00:28.9259173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9259249Z layer_outputs = layer_module( 2025-08-14T22:00:28.9259481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9259572Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9259850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9259943Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9260225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9260298Z self_outputs = self.self( 2025-08-14T22:00:28.9260603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9260767Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9261044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.9261126Z x = self.pointwise(x) 2025-08-14T22:00:28.9261132Z 2025-08-14T22:00:28.9261240Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9261460Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9261533Z return mod(**inputs) 2025-08-14T22:00:28.9261820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9261916Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9262201Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9262284Z hidden_states = self.encoder( 2025-08-14T22:00:28.9262569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9262656Z layer_outputs = layer_module( 2025-08-14T22:00:28.9262896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9263013Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9263291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9263385Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9263660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9263740Z self_outputs = self.self( 2025-08-14T22:00:28.9264016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9264140Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9264143Z 2025-08-14T22:00:28.9264258Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9264464Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9264543Z return mod(**inputs) 2025-08-14T22:00:28.9264823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9264926Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9265214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9265289Z hidden_states = self.encoder( 2025-08-14T22:00:28.9265568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9265651Z layer_outputs = layer_module( 2025-08-14T22:00:28.9265881Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9265969Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9266254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9266339Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9266631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9266704Z self_outputs = self.self( 2025-08-14T22:00:28.9266988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9267128Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9267132Z 2025-08-14T22:00:28.9267241Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9267455Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9267525Z return mod(**inputs) 2025-08-14T22:00:28.9267822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9267918Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9268207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9268293Z hidden_states = self.encoder( 2025-08-14T22:00:28.9268581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9268659Z layer_outputs = layer_module( 2025-08-14T22:00:28.9268909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9268992Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9269290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9269392Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9269697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9269783Z self_outputs = self.self( 2025-08-14T22:00:28.9270120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.9270258Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.9270262Z 2025-08-14T22:00:28.9270358Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9270444Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9270563Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9270779Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9270852Z return mod(**inputs) 2025-08-14T22:00:28.9271158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9271247Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9271533Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9271635Z hidden_states = self.encoder( 2025-08-14T22:00:28.9271921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9272004Z layer_outputs = layer_module( 2025-08-14T22:00:28.9272242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9272325Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9272621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9272710Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9273005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9273082Z self_outputs = self.self( 2025-08-14T22:00:28.9273369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.9273494Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.9273498Z 2025-08-14T22:00:28.9273583Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9273685Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9273777Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9273858Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9273948Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274029Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274112Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274201Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274282Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274362Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274454Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274533Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274614Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274702Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274782Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274865Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9274988Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9275203Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9275283Z return mod(**inputs) 2025-08-14T22:00:28.9275572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9275709Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9276002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9276083Z hidden_states = self.encoder( 2025-08-14T22:00:28.9276374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9276451Z layer_outputs = layer_module( 2025-08-14T22:00:28.9276698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9276789Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9277074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9277163Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9277461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9277537Z self_outputs = self.self( 2025-08-14T22:00:28.9277831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9278021Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9278383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9278479Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9278482Z 2025-08-14T22:00:28.9278594Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9278825Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9278898Z return mod(**inputs) 2025-08-14T22:00:28.9279196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9279290Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9279576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9279656Z hidden_states = self.encoder( 2025-08-14T22:00:28.9279950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9280042Z layer_outputs = layer_module( 2025-08-14T22:00:28.9280289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9280371Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9280655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9280754Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9281038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9281121Z self_outputs = self.self( 2025-08-14T22:00:28.9281405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9281567Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9281861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9281944Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9281947Z 2025-08-14T22:00:28.9282056Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9282275Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9282386Z return mod(**inputs) 2025-08-14T22:00:28.9282679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9282769Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9283055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9283141Z hidden_states = self.encoder( 2025-08-14T22:00:28.9283426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9283510Z layer_outputs = layer_module( 2025-08-14T22:00:28.9283746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9283830Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9284126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9284213Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9284498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9284605Z self_outputs = self.self( 2025-08-14T22:00:28.9284895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9285064Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9285343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.9285418Z x = self.pointwise(x) 2025-08-14T22:00:28.9285422Z 2025-08-14T22:00:28.9285539Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9285749Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9285827Z return mod(**inputs) 2025-08-14T22:00:28.9286109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9286199Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9286497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9286577Z hidden_states = self.encoder( 2025-08-14T22:00:28.9286887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9286974Z layer_outputs = layer_module( 2025-08-14T22:00:28.9287211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9287304Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9287587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9287674Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9287964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9288038Z self_outputs = self.self( 2025-08-14T22:00:28.9288320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9288455Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9288458Z 2025-08-14T22:00:28.9288567Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9288786Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9288960Z return mod(**inputs) 2025-08-14T22:00:28.9289286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9289390Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9289678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9289766Z hidden_states = self.encoder( 2025-08-14T22:00:28.9290050Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9290128Z layer_outputs = layer_module( 2025-08-14T22:00:28.9290374Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9290456Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9290741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9290838Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9291123Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9291233Z self_outputs = self.self( 2025-08-14T22:00:28.9291521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9291648Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9291654Z 2025-08-14T22:00:28.9291773Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9291988Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9292070Z return mod(**inputs) 2025-08-14T22:00:28.9292356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9292446Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9292741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9292820Z hidden_states = self.encoder( 2025-08-14T22:00:28.9293106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9293191Z layer_outputs = layer_module( 2025-08-14T22:00:28.9293452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9293544Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9293827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9293912Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9294208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9294283Z self_outputs = self.self( 2025-08-14T22:00:28.9294577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.9294714Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.9294718Z 2025-08-14T22:00:28.9294802Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9294895Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9295007Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9295221Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9295300Z return mod(**inputs) 2025-08-14T22:00:28.9295588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9295742Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9296031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9296110Z hidden_states = self.encoder( 2025-08-14T22:00:28.9296409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9296486Z layer_outputs = layer_module( 2025-08-14T22:00:28.9296725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9296815Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9297099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9297192Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9297480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9297556Z self_outputs = self.self( 2025-08-14T22:00:28.9297850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.9297999Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.9298003Z 2025-08-14T22:00:28.9298090Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298172Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298254Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298341Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298421Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298499Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298588Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298668Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298751Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298837Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9298916Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299000Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299081Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299159Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299245Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299323Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9299462Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9299679Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9299749Z return mod(**inputs) 2025-08-14T22:00:28.9300027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9300121Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9300399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9300485Z hidden_states = self.encoder( 2025-08-14T22:00:28.9300765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9300841Z layer_outputs = layer_module( 2025-08-14T22:00:28.9301080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9301163Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9301449Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9301533Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9301808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9301960Z self_outputs = self.self( 2025-08-14T22:00:28.9302242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9302407Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9302862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9302948Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9302955Z 2025-08-14T22:00:28.9303087Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9303297Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9303369Z return mod(**inputs) 2025-08-14T22:00:28.9303655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9303745Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9304037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9304166Z hidden_states = self.encoder( 2025-08-14T22:00:28.9304445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9304527Z layer_outputs = layer_module( 2025-08-14T22:00:28.9304759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9304841Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9305130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9305215Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9305504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9305579Z self_outputs = self.self( 2025-08-14T22:00:28.9305861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9306031Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9306339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 282, in forward 2025-08-14T22:00:28.9306430Z x = self.depthwise(hidden_states) 2025-08-14T22:00:28.9306434Z 2025-08-14T22:00:28.9306545Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9306753Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9306835Z return mod(**inputs) 2025-08-14T22:00:28.9307115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9307203Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9307495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9307572Z hidden_states = self.encoder( 2025-08-14T22:00:28.9307857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9307933Z layer_outputs = layer_module( 2025-08-14T22:00:28.9308165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9308254Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9308532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9308692Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9308972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9309045Z self_outputs = self.self( 2025-08-14T22:00:28.9309330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 347, in forward 2025-08-14T22:00:28.9309489Z mixed_key_conv_attn_layer = self.key_conv_attn_layer(hidden_states.transpose(1, 2)) 2025-08-14T22:00:28.9309769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 283, in forward 2025-08-14T22:00:28.9309852Z x = self.pointwise(x) 2025-08-14T22:00:28.9309855Z 2025-08-14T22:00:28.9309962Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9310173Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9310244Z return mod(**inputs) 2025-08-14T22:00:28.9310520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9310632Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9310910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9310991Z hidden_states = self.encoder( 2025-08-14T22:00:28.9311269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9311342Z layer_outputs = layer_module( 2025-08-14T22:00:28.9311579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9311659Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9311939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9312032Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9312307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9312388Z self_outputs = self.self( 2025-08-14T22:00:28.9312662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9312806Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9312811Z 2025-08-14T22:00:28.9312926Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9313136Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9313213Z return mod(**inputs) 2025-08-14T22:00:28.9313498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9313584Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9313868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9313946Z hidden_states = self.encoder( 2025-08-14T22:00:28.9314223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9314305Z layer_outputs = layer_module( 2025-08-14T22:00:28.9314540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9314629Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9314908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9315015Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9315322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9315399Z self_outputs = self.self( 2025-08-14T22:00:28.9315714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 362, in forward 2025-08-14T22:00:28.9315846Z conv_kernel_layer = self.conv_kernel_layer(conv_attn_layer) 2025-08-14T22:00:28.9315850Z 2025-08-14T22:00:28.9315960Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9316175Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9316246Z return mod(**inputs) 2025-08-14T22:00:28.9316526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9316623Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9316925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9317008Z hidden_states = self.encoder( 2025-08-14T22:00:28.9317303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9317377Z layer_outputs = layer_module( 2025-08-14T22:00:28.9317618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9317700Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9317978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9318067Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9318333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9318416Z self_outputs = self.self( 2025-08-14T22:00:28.9318693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 380, in forward 2025-08-14T22:00:28.9318828Z conv_out_layer = torch.matmul(conv_out_layer, conv_kernel_layer) 2025-08-14T22:00:28.9318832Z 2025-08-14T22:00:28.9318922Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9319005Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9319136Z cudagraph partition due to non gpu ops. 
Found from : 2025-08-14T22:00:28.9319345Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 532, in forward_pass 2025-08-14T22:00:28.9319413Z return mod(**inputs) 2025-08-14T22:00:28.9319722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 925, in forward 2025-08-14T22:00:28.9319811Z generator_hidden_states = self.convbert( 2025-08-14T22:00:28.9320098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 853, in forward 2025-08-14T22:00:28.9320184Z hidden_states = self.encoder( 2025-08-14T22:00:28.9320479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 625, in forward 2025-08-14T22:00:28.9320562Z layer_outputs = layer_module( 2025-08-14T22:00:28.9320808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-08-14T22:00:28.9320902Z return super().__call__(*args, **kwargs) 2025-08-14T22:00:28.9321173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 561, in forward 2025-08-14T22:00:28.9321254Z self_attention_outputs = self.attention( 2025-08-14T22:00:28.9321516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 464, in forward 2025-08-14T22:00:28.9321628Z self_outputs = self.self( 2025-08-14T22:00:28.9321891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/convbert/modeling_convbert.py", line 405, in forward 2025-08-14T22:00:28.9322010Z context_layer = torch.cat([context_layer, conv_out], 2) 2025-08-14T22:00:28.9322014Z 2025-08-14T22:00:28.9322093Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322173Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322257Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322335Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322410Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322492Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322566Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322646Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322721Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322798Z cudagraph partition due to non gpu ops 2025-08-14T22:00:28.9322886Z cudagraph partition due to non gpu ops 2025-08-14T22:00:39.7748737Z Compilation time (from dynamo_timed): 28.991592365 2025-08-14T22:00:39.7788294Z pass 2025-08-14T22:00:39.7788723Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-08-14T22:00:39.7789610Z TIMING: _recursive_pre_grad_passes:0.06621 _recursive_joint_graph_passes:0.69789 _recursive_post_grad_passes:0.20261 async_compile.wait:0.6569 code_gen:10.82675 inductor_compile:13.2279 backend_compile:24.0053 gc:0.00041 entire_frame_compile:28.99159 total_wall_time:28.99159 2025-08-14T22:00:39.7790719Z STATS: call_* op count: 636 | FakeTensorMode.__torch_dispatch__:51000 | FakeTensor.__torch_dispatch__:5715 | ProxyTorchDispatchMode.__torch_dispatch__:12594 2025-08-14T22:00:39.7791275Z Dynamo produced 1 graphs covering 636 ops with 0 graph breaks (0 unique) 2025-08-14T22:00:42.3611701Z accuracy pass_rate=95.35% 2025-08-14T22:00:42.3621510Z calls_captured gmean=0.00x mean=612.116x 2025-08-14T22:00:42.3621828Z unique_graphs gmean=0.00x mean=1.140x 
2025-08-14T22:00:42.3625110Z graph_breaks gmean=0.00x mean=0.140x 2025-08-14T22:00:42.3625435Z unique_graph_breaks gmean=0.00x mean=0.047x 2025-08-14T22:00:42.3628747Z autograd_captures gmean=0.00x mean=0.000x 2025-08-14T22:00:42.3634330Z autograd_compiles gmean=0.00x mean=0.000x 2025-08-14T22:00:42.3634636Z cudagraph_skips gmean=0.00x mean=1.093x 2025-08-14T22:00:42.3638155Z compilation_latency mean=24.948 seconds 2025-08-14T22:00:43.4908013Z + python benchmarks/dynamo/check_accuracy.py --actual /var/lib/jenkins/workspace/test/test-reports/inference_huggingface.csv --expected benchmarks/dynamo/ci_expected_accuracy/cpu_inductor_amp_freezing_huggingface_inference.csv 2025-08-14T22:00:43.8235600Z AlbertForMaskedLM PASS 2025-08-14T22:00:43.8235911Z AlbertForQuestionAnswering PASS 2025-08-14T22:00:43.8236181Z AllenaiLongformerBase PASS 2025-08-14T22:00:43.8239127Z BartForCausalLM PASS 2025-08-14T22:00:43.8241052Z BartForConditionalGeneration PASS 2025-08-14T22:00:43.8244896Z BertForMaskedLM PASS 2025-08-14T22:00:43.8248185Z BertForQuestionAnswering PASS 2025-08-14T22:00:43.8256593Z BlenderbotForCausalLM XFAIL 2025-08-14T22:00:43.8263634Z BlenderbotSmallForCausalLM PASS 2025-08-14T22:00:43.8264137Z BlenderbotSmallForConditionalGeneration PASS 2025-08-14T22:00:43.8264560Z CamemBert PASS 2025-08-14T22:00:43.8268816Z DebertaV2ForMaskedLM XFAIL 2025-08-14T22:00:43.8271306Z DebertaV2ForQuestionAnswering PASS 2025-08-14T22:00:43.8273472Z DistilBertForMaskedLM PASS 2025-08-14T22:00:43.8278534Z DistilBertForQuestionAnswering PASS 2025-08-14T22:00:43.8282644Z DistillGPT2 PASS 2025-08-14T22:00:43.8284787Z ElectraForCausalLM PASS 2025-08-14T22:00:43.8288661Z ElectraForQuestionAnswering PASS 2025-08-14T22:00:43.8302409Z GPT2ForSequenceClassification PASS 2025-08-14T22:00:43.8302975Z GoogleFnet PASS 2025-08-14T22:00:43.8307973Z LayoutLMForMaskedLM PASS 2025-08-14T22:00:43.8311796Z LayoutLMForSequenceClassification PASS 2025-08-14T22:00:43.8315191Z M2M100ForConditionalGeneration PASS 2025-08-14T22:00:43.8315488Z MBartForCausalLM PASS 2025-08-14T22:00:43.8323026Z MBartForConditionalGeneration PASS 2025-08-14T22:00:43.8323326Z MT5ForConditionalGeneration PASS 2025-08-14T22:00:43.8326530Z MegatronBertForCausalLM PASS 2025-08-14T22:00:43.8326800Z MegatronBertForQuestionAnswering PASS 2025-08-14T22:00:43.8335850Z MobileBertForMaskedLM PASS 2025-08-14T22:00:43.8336184Z MobileBertForQuestionAnswering PASS 2025-08-14T22:00:43.8343480Z OPTForCausalLM PASS 2025-08-14T22:00:43.8348456Z PLBartForCausalLM PASS 2025-08-14T22:00:43.8348767Z PLBartForConditionalGeneration PASS 2025-08-14T22:00:43.8349024Z PegasusForCausalLM PASS 2025-08-14T22:00:43.8354770Z PegasusForConditionalGeneration PASS 2025-08-14T22:00:43.8360705Z RobertaForCausalLM PASS 2025-08-14T22:00:43.8360997Z RobertaForQuestionAnswering PASS 2025-08-14T22:00:43.8365200Z T5ForConditionalGeneration PASS 2025-08-14T22:00:43.8365573Z T5Small PASS 2025-08-14T22:00:43.8375286Z TrOCRForCausalLM PASS 2025-08-14T22:00:43.8380585Z XGLMForCausalLM PASS 2025-08-14T22:00:43.8386408Z XLNetLMHeadModel PASS 2025-08-14T22:00:43.8386685Z YituTechConvBert PASS 2025-08-14T22:00:43.8955422Z + python benchmarks/dynamo/check_graph_breaks.py --actual /var/lib/jenkins/workspace/test/test-reports/inference_huggingface.csv --expected benchmarks/dynamo/ci_expected_accuracy/cpu_inductor_amp_freezing_huggingface_inference.csv 2025-08-14T22:00:44.2235209Z AlbertForMaskedLM PASS 2025-08-14T22:00:44.2235537Z AlbertForQuestionAnswering PASS 2025-08-14T22:00:44.2237984Z AllenaiLongformerBase 
PASS 2025-08-14T22:00:44.2244310Z BartForCausalLM PASS 2025-08-14T22:00:44.2244587Z BartForConditionalGeneration PASS 2025-08-14T22:00:44.2248470Z BertForMaskedLM PASS 2025-08-14T22:00:44.2253912Z BertForQuestionAnswering PASS 2025-08-14T22:00:44.2263883Z BlenderbotForCausalLM PASS 2025-08-14T22:00:44.2264565Z BlenderbotSmallForCausalLM PASS 2025-08-14T22:00:44.2264988Z BlenderbotSmallForConditionalGeneration PASS 2025-08-14T22:00:44.2265340Z CamemBert PASS 2025-08-14T22:00:44.2271430Z DebertaV2ForMaskedLM PASS 2025-08-14T22:00:44.2271759Z DebertaV2ForQuestionAnswering PASS 2025-08-14T22:00:44.2274731Z DistilBertForMaskedLM PASS 2025-08-14T22:00:44.2277854Z DistilBertForQuestionAnswering PASS 2025-08-14T22:00:44.2283258Z DistillGPT2 PASS 2025-08-14T22:00:44.2285184Z ElectraForCausalLM PASS 2025-08-14T22:00:44.2288241Z ElectraForQuestionAnswering PASS 2025-08-14T22:00:44.2294590Z GPT2ForSequenceClassification PASS 2025-08-14T22:00:44.2300032Z GoogleFnet PASS 2025-08-14T22:00:44.2302074Z LayoutLMForMaskedLM PASS 2025-08-14T22:00:44.2307352Z LayoutLMForSequenceClassification PASS 2025-08-14T22:00:44.2308000Z M2M100ForConditionalGeneration PASS 2025-08-14T22:00:44.2313419Z MBartForCausalLM PASS 2025-08-14T22:00:44.2317256Z MBartForConditionalGeneration PASS 2025-08-14T22:00:44.2317586Z MT5ForConditionalGeneration PASS 2025-08-14T22:00:44.2320691Z MegatronBertForCausalLM PASS 2025-08-14T22:00:44.2323412Z MegatronBertForQuestionAnswering PASS 2025-08-14T22:00:44.2327205Z MobileBertForMaskedLM PASS 2025-08-14T22:00:44.2337998Z MobileBertForQuestionAnswering PASS 2025-08-14T22:00:44.2338271Z OPTForCausalLM PASS 2025-08-14T22:00:44.2342969Z PLBartForCausalLM PASS 2025-08-14T22:00:44.2343216Z PLBartForConditionalGeneration PASS 2025-08-14T22:00:44.2345895Z PegasusForCausalLM PASS 2025-08-14T22:00:44.2350578Z PegasusForConditionalGeneration PASS 2025-08-14T22:00:44.2351206Z RobertaForCausalLM PASS 2025-08-14T22:00:44.2356222Z RobertaForQuestionAnswering PASS 2025-08-14T22:00:44.2363822Z T5ForConditionalGeneration PASS 2025-08-14T22:00:44.2364112Z T5Small PASS 2025-08-14T22:00:44.2364563Z TrOCRForCausalLM PASS 2025-08-14T22:00:44.2367855Z XGLMForCausalLM PASS_BUT_FLAKY 2025-08-14T22:00:44.2378772Z XLNetLMHeadModel PASS 2025-08-14T22:00:44.2379062Z YituTechConvBert PASS 2025-08-14T22:00:44.2956969Z + sccache_epilogue 2025-08-14T22:00:44.2957334Z + echo '::group::Sccache Compilation Log' 2025-08-14T22:00:44.2957882Z ##[group]Sccache Compilation Log 2025-08-14T22:00:44.2958145Z + echo '=================== sccache compilation log ===================' 2025-08-14T22:00:44.2958764Z =================== sccache compilation log =================== 2025-08-14T22:00:44.2959196Z + python /var/lib/jenkins/workspace/.ci/pytorch/print_sccache_log.py /var/lib/jenkins/sccache_error.log 2025-08-14T22:00:44.3204621Z + echo '=========== If your build fails, please take a look at the log above for possible reasons ===========' 2025-08-14T22:00:44.3205245Z =========== If your build fails, please take a look at the log above for possible reasons =========== 2025-08-14T22:00:44.3205589Z + sccache --show-stats 2025-08-14T22:00:44.3237154Z Compile requests 399 2025-08-14T22:00:44.3237436Z Compile requests executed 0 2025-08-14T22:00:44.3237666Z Cache hits 0 2025-08-14T22:00:44.3237915Z Cache misses 0 2025-08-14T22:00:44.3238149Z Cache hits rate - 2025-08-14T22:00:44.3238378Z Cache timeouts 0 2025-08-14T22:00:44.3238589Z Cache read errors 0 2025-08-14T22:00:44.3238807Z Forced recaches 0 2025-08-14T22:00:44.3239011Z Cache write 
errors 0 2025-08-14T22:00:44.3239221Z Cache errors 0 2025-08-14T22:00:44.3239428Z Compilations 0 2025-08-14T22:00:44.3239634Z Compilation failures 0 2025-08-14T22:00:44.3240131Z Non-cacheable compilations 0 2025-08-14T22:00:44.3240375Z Non-cacheable calls 41 2025-08-14T22:00:44.3240602Z Non-compilation calls 358 2025-08-14T22:00:44.3240827Z Unsupported compiler calls 0 2025-08-14T22:00:44.3241062Z Average cache write 0.000 s 2025-08-14T22:00:44.3241309Z Average compiler 0.000 s 2025-08-14T22:00:44.3241544Z Average cache read hit 0.000 s 2025-08-14T22:00:44.3241783Z Failed distributed compilations 0 2025-08-14T22:00:44.3241935Z 2025-08-14T22:00:44.3242029Z Non-cacheable reasons: 2025-08-14T22:00:44.3242235Z -E 41 2025-08-14T22:00:44.3242388Z 2025-08-14T22:00:44.3242568Z Cache location s3, name: ossci-compiler-cache-circleci-v2, prefix: / 2025-08-14T22:00:44.3242894Z Version (client) 0.10.0 2025-08-14T22:00:44.3243130Z + sccache --stop-server 2025-08-14T22:00:44.3263971Z Stopping sccache server... 2025-08-14T22:00:44.3264293Z Compile requests 399 2025-08-14T22:00:44.3264645Z Compile requests executed 0 2025-08-14T22:00:44.3264885Z Cache hits 0 2025-08-14T22:00:44.3265123Z Cache misses 0 2025-08-14T22:00:44.3265431Z Cache hits rate - 2025-08-14T22:00:44.3265852Z Cache timeouts 0 2025-08-14T22:00:44.3266117Z Cache read errors 0 2025-08-14T22:00:44.3266344Z Forced recaches 0 2025-08-14T22:00:44.3266562Z Cache write errors 0 2025-08-14T22:00:44.3266771Z Cache errors 0 2025-08-14T22:00:44.3266989Z Compilations 0 2025-08-14T22:00:44.3267206Z Compilation failures 0 2025-08-14T22:00:44.3267436Z Non-cacheable compilations 0 2025-08-14T22:00:44.3267656Z Non-cacheable calls 41 2025-08-14T22:00:44.3267877Z Non-compilation calls 358 2025-08-14T22:00:44.3268098Z Unsupported compiler calls 0 2025-08-14T22:00:44.3268319Z Average cache write 0.000 s 2025-08-14T22:00:44.3268550Z Average compiler 0.000 s 2025-08-14T22:00:44.3268777Z Average cache read hit 0.000 s 2025-08-14T22:00:44.3269116Z Failed distributed compilations 0 2025-08-14T22:00:44.3269280Z 2025-08-14T22:00:44.3269359Z Non-cacheable reasons: 2025-08-14T22:00:44.3269561Z -E 41 2025-08-14T22:00:44.3269697Z 2025-08-14T22:00:44.3269878Z Cache location s3, name: ossci-compiler-cache-circleci-v2, prefix: / 2025-08-14T22:00:44.3270240Z Version (client) 0.10.0 2025-08-14T22:00:44.3270516Z + echo ::endgroup:: 2025-08-14T22:00:44.3270992Z ##[endgroup] 2025-08-14T22:00:44.3271182Z + cleanup_workspace 2025-08-14T22:00:44.3271509Z + echo 'sudo may print the following warning message that can be ignored. The chown command will still run.' 2025-08-14T22:00:44.3272007Z sudo may print the following warning message that can be ignored. The chown command will still run. 
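Note on the two check steps earlier in this job: benchmarks/dynamo/check_accuracy.py and benchmarks/dynamo/check_graph_breaks.py compare the per-model CSV produced by this run (--actual, test/test-reports/inference_huggingface.csv) against the expected-results CSV tracked under benchmarks/dynamo/ci_expected_accuracy (--expected), which is where the per-model PASS / XFAIL / PASS_BUT_FLAKY verdicts above come from. The sketch below is only an illustrative approximation of that kind of comparison, not the real scripts: the "name" and "accuracy" column names, the positional arguments, and the exit-code behaviour are all assumptions.

    # Hypothetical sketch -- not benchmarks/dynamo/check_accuracy.py.
    # Assumes both CSVs expose "name" and "accuracy" columns.
    import csv
    import sys

    def load(path):
        with open(path, newline="") as f:
            return {row["name"]: row["accuracy"] for row in csv.DictReader(f)}

    def compare(actual_path, expected_path):
        actual, expected = load(actual_path), load(expected_path)
        mismatches = []
        for name, want in expected.items():
            got = actual.get(name, "<missing>")
            ok = got == want
            print(f"{name} {'PASS' if ok else 'FAIL'}")
            if not ok:
                mismatches.append((name, want, got))
        return mismatches

    if __name__ == "__main__":
        sys.exit(1 if compare(sys.argv[1], sys.argv[2]) else 0)

It would be run with the same two CSVs the logged commands pass, e.g. python compare_csv.py test/test-reports/inference_huggingface.csv benchmarks/dynamo/ci_expected_accuracy/cpu_inductor_amp_freezing_huggingface_inference.csv. The real scripts also print statuses such as XFAIL and PASS_BUT_FLAKY, as seen in the log, rather than requiring exact equality like this sketch does.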
2025-08-14T22:00:44.3272413Z + echo ' sudo: setrlimit(RLIMIT_STACK): Operation not permitted' 2025-08-14T22:00:44.3272726Z sudo: setrlimit(RLIMIT_STACK): Operation not permitted 2025-08-14T22:00:44.3273081Z + echo 'For more details refer to https://github.com/sudo-project/sudo/issues/42' 2025-08-14T22:00:44.3273473Z For more details refer to https://github.com/sudo-project/sudo/issues/42 2025-08-14T22:00:44.3273789Z + sudo chown -R 1000 /var/lib/jenkins/workspace 2025-08-14T22:00:44.7799552Z ##[group]Run pytorch/test-infra/.github/actions/upload-benchmark-results@main 2025-08-14T22:00:44.7799964Z with: 2025-08-14T22:00:44.7800203Z benchmark-results-dir: test/test-reports 2025-08-14T22:00:44.7800474Z dry-run: false 2025-08-14T22:00:44.7800696Z schema-version: v3 2025-08-14T22:00:44.7801169Z github-token: *** 2025-08-14T22:00:44.7801371Z env: 2025-08-14T22:00:44.7801577Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:44.7801974Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:44.7802381Z ##[endgroup] 2025-08-14T22:00:44.7817077Z ##[group]Run set -eux 2025-08-14T22:00:44.7817310Z set -eux 2025-08-14T22:00:44.7817587Z python3 -mpip install boto3==1.35.33 psutil==7.0.0 pynvml==12.0.0 2025-08-14T22:00:44.7817907Z  2025-08-14T22:00:44.7818085Z DEVICE_NAME="" 2025-08-14T22:00:44.7818283Z DEVICE_TYPE="" 2025-08-14T22:00:44.7818475Z  2025-08-14T22:00:44.7818710Z if command -v nvidia-smi; then 2025-08-14T22:00:44.7819043Z  # NB: I'm using PyTorch here to get the device name, however, it needs to 2025-08-14T22:00:44.7819439Z  # install the correct version of PyTorch manually for now. Any PyTorch 2025-08-14T22:00:44.7819814Z  # version is fine, I just use 2.7.1 to satify PYPIDEP linter 2025-08-14T22:00:44.7820128Z  python3 -mpip install torch==2.7.1 2025-08-14T22:00:44.7820388Z elif command -v rocminfo; then 2025-08-14T22:00:44.7820696Z  # NB: Installing torch on ROCm runner with pip here causes CI to fail 2025-08-14T22:00:44.7821087Z  # with a memoryview is too large error only on MI300 runners. Is pip 2025-08-14T22:00:44.7821484Z  # version on ROCm runner there too old? 
As a workaround, let's use the 2025-08-14T22:00:44.7821898Z  # GPU device name coming from rocminfo instead 2025-08-14T22:00:44.7822156Z  DEVICE_NAME=rocm 2025-08-14T22:00:44.7822497Z  DEVICE_TYPE=$(rocminfo | grep "Marketing Name" | tail -n1 | awk -F':' '{print $2}' | xargs) 2025-08-14T22:00:44.7822833Z fi 2025-08-14T22:00:44.7822993Z  2025-08-14T22:00:44.7823203Z echo "DEVICE_NAME=$DEVICE_NAME" >> $GITHUB_ENV 2025-08-14T22:00:44.7823506Z echo "DEVICE_TYPE=$DEVICE_TYPE" >> $GITHUB_ENV 2025-08-14T22:00:44.7832605Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:44.7832894Z env: 2025-08-14T22:00:44.7833098Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:44.7833455Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:44.7833826Z ##[endgroup] 2025-08-14T22:00:44.7868715Z + python3 -mpip install boto3==1.35.33 psutil==7.0.0 pynvml==12.0.0 2025-08-14T22:00:44.9867659Z Defaulting to user installation because normal site-packages is not writeable 2025-08-14T22:00:45.8143882Z Collecting boto3==1.35.33 2025-08-14T22:00:45.8295291Z Downloading boto3-1.35.33-py3-none-any.whl (139 kB) 2025-08-14T22:00:46.0735318Z Collecting psutil==7.0.0 2025-08-14T22:00:46.0832424Z Downloading psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (277 kB) 2025-08-14T22:00:46.1102118Z Collecting pynvml==12.0.0 2025-08-14T22:00:46.1133405Z Downloading pynvml-12.0.0-py3-none-any.whl (26 kB) 2025-08-14T22:00:46.1552299Z Collecting s3transfer<0.11.0,>=0.10.0 2025-08-14T22:00:46.1589954Z Downloading s3transfer-0.10.4-py3-none-any.whl (83 kB) 2025-08-14T22:00:46.1644992Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.33) (0.10.0) 2025-08-14T22:00:47.0347782Z Collecting botocore<1.36.0,>=1.35.33 2025-08-14T22:00:47.0386193Z Downloading botocore-1.35.99-py3-none-any.whl (13.3 MB) 2025-08-14T22:00:47.1757572Z Collecting nvidia-ml-py<13.0.0a0,>=12.0.0 2025-08-14T22:00:47.1792984Z Downloading nvidia_ml_py-12.575.51-py3-none-any.whl (47 kB) 2025-08-14T22:00:47.1882832Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.25.10) 2025-08-14T22:00:47.1886333Z Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (2.8.1) 2025-08-14T22:00:47.3000631Z Requirement already satisfied: six>=1.5 in /usr/lib/python3.9/site-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.15.0) 2025-08-14T22:00:47.4235755Z Installing collected packages: botocore, s3transfer, nvidia-ml-py, pynvml, psutil, boto3 2025-08-14T22:00:47.8014593Z Attempting uninstall: nvidia-ml-py 2025-08-14T22:00:47.8015161Z Found existing installation: nvidia-ml-py 11.525.84 2025-08-14T22:00:47.8027924Z Uninstalling nvidia-ml-py-11.525.84: 2025-08-14T22:00:47.8173885Z Successfully uninstalled nvidia-ml-py-11.525.84 2025-08-14T22:00:47.8733740Z Attempting uninstall: psutil 2025-08-14T22:00:47.8734104Z Found existing installation: psutil 5.9.8 2025-08-14T22:00:47.8784784Z Uninstalling psutil-5.9.8: 2025-08-14T22:00:47.8790154Z Successfully uninstalled psutil-5.9.8 2025-08-14T22:00:48.0281662Z Successfully installed boto3-1.35.33 botocore-1.35.99 nvidia-ml-py-12.575.51 psutil-7.0.0 pynvml-12.0.0 s3transfer-0.10.4 2025-08-14T22:00:48.1543121Z + DEVICE_NAME= 
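Note: the boto3 / psutil / pynvml install just above supports the upload-benchmark-results steps that follow, where gather_runners_info.py reports host facts (the RUNNER_INFO JSON later in this log: cpu_info, cpu_count, avail_mem_in_gb, hostname, plus empty name/type on this CPU-only runner). The snippet below is a minimal, assumption-level sketch of how such fields can be collected with psutil; it is not the pytorch/test-infra script, and whether the real one reports total or available memory is not confirmed here.

    # Hypothetical sketch, not test-infra's gather_runners_info.py: collect the
    # same kind of fields that appear later in RUNNER_INFO, using psutil.
    import json
    import platform
    import socket

    import psutil

    def gather_runner_info():
        return {
            "cpu_info": platform.machine(),               # e.g. "x86_64"
            "cpu_count": psutil.cpu_count(logical=True),  # e.g. 32
            # Rounded total RAM; the real script may use a different measure.
            "avail_mem_in_gb": int(psutil.virtual_memory().total / (1024 ** 3)),
            "extra_info": {"hostname": socket.gethostname()},
            "name": "",   # device name; left empty on a CPU-only runner
            "type": "",   # device type; left empty on a CPU-only runner
        }

    if __name__ == "__main__":
        print(json.dumps([gather_runner_info()]))

The DEVICE_NAME / DEVICE_TYPE detection traced in this step is presumably what would populate the empty name/type fields on a GPU runner.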
2025-08-14T22:00:48.1543586Z + DEVICE_TYPE= 2025-08-14T22:00:48.1543925Z + command -v nvidia-smi 2025-08-14T22:00:48.1544139Z + command -v rocminfo 2025-08-14T22:00:48.1544346Z + echo DEVICE_NAME= 2025-08-14T22:00:48.1546454Z + echo DEVICE_TYPE= 2025-08-14T22:00:48.1562884Z ##[group]Run set -eux 2025-08-14T22:00:48.1563101Z set -eux 2025-08-14T22:00:48.1563267Z  2025-08-14T22:00:48.1563537Z if [[ -z "${GITHUB_TOKEN}" ]]; then 2025-08-14T22:00:48.1563799Z  echo "Missing github-token input" 2025-08-14T22:00:48.1564028Z  exit 1 2025-08-14T22:00:48.1564211Z fi 2025-08-14T22:00:48.1570410Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:48.1570688Z env: 2025-08-14T22:00:48.1570859Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:48.1571175Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:48.1571510Z DEVICE_NAME: 2025-08-14T22:00:48.1571677Z DEVICE_TYPE: 2025-08-14T22:00:48.1572084Z GITHUB_TOKEN: *** 2025-08-14T22:00:48.1572253Z ##[endgroup] 2025-08-14T22:00:48.1599248Z + [[ -z *** ]] 2025-08-14T22:00:48.1634595Z ##[group]Run pytorch/test-infra/.github/actions/get-workflow-job-id@main 2025-08-14T22:00:48.1634934Z with: 2025-08-14T22:00:48.1635276Z github-token: *** 2025-08-14T22:00:48.1635480Z env: 2025-08-14T22:00:48.1635667Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:48.1636047Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:48.1636432Z DEVICE_NAME: 2025-08-14T22:00:48.1636619Z DEVICE_TYPE: 2025-08-14T22:00:48.1636824Z ##[endgroup] 2025-08-14T22:00:48.1648303Z ##[group]Run set -eux 2025-08-14T22:00:48.1648538Z set -eux 2025-08-14T22:00:48.1648740Z  2025-08-14T22:00:48.1649297Z python3 "${GITHUB_ACTION_PATH}/../../scripts/get_workflow_job_id.py" "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-08-14T22:00:48.1654864Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:48.1655162Z env: 2025-08-14T22:00:48.1655358Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:48.1655714Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:48.1656086Z DEVICE_NAME: 2025-08-14T22:00:48.1656283Z DEVICE_TYPE: 2025-08-14T22:00:48.1656625Z GITHUB_TOKEN: *** 2025-08-14T22:00:48.1656826Z ##[endgroup] 2025-08-14T22:00:48.1682696Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/get-workflow-job-id/../../scripts/get_workflow_job_id.py 16976338999 i-0115c72a6ef255e70 2025-08-14T22:00:49.6137198Z setting job-id=48128261038 2025-08-14T22:00:49.6137785Z setting job-name=linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T22:00:49.6235631Z ##[group]Run set -eux 2025-08-14T22:00:49.6235847Z set -eux 2025-08-14T22:00:49.6236015Z  2025-08-14T22:00:49.6236289Z python3 "${GITHUB_ACTION_PATH}/../../scripts/benchmarks/gather_metadata.py" \ 2025-08-14T22:00:49.6236624Z  --schema-version "${SCHEMA_VERSION}" \ 2025-08-14T22:00:49.6236865Z  --repo "${REPO}" \ 2025-08-14T22:00:49.6237086Z  --head-branch "${HEAD_BRANCH}" \ 2025-08-14T22:00:49.6237311Z  --head-sha "${HEAD_SHA}" \ 2025-08-14T22:00:49.6237547Z  --workflow-id "${WORKFLOW_RUN_ID}" \ 2025-08-14T22:00:49.6237939Z  --run-attempt "${RUN_ATTEMPT}" \ 2025-08-14T22:00:49.6238166Z  --job-id "${JOB_ID}" \ 2025-08-14T22:00:49.6238372Z  --job-name "${JOB_NAME}" 2025-08-14T22:00:49.6243323Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:49.6243596Z env: 
2025-08-14T22:00:49.6243759Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:49.6244086Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:49.6244424Z DEVICE_NAME: 2025-08-14T22:00:49.6244588Z DEVICE_TYPE: 2025-08-14T22:00:49.6244762Z SCHEMA_VERSION: v3 2025-08-14T22:00:49.6245080Z REPO: pytorch/pytorch 2025-08-14T22:00:49.6245508Z HEAD_BRANCH: refs/heads/main 2025-08-14T22:00:49.6245806Z HEAD_SHA: 1fc683cf17c8c673044538d10266c00f92987be2 2025-08-14T22:00:49.6246137Z WORKFLOW_RUN_ID: 16976338999 2025-08-14T22:00:49.6261999Z RUN_ATTEMPT: 1 2025-08-14T22:00:49.6262211Z JOB_ID: 48128261038 2025-08-14T22:00:49.6262799Z JOB_NAME: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T22:00:49.6263206Z ##[endgroup] 2025-08-14T22:00:49.6296290Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/benchmarks/gather_metadata.py --schema-version v3 --repo pytorch/pytorch --head-branch refs/heads/main --head-sha 1fc683cf17c8c673044538d10266c00f92987be2 --workflow-id 16976338999 --run-attempt 1 --job-id 48128261038 --job-name 'linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)' 2025-08-14T22:00:49.6579868Z ##[group]Run set -eux 2025-08-14T22:00:49.6580092Z set -eux 2025-08-14T22:00:49.6580260Z  2025-08-14T22:00:49.6580542Z python3 "${GITHUB_ACTION_PATH}/../../scripts/benchmarks/gather_runners_info.py" 2025-08-14T22:00:49.6585847Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:49.6586128Z env: 2025-08-14T22:00:49.6586300Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:49.6586605Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:49.6586950Z DEVICE_NAME: 2025-08-14T22:00:49.6587212Z DEVICE_TYPE: 2025-08-14T22:00:49.6587372Z ##[endgroup] 2025-08-14T22:00:49.6610175Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/benchmarks/gather_runners_info.py 2025-08-14T22:00:49.6961786Z INFO:root:Fail to import torch to get the device name 2025-08-14T22:00:49.7060950Z ##[group]Run set -eux 2025-08-14T22:00:49.7061160Z set -eux 2025-08-14T22:00:49.7061325Z  2025-08-14T22:00:49.7061506Z # TODO (huydhn): Implement this part 2025-08-14T22:00:49.7061770Z echo "dependencies={}" >> "${GITHUB_OUTPUT}" 2025-08-14T22:00:49.7066627Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:49.7066898Z env: 2025-08-14T22:00:49.7067075Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:49.7067401Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:49.7067856Z DEVICE_NAME: 2025-08-14T22:00:49.7068026Z DEVICE_TYPE: 2025-08-14T22:00:49.7068213Z ##[endgroup] 2025-08-14T22:00:49.7090423Z + echo 'dependencies={}' 2025-08-14T22:00:49.7107030Z ##[group]Run set -eux 2025-08-14T22:00:49.7107250Z set -eux 2025-08-14T22:00:49.7107417Z  2025-08-14T22:00:49.7107613Z if [[ ! 
-d "${BENCHMARK_RESULTS_DIR}" ]]; then 2025-08-14T22:00:49.7107910Z  echo "${BENCHMARK_RESULTS_DIR} does not exist, skipping" 2025-08-14T22:00:49.7108218Z  # We don't want the job to fail if the directory doesn't exist 2025-08-14T22:00:49.7108472Z  exit 0 2025-08-14T22:00:49.7108635Z fi 2025-08-14T22:00:49.7108781Z  2025-08-14T22:00:49.7108957Z if [[ "${DRY_RUN}" == "true" ]]; then 2025-08-14T22:00:49.7109282Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-08-14T22:00:49.7109633Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-08-14T22:00:49.7109927Z  --metadata "${BENCHMARK_METADATA}" \ 2025-08-14T22:00:49.7110174Z  --runners "${RUNNER_INFO}" \ 2025-08-14T22:00:49.7110440Z  --dependencies "${DEPENDENCIES}" \ 2025-08-14T22:00:49.7110665Z  --dry-run 2025-08-14T22:00:49.7110847Z else 2025-08-14T22:00:49.7111118Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-08-14T22:00:49.7111463Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-08-14T22:00:49.7111731Z  --metadata "${BENCHMARK_METADATA}" \ 2025-08-14T22:00:49.7111966Z  --runners "${RUNNER_INFO}" \ 2025-08-14T22:00:49.7112198Z  --dependencies "${DEPENDENCIES}" 2025-08-14T22:00:49.7112403Z fi 2025-08-14T22:00:49.7116820Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:49.7117080Z env: 2025-08-14T22:00:49.7117247Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:49.7117573Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:49.7117913Z DEVICE_NAME: 2025-08-14T22:00:49.7118085Z DEVICE_TYPE: 2025-08-14T22:00:49.7118268Z BENCHMARK_RESULTS_DIR: test/test-reports 2025-08-14T22:00:49.7118485Z DRY_RUN: false 2025-08-14T22:00:49.7119394Z BENCHMARK_METADATA: {"timestamp": 1755208849, "schema_version": "v3", "name": "linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "1fc683cf17c8c673044538d10266c00f92987be2", "workflow_id": 16976338999, "run_attempt": 1, "job_id": 48128261038} 2025-08-14T22:00:49.7120527Z RUNNER_INFO: [{"cpu_info": "x86_64", "cpu_count": 32, "avail_mem_in_gb": 123, "extra_info": {"hostname": "ip-10-0-39-154.ec2.internal"}, "name": "", "type": ""}] 2025-08-14T22:00:49.7120933Z DEPENDENCIES: {} 2025-08-14T22:00:49.7121107Z ##[endgroup] 2025-08-14T22:00:49.7144603Z + [[ ! 
-d test/test-reports ]] 2025-08-14T22:00:49.7144873Z + [[ false == \t\r\u\e ]] 2025-08-14T22:00:49.7147028Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/upload_benchmark_results.py --benchmark-results-dir test/test-reports --metadata '{"timestamp": 1755208849, "schema_version": "v3", "name": "linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "1fc683cf17c8c673044538d10266c00f92987be2", "workflow_id": 16976338999, "run_attempt": 1, "job_id": 48128261038}' --runners '[{"cpu_info": "x86_64", "cpu_count": 32, "avail_mem_in_gb": 123, "extra_info": {"hostname": "ip-10-0-39-154.ec2.internal"}, "name": "", "type": ""}]' --dependencies '{}' 2025-08-14T22:00:49.8427125Z INFO:root:Upload test/test-reports/inference_huggingface.json to s3://ossci-benchmarks/v3/pytorch/pytorch/16976338999/48128261038/inference_huggingface.json 2025-08-14T22:00:49.8748910Z INFO:botocore.credentials:Found credentials from IAM Role: gh-ci-github-action-runners-runner-role 2025-08-14T22:00:50.1244751Z ##[group]Run cat test/**/*_toprint.log || true 2025-08-14T22:00:50.1245057Z cat test/**/*_toprint.log || true 2025-08-14T22:00:50.1250709Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.1250990Z env: 2025-08-14T22:00:50.1251179Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.1251523Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.1251870Z DEVICE_NAME: 2025-08-14T22:00:50.1252055Z DEVICE_TYPE: 2025-08-14T22:00:50.1252233Z ##[endgroup] 2025-08-14T22:00:50.1329171Z cat: 'test/**/*_toprint.log': No such file or directory 2025-08-14T22:00:50.1355630Z ##[group]Run kill "$MONITOR_SCRIPT_PID" 2025-08-14T22:00:50.1355906Z kill "$MONITOR_SCRIPT_PID" 2025-08-14T22:00:50.1360603Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.1360866Z env: 2025-08-14T22:00:50.1361051Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.1361361Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.1361696Z DEVICE_NAME: 2025-08-14T22:00:50.1361866Z DEVICE_TYPE: 2025-08-14T22:00:50.1362041Z MONITOR_SCRIPT_PID: 47994 2025-08-14T22:00:50.1362229Z ##[endgroup] 2025-08-14T22:00:50.1463320Z Prepare all required actions 2025-08-14T22:00:50.1463942Z Getting action download info 2025-08-14T22:00:50.2834740Z Download action repository 'seemethere/upload-artifact-s3@v5' (SHA:baba72d0712b404f646cebe0730933554ebce96a) 2025-08-14T22:00:50.5229398Z Download action repository 'actions/upload-artifact@v4' (SHA:ea165f8d65b6e75b540449e92b4886f43607fa02) 2025-08-14T22:00:50.9098901Z ##[group]Run ./.github/actions/upload-test-artifacts 2025-08-14T22:00:50.9099164Z with: 2025-08-14T22:00:50.9099478Z file-suffix: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T22:00:50.9099831Z s3-bucket: gha-artifacts 2025-08-14T22:00:50.9100044Z env: 2025-08-14T22:00:50.9100212Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9100534Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9100875Z DEVICE_NAME: 2025-08-14T22:00:50.9101048Z DEVICE_TYPE: 2025-08-14T22:00:50.9101220Z ##[endgroup] 2025-08-14T22:00:50.9120227Z ##[group]Run # Remove any previous test jsons if they exist 2025-08-14T22:00:50.9120573Z # Remove any 
previous test jsons if they exist 2025-08-14T22:00:50.9120842Z rm -f test-jsons-*.zip 2025-08-14T22:00:50.9121145Z zip -r "test-jsons-${FILE_SUFFIX}.zip" test/test-reports -i '*.json' 2025-08-14T22:00:50.9126152Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.9126416Z env: 2025-08-14T22:00:50.9126590Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9126918Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9127248Z DEVICE_NAME: 2025-08-14T22:00:50.9127505Z DEVICE_TYPE: 2025-08-14T22:00:50.9127810Z FILE_SUFFIX: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T22:00:50.9128151Z ##[endgroup] 2025-08-14T22:00:50.9312395Z adding: test/test-reports/inference_huggingface.json (deflated 99%) 2025-08-14T22:00:50.9335791Z ##[group]Run # Remove any previous test reports if they exist 2025-08-14T22:00:50.9336144Z # Remove any previous test reports if they exist 2025-08-14T22:00:50.9336414Z rm -f test-reports-*.zip 2025-08-14T22:00:50.9336730Z zip -r "test-reports-${FILE_SUFFIX}.zip" test/test-reports -i '*.xml' -i '*.csv' 2025-08-14T22:00:50.9341452Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.9341719Z env: 2025-08-14T22:00:50.9341891Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9342208Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9342540Z DEVICE_NAME: 2025-08-14T22:00:50.9342720Z DEVICE_TYPE: 2025-08-14T22:00:50.9343012Z FILE_SUFFIX: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T22:00:50.9343339Z ##[endgroup] 2025-08-14T22:00:50.9394469Z adding: test/test-reports/inference_huggingface.csv (deflated 69%) 2025-08-14T22:00:50.9394926Z adding: test/test-reports/inference_huggingface_graph_breaks.csv (deflated 85%) 2025-08-14T22:00:50.9395350Z adding: test/test-reports/inference_huggingface_graph_break_deduped.csv (deflated 64%) 2025-08-14T22:00:50.9414924Z ##[group]Run # Remove any previous usage logs if they exist 2025-08-14T22:00:50.9415261Z # Remove any previous usage logs if they exist 2025-08-14T22:00:50.9415520Z rm -f logs-*.zip 2025-08-14T22:00:50.9415795Z zip "logs-${FILE_SUFFIX}.zip" 'usage_log.txt' || true 2025-08-14T22:00:50.9416193Z zip -r "logs-${FILE_SUFFIX}.zip" test/test-reports -i '*.log' || true 2025-08-14T22:00:50.9420923Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.9421181Z env: 2025-08-14T22:00:50.9421354Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9421677Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9422000Z DEVICE_NAME: 2025-08-14T22:00:50.9422174Z DEVICE_TYPE: 2025-08-14T22:00:50.9422621Z FILE_SUFFIX: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T22:00:50.9422953Z ##[endgroup] 2025-08-14T22:00:50.9496898Z adding: usage_log.txt (deflated 96%) 2025-08-14T22:00:50.9509998Z 2025-08-14T22:00:50.9510772Z zip error: Nothing to do! 
(logs-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip) 2025-08-14T22:00:50.9528613Z ##[group]Run # Remove any previous debugging artifacts if they exist 2025-08-14T22:00:50.9529295Z # Remove any previous debugging artifacts if they exist 2025-08-14T22:00:50.9529586Z rm -f debug-*.zip 2025-08-14T22:00:50.9529791Z if [ -d 'test/debug' ]; then 2025-08-14T22:00:50.9530064Z  zip -r "debug-${FILE_SUFFIX}.zip" test/debug 2025-08-14T22:00:50.9530303Z fi 2025-08-14T22:00:50.9534863Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:50.9535118Z env: 2025-08-14T22:00:50.9535292Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9535631Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9535956Z DEVICE_NAME: 2025-08-14T22:00:50.9536133Z DEVICE_TYPE: 2025-08-14T22:00:50.9536434Z FILE_SUFFIX: test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038 2025-08-14T22:00:50.9536807Z ##[endgroup] 2025-08-14T22:00:50.9601285Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-08-14T22:00:50.9601551Z with: 2025-08-14T22:00:50.9601732Z s3-bucket: gha-artifacts 2025-08-14T22:00:50.9601982Z s3-prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:50.9602240Z retention-days: 14 2025-08-14T22:00:50.9602510Z if-no-files-found: warn 2025-08-14T22:00:50.9602883Z path: test-jsons-*.zip 2025-08-14T22:00:50.9603085Z name: artifact 2025-08-14T22:00:50.9603260Z region: us-east-1 2025-08-14T22:00:50.9603445Z env: 2025-08-14T22:00:50.9603618Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:50.9603959Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:50.9604305Z DEVICE_NAME: 2025-08-14T22:00:50.9604484Z DEVICE_TYPE: 2025-08-14T22:00:50.9604662Z ##[endgroup] 2025-08-14T22:00:51.2510977Z NOTE: s3-prefix specified, ignoring name parameter 2025-08-14T22:00:51.2511367Z With the provided path, there will be 1 file uploaded 2025-08-14T22:00:51.2511688Z Uploading to s3 prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:51.2547662Z Starting upload of test-jsons-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:51.3685333Z Finished upload of test-jsons-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:51.3824357Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-08-14T22:00:51.3824616Z with: 2025-08-14T22:00:51.3824803Z s3-bucket: gha-artifacts 2025-08-14T22:00:51.3825054Z s3-prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:51.3825301Z retention-days: 14 2025-08-14T22:00:51.3825518Z if-no-files-found: error 2025-08-14T22:00:51.3825735Z path: test-reports-*.zip 2025-08-14T22:00:51.3825925Z name: artifact 2025-08-14T22:00:51.3826104Z region: us-east-1 2025-08-14T22:00:51.3826285Z env: 2025-08-14T22:00:51.3826448Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:51.3826791Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:51.3827149Z DEVICE_NAME: 2025-08-14T22:00:51.3827333Z DEVICE_TYPE: 2025-08-14T22:00:51.3827505Z ##[endgroup] 2025-08-14T22:00:51.6567120Z NOTE: s3-prefix specified, ignoring name parameter 2025-08-14T22:00:51.6567499Z With the provided path, there will be 1 file uploaded 2025-08-14T22:00:51.6567842Z Uploading to s3 prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:51.6601397Z Starting upload of 
test-reports-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:51.7635480Z Finished upload of test-reports-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:51.7782366Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-08-14T22:00:51.7782613Z with: 2025-08-14T22:00:51.7782791Z s3-bucket: gha-artifacts 2025-08-14T22:00:51.7783029Z s3-prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:51.7783269Z retention-days: 14 2025-08-14T22:00:51.7783459Z if-no-files-found: ignore 2025-08-14T22:00:51.7783742Z path: logs-*.zip 2025-08-14T22:00:51.7783906Z name: artifact 2025-08-14T22:00:51.7784079Z region: us-east-1 2025-08-14T22:00:51.7784254Z env: 2025-08-14T22:00:51.7784413Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:51.7784750Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:51.7785104Z DEVICE_NAME: 2025-08-14T22:00:51.7785280Z DEVICE_TYPE: 2025-08-14T22:00:51.7785446Z ##[endgroup] 2025-08-14T22:00:52.0539610Z NOTE: s3-prefix specified, ignoring name parameter 2025-08-14T22:00:52.0540056Z With the provided path, there will be 1 file uploaded 2025-08-14T22:00:52.0540374Z Uploading to s3 prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:52.0576489Z Starting upload of logs-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:52.1586956Z Finished upload of logs-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:00:52.1748582Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-08-14T22:00:52.1748833Z with: 2025-08-14T22:00:52.1749018Z s3-bucket: gha-artifacts 2025-08-14T22:00:52.1749264Z s3-prefix: pytorch/pytorch/16976338999/1/artifact 2025-08-14T22:00:52.1749521Z retention-days: 14 2025-08-14T22:00:52.1749829Z if-no-files-found: ignore 2025-08-14T22:00:52.1750071Z path: debug-*.zip 2025-08-14T22:00:52.1750240Z name: artifact 2025-08-14T22:00:52.1750419Z region: us-east-1 2025-08-14T22:00:52.1750595Z env: 2025-08-14T22:00:52.1750754Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:52.1751091Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:52.1751422Z DEVICE_NAME: 2025-08-14T22:00:52.1751588Z DEVICE_TYPE: 2025-08-14T22:00:52.1751747Z ##[endgroup] 2025-08-14T22:00:52.4376762Z No files were found with the provided path: debug-*.zip. No artifacts will be uploaded. 2025-08-14T22:00:52.4538561Z ##[group]Run # shellcheck disable=SC2156 2025-08-14T22:00:52.4538893Z # shellcheck disable=SC2156 2025-08-14T22:00:52.4539301Z find . 
-iname "core.[1-9]*" -exec docker exec "${DOCKER_CONTAINER_ID}" sh -c "gdb python {} -ex 'bt' -ex 'q'" \; 2025-08-14T22:00:52.4544363Z shell: /usr/bin/bash -e {0} 2025-08-14T22:00:52.4544579Z env: 2025-08-14T22:00:52.4544748Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:52.4545076Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:52.4545415Z DEVICE_NAME: 2025-08-14T22:00:52.4545589Z DEVICE_TYPE: 2025-08-14T22:00:52.4545755Z ##[endgroup] 2025-08-14T22:00:52.6448337Z Prepare all required actions 2025-08-14T22:00:52.6448723Z Getting action download info 2025-08-14T22:00:52.7449307Z ##[group]Run ./.github/actions/upload-utilization-stats 2025-08-14T22:00:52.7449618Z with: 2025-08-14T22:00:52.7449818Z job_id: 48128261038 2025-08-14T22:00:52.7450267Z job_name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T22:00:52.7450777Z workflow_name: inductor-periodic 2025-08-14T22:00:52.7451026Z workflow_run_id: 16976338999 2025-08-14T22:00:52.7451248Z workflow_attempt: 1 2025-08-14T22:00:52.7451445Z env: 2025-08-14T22:00:52.7451633Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:52.7452022Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:52.7452413Z DEVICE_NAME: 2025-08-14T22:00:52.7452605Z DEVICE_TYPE: 2025-08-14T22:00:52.7452799Z ##[endgroup] 2025-08-14T22:00:52.7465283Z ##[group]Run echo "workflow_id: 16976338999" 2025-08-14T22:00:52.7465561Z echo "workflow_id: 16976338999" 2025-08-14T22:00:52.7465805Z echo "workflow_attempt: 1" 2025-08-14T22:00:52.7466067Z echo "workflow_Name: inductor-periodic" 2025-08-14T22:00:52.7466316Z echo "job_id: 48128261038" 2025-08-14T22:00:52.7466766Z echo "job_name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)" 2025-08-14T22:00:52.7467289Z echo "artifact_prefix: " 2025-08-14T22:00:52.7467518Z python3 --version 2025-08-14T22:00:52.7472409Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:00:52.7472680Z env: 2025-08-14T22:00:52.7472862Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:00:52.7473190Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:52.7473542Z DEVICE_NAME: 2025-08-14T22:00:52.7473714Z DEVICE_TYPE: 2025-08-14T22:00:52.7473884Z ##[endgroup] 2025-08-14T22:00:52.7497741Z workflow_id: 16976338999 2025-08-14T22:00:52.7497996Z workflow_attempt: 1 2025-08-14T22:00:52.7498211Z workflow_Name: inductor-periodic 2025-08-14T22:00:52.7498417Z job_id: 48128261038 2025-08-14T22:00:52.7498921Z job_name: linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx) 2025-08-14T22:00:52.7499336Z artifact_prefix: 2025-08-14T22:00:52.7511242Z Python 3.9.23 2025-08-14T22:00:52.7540293Z ##[group]Run nick-fields/retry@v3.0.0 2025-08-14T22:00:52.7540524Z with: 2025-08-14T22:00:52.7540698Z shell: bash 2025-08-14T22:00:52.7540886Z timeout_minutes: 5 2025-08-14T22:00:52.7541058Z max_attempts: 5 2025-08-14T22:00:52.7541317Z retry_wait_seconds: 30 2025-08-14T22:00:52.7541696Z command: set -eu python3 -m pip install python-dateutil==2.8.2 boto3==1.35.42 pandas==2.1.3 dataclasses_json==0.6.7 2025-08-14T22:00:52.7542097Z polling_interval_seconds: 1 2025-08-14T22:00:52.7542308Z warning_on_retry: true 2025-08-14T22:00:52.7542517Z continue_on_error: false 2025-08-14T22:00:52.7542719Z env: 2025-08-14T22:00:52.7542879Z GIT_DEFAULT_BRANCH: main 
2025-08-14T22:00:52.7543215Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:00:52.7543567Z DEVICE_NAME: 2025-08-14T22:00:52.7543738Z DEVICE_TYPE: 2025-08-14T22:00:52.7543914Z ##[endgroup] 2025-08-14T22:00:53.0422482Z Defaulting to user installation because normal site-packages is not writeable 2025-08-14T22:00:53.1053076Z Collecting python-dateutil==2.8.2 2025-08-14T22:00:53.1204887Z Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) 2025-08-14T22:00:53.8652351Z Collecting boto3==1.35.42 2025-08-14T22:00:53.8694630Z Downloading boto3-1.35.42-py3-none-any.whl (139 kB) 2025-08-14T22:00:54.2697753Z Collecting pandas==2.1.3 2025-08-14T22:00:54.2770116Z Downloading pandas-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.3 MB) 2025-08-14T22:00:54.3838168Z Requirement already satisfied: dataclasses_json==0.6.7 in /home/ec2-user/.local/lib/python3.9/site-packages (0.6.7) 2025-08-14T22:00:54.3854286Z Requirement already satisfied: six>=1.5 in /usr/lib/python3.9/site-packages (from python-dateutil==2.8.2) (1.15.0) 2025-08-14T22:00:54.3887930Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.0) 2025-08-14T22:00:54.3894069Z Requirement already satisfied: botocore<1.36.0,>=1.35.42 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (1.35.99) 2025-08-14T22:00:54.3895515Z Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.4) 2025-08-14T22:00:55.0729868Z Collecting numpy<2,>=1.22.4 2025-08-14T22:00:55.0762848Z Downloading numpy-1.26.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB) 2025-08-14T22:00:55.2545074Z Collecting tzdata>=2022.1 2025-08-14T22:00:55.2581613Z Downloading tzdata-2025.2-py2.py3-none-any.whl (347 kB) 2025-08-14T22:00:55.2688046Z Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3.9/site-packages (from pandas==2.1.3) (2022.7.1) 2025-08-14T22:00:55.2715325Z Requirement already satisfied: typing-inspect<1,>=0.4.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from dataclasses_json==0.6.7) (0.9.0) 2025-08-14T22:00:55.2716921Z Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from dataclasses_json==0.6.7) (3.26.1) 2025-08-14T22:00:55.2788414Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.42->boto3==1.35.42) (1.25.10) 2025-08-14T22:00:55.2868510Z Requirement already satisfied: packaging>=17.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from marshmallow<4.0.0,>=3.18.0->dataclasses_json==0.6.7) (25.0) 2025-08-14T22:00:55.2956869Z Requirement already satisfied: typing-extensions>=3.7.4 in /home/ec2-user/.local/lib/python3.9/site-packages (from typing-inspect<1,>=0.4.0->dataclasses_json==0.6.7) (4.14.1) 2025-08-14T22:00:55.2958699Z Requirement already satisfied: mypy-extensions>=0.3.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from typing-inspect<1,>=0.4.0->dataclasses_json==0.6.7) (1.1.0) 2025-08-14T22:00:55.4263244Z Installing collected packages: python-dateutil, tzdata, numpy, pandas, boto3 2025-08-14T22:00:59.8659843Z Attempting uninstall: boto3 2025-08-14T22:00:59.8660203Z Found existing installation: boto3 1.35.33 2025-08-14T22:00:59.8731292Z Uninstalling boto3-1.35.33: 2025-08-14T22:00:59.8741875Z Successfully uninstalled boto3-1.35.33 
2025-08-14T22:00:59.9208512Z Successfully installed boto3-1.35.42 numpy-1.26.4 pandas-2.1.3 python-dateutil-2.8.2 tzdata-2025.2 2025-08-14T22:01:00.8306338Z Command completed after 1 attempt(s). 2025-08-14T22:01:00.8356768Z ##[group]Run python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-08-14T22:01:00.8357257Z python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-08-14T22:01:00.8357589Z  --workflow-run-id "16976338999" \ 2025-08-14T22:01:00.8357853Z  --workflow-name "inductor-periodic" \ 2025-08-14T22:01:00.8358114Z  --workflow-run-attempt "1" \ 2025-08-14T22:01:00.8358347Z  --job-id "48128261038" \ 2025-08-14T22:01:00.8358765Z  --job-name "linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)" \ 2025-08-14T22:01:00.8359201Z  --local-path "" \ 2025-08-14T22:01:00.8359419Z  --artifact-prefix "" 2025-08-14T22:01:00.8364290Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:01:00.8364553Z env: 2025-08-14T22:01:00.8364720Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:01:00.8365032Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:01:00.8365361Z DEVICE_NAME: 2025-08-14T22:01:00.8365526Z DEVICE_TYPE: 2025-08-14T22:01:00.8365839Z ##[endgroup] 2025-08-14T22:01:01.8337449Z repo: pytorch/pytorch 2025-08-14T22:01:01.8337918Z Search for test log in s3 bucket: ossci-utilization 2025-08-14T22:01:01.8342629Z Downloading logs-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:01:01.8343250Z extracting usage_log.txt from zip file logs-test-cpu_inductor_amp_freezing_huggingface-1-1-linux.8xlarge.amx_48128261038.zip 2025-08-14T22:01:01.8346026Z Converted Log Model: UtilizationMetadata: 2025-08-14T22:01:01.8347039Z UtilizationMetadata(level='metadata', workflow_id='16976338999', job_id='48128261038', workflow_name='inductor-periodic', job_name='linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)', usage_collect_interval=1.0, data_model_version=1.5, start_at=1755207153, gpu_count=0, cpu_count=32, gpu_type=None, error=None) 2025-08-14T22:01:01.8348105Z [Db Segments] detected pytest cmd: 11, generated segments: 11 2025-08-14T22:01:01.8348433Z [db model] Peek db timeseries 2025-08-14T22:01:01.8348628Z :{ 2025-08-14T22:01:01.8348793Z "created_at": 1755208861, 2025-08-14T22:01:01.8348995Z "type": "utilization", 2025-08-14T22:01:01.8349179Z "tags": [ 2025-08-14T22:01:01.8349344Z "record" 2025-08-14T22:01:01.8349510Z ], 2025-08-14T22:01:01.8349670Z "time_stamp": 1755207153, 2025-08-14T22:01:01.8349872Z "repo": "pytorch/pytorch", 2025-08-14T22:01:01.8350427Z "workflow_id": 16976338999, 2025-08-14T22:01:01.8350627Z "run_attempt": 1, 2025-08-14T22:01:01.8350814Z "job_id": 48128261038, 2025-08-14T22:01:01.8351028Z "workflow_name": "inductor-periodic", 2025-08-14T22:01:01.8351469Z "job_name": "linux-jammy-cpu-py3.9-gcc11-inductor / test (cpu_inductor_amp_freezing_huggingface, 1, 1, linux.8xlarge.amx)", 2025-08-14T22:01:01.8351880Z "json_data": "{}" 2025-08-14T22:01:01.8352058Z } 2025-08-14T22:01:01.8352410Z Writing 1 documents to S3 ossci-utilization/util_metadata/v_1.5/pytorch/pytorch/16976338999/1/48128261038/metadata 2025-08-14T22:01:01.8352984Z Done! 
Finish writing document to S3 ossci-utilization/util_metadata/v_1.5/pytorch/pytorch/16976338999/1/48128261038/metadata 2025-08-14T22:01:01.8353577Z Writing 339 documents to S3 ossci-utilization/util_timeseries/v_1.5/pytorch/pytorch/16976338999/1/48128261038/time_series 2025-08-14T22:01:01.8354186Z Done! Finish writing document to S3 ossci-utilization/util_timeseries/v_1.5/pytorch/pytorch/16976338999/1/48128261038/time_series 2025-08-14T22:01:01.9618865Z ##[group]Run pytorch/test-infra/.github/actions/teardown-linux@main 2025-08-14T22:01:01.9619181Z with: 2025-08-14T22:01:01.9619352Z env: 2025-08-14T22:01:01.9619519Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:01:01.9619938Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:01:01.9620295Z DEVICE_NAME: 2025-08-14T22:01:01.9620462Z DEVICE_TYPE: 2025-08-14T22:01:01.9620631Z ##[endgroup] 2025-08-14T22:01:01.9632588Z ##[group]Run set -eou pipefail 2025-08-14T22:01:01.9632847Z set -eou pipefail 2025-08-14T22:01:01.9633051Z  2025-08-14T22:01:01.9633318Z echo "Holding runner for 2 hours until all ssh sessions have logged out" 2025-08-14T22:01:01.9633644Z for _ in $(seq 1440); do 2025-08-14T22:01:01.9633887Z  # Break if no ssh session exists anymore 2025-08-14T22:01:01.9634134Z  if [ "$(who)" = "" ]; then 2025-08-14T22:01:01.9634355Z  break 2025-08-14T22:01:01.9634572Z  fi 2025-08-14T22:01:01.9634740Z  echo "." 2025-08-14T22:01:01.9634927Z  sleep 5 2025-08-14T22:01:01.9635108Z done 2025-08-14T22:01:01.9640101Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:01:01.9640374Z env: 2025-08-14T22:01:01.9640550Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:01:01.9640879Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:01:01.9641213Z DEVICE_NAME: 2025-08-14T22:01:01.9641384Z DEVICE_TYPE: 2025-08-14T22:01:01.9641555Z ##[endgroup] 2025-08-14T22:01:01.9666242Z Holding runner for 2 hours until all ssh sessions have logged out 2025-08-14T22:01:01.9742479Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-08-14T22:01:01.9743131Z # ignore expansion of "docker ps -q" since it could be empty 2025-08-14T22:01:01.9743531Z # shellcheck disable=SC2046 2025-08-14T22:01:01.9743867Z docker stop $(docker ps -q) || true 2025-08-14T22:01:01.9744205Z # Prune all of the docker images 2025-08-14T22:01:01.9744510Z docker system prune -af 2025-08-14T22:01:01.9749764Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-08-14T22:01:01.9750043Z env: 2025-08-14T22:01:01.9750221Z GIT_DEFAULT_BRANCH: main 2025-08-14T22:01:01.9750545Z DOCKER_CONTAINER_ID: ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:01:01.9750883Z DEVICE_NAME: 2025-08-14T22:01:01.9751063Z DEVICE_TYPE: 2025-08-14T22:01:01.9751238Z ##[endgroup] 2025-08-14T22:01:12.8674717Z ca0b9dd31303 2025-08-14T22:01:13.1817538Z Deleted Containers: 2025-08-14T22:01:13.1818119Z ca0b9dd31303f22b51e5de61d0ae44441c1e171fe6f52356c85bf2a6591d5f61 2025-08-14T22:01:13.1818463Z 2025-08-14T22:01:20.8204021Z Deleted Images: 2025-08-14T22:01:20.8204768Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3.9-gcc11-inductor-benchmarks-bfa89110622ba7202628e9faac705f183070defe 2025-08-14T22:01:20.8205914Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image@sha256:4236794baba289041d240d08fd393bbd57497c3012e5e0ccd9fd98f61ebf35c6 2025-08-14T22:01:20.8206507Z deleted: 
sha256:0899ae453036ee7a91795ea95b1db61000579eeb74b140edab5976919ee64bbe 2025-08-14T22:01:20.8206940Z deleted: sha256:aa7b544271e9ba3105dabd1afb12e315887018f3471e03135c1d50e64cc550c4 2025-08-14T22:01:20.8207370Z deleted: sha256:4c685831817cc2fc6dfdfda1726df1f402222d8cdccc40daad3198cf8b17e3f4 2025-08-14T22:01:20.8207803Z deleted: sha256:cedf3fb09a62e68c6d7e22cedbce12e77166a50649d0269200ee0efce8a57b88 2025-08-14T22:01:20.8208253Z deleted: sha256:1b3a9a237b4153f8f523a85cead9d36e29717eb57182e2f75069788681627d95 2025-08-14T22:01:20.8208654Z deleted: sha256:67bd313103dfbe7fe0172e6f4f7ee420fad9743a64a1cc1cd20bc22250d3602c 2025-08-14T22:01:20.8209474Z deleted: sha256:b17820137ada46a2a726c67aa08cce73d2ead7c95db08575cf5e69bedb4b600d 2025-08-14T22:01:20.8210111Z deleted: sha256:b16c9bc40cc1cf924638323aece4168d6332cfae212dad2a431a584a44fe967c 2025-08-14T22:01:20.8210530Z deleted: sha256:ab35ed781133eb4aaa1b2478aea73fb80dc71bceffbe474b55e1a60fc6c5ffbe 2025-08-14T22:01:20.8210950Z deleted: sha256:b9d0b0720dd9c0bcb4f174ae6770a7c2fe540c6983872180f3a5e18300434cdb 2025-08-14T22:01:20.8211423Z deleted: sha256:f5d1a4f32d90030cc174d73b579758d28f95c992a8cf21360e5addee99dea169 2025-08-14T22:01:20.8211832Z deleted: sha256:4af408141f8591f4b69cef9b425b6caa3c4cbc62ced38b5d08f3150f0c8ff449 2025-08-14T22:01:20.8212235Z deleted: sha256:e0019e5c461051e54a9af37ae22b49cfd2c2e5366da57a20304f6ef89171a3b3 2025-08-14T22:01:20.8212672Z deleted: sha256:542f999b2cfc965b97861645356840864e9946fa2fa40f1f5c4c45684e91c239 2025-08-14T22:01:20.8213076Z deleted: sha256:633629aa3d4ae6472e222a1c0b2ceb729b0d84ccb48e12d52ba2d2987c9063e1 2025-08-14T22:01:20.8213480Z deleted: sha256:ea645aba1ba54baac43713f3df7f1b89dd119764a747273897eb2931fea42856 2025-08-14T22:01:20.8213906Z deleted: sha256:1f50e367efff88c7182b9dc3ff618c1cf7bd34edf2f31805e268c50fac02a627 2025-08-14T22:01:20.8214291Z deleted: sha256:aff22d7ae43d842befa617e2e5f9878d09a82b67c362b0c44a40a4c88be92120 2025-08-14T22:01:20.8214668Z deleted: sha256:4275d4addb77b473ed40194e42918cf2aeb484d1d8e25cf54d374392643a095c 2025-08-14T22:01:20.8215031Z deleted: sha256:66471f6c8dc869455ff193909110d824b5d65f7383877a7d0face6331b21fff3 2025-08-14T22:01:20.8215416Z deleted: sha256:8cfd2d55570494ff2b993725f5eb13d0440a5698fa905823ca1677d2d16febb8 2025-08-14T22:01:20.8215812Z deleted: sha256:5c8cf8b9c4a76f679994decc8800bc6eefd258a8dc6293a714d5e100fea3a1bc 2025-08-14T22:01:20.8216205Z deleted: sha256:1acc162c6b9de62d13ce7fd33bb9b134458f7e7dbe996e5442e0047ec8f70c80 2025-08-14T22:01:20.8216610Z deleted: sha256:044bab98f3bceb1948c626ce6bdd19d3ec8f9c5ad42a4f635dd685a7ae9c9024 2025-08-14T22:01:20.8217014Z deleted: sha256:2acb11a9448f13c2c2d29c4d0d4013e046862bd019cf5ec9fe04bdf35299f1dd 2025-08-14T22:01:20.8217403Z deleted: sha256:8e7b56334416233f301944000dec16952e13bb69296cc80e1031bfecaf6e7f9d 2025-08-14T22:01:20.8217799Z deleted: sha256:4a4d1ec727c43389a601aefccdaeff6b3bf54c0daefb12e0c2098c3e18b383ba 2025-08-14T22:01:20.8218196Z deleted: sha256:8b9ca4276331196a2f03c2fa3a87422d2042cf06011b49368c2335be7da829c1 2025-08-14T22:01:20.8218592Z deleted: sha256:5076357fd3cc8b06ed54a0f692362a38f1ebafa4843c0b0bf8021f9021d2e583 2025-08-14T22:01:20.8219002Z deleted: sha256:f9451fa0842798e2a67c059fda5124cafb401801bb8c40d03ae736ff3ef5ed20 2025-08-14T22:01:20.8219398Z deleted: sha256:52b716f02091d6af6b79e7b2e1f5bbd7391235993d415c7a852d6752220c8b65 2025-08-14T22:01:20.8219790Z deleted: sha256:748225161c361d3779c96eb7ae5ea0c33d35311f9445c371d62616b98e3426e8 2025-08-14T22:01:20.8220190Z deleted: 
sha256:5eeda1478a46d8d58267e8917422eb0a182a40c8bdfb4bfe0869923f8114c770 2025-08-14T22:01:20.8220594Z deleted: sha256:66d4cebb04304f556dd191b425a876f7dbbcde8c3c647af4ef47c10804e51f5a 2025-08-14T22:01:20.8221003Z deleted: sha256:0b526447174d22890be2bc866228e40989483b1102a0430b4ab3ad16dc6c7787 2025-08-14T22:01:20.8221459Z deleted: sha256:1aa31d55f8f9bb51f1eb702ba7d46ceda8290ed90e8e8cf299bb8a9179bf2ae2 2025-08-14T22:01:20.8221885Z deleted: sha256:dd1f47c8dc7518f303a91fc8aae81a512caff53987d5a89a378bb24c1c6d7707 2025-08-14T22:01:20.8222290Z deleted: sha256:d60f9527fcb284e73795a37d4f536badd451a2eade4c9314ebe549d31efcc876 2025-08-14T22:01:20.8222685Z deleted: sha256:f23ad0355704751b0f71a8900169354e3bf23a7b3f5fa2cd9b2478a561bfbb45 2025-08-14T22:01:20.8223093Z deleted: sha256:10e7acf6460743fcad0c1fff0bbd01158fbeb88151621c1e15ae5994f1c8ef55 2025-08-14T22:01:20.8223486Z deleted: sha256:f674e3067e97f1407f4cd55202d4c0c8641f02811550e65a00a875fc19354b75 2025-08-14T22:01:20.8223902Z deleted: sha256:8a9c75c896425ccd25101f0cf39316bec7779111954f44df726842bf583e907b 2025-08-14T22:01:20.8224303Z deleted: sha256:9730d30edfcaa135287479d80f1720b39c6f728228df6d0eb7f095e917cc16b6 2025-08-14T22:01:20.8224693Z deleted: sha256:2787e13cf97e870ca65312526c3000163ebf3da20fe59e5f5d53b1aeb4fb424b 2025-08-14T22:01:20.8225152Z deleted: sha256:d61197909174795bd69f8d5f534f1b086065d36b7aa6c5a50744eca6f8d6b12b 2025-08-14T22:01:20.8225557Z deleted: sha256:ecdfbb81e95b2ae2c8e9ab4ca72ba8564095caabb0512a47da8f866923f71bff 2025-08-14T22:01:20.8225938Z deleted: sha256:cd2d7c644df243742a0c0349af0d37570c06fdd1711ddc367e79514757a6d5cc 2025-08-14T22:01:20.8226347Z deleted: sha256:6703ab1ced70b30a87660c0dd778fe95fb90b04ed8461c2a331272aa54eb3499 2025-08-14T22:01:20.8226718Z deleted: sha256:b7088ce49d7df1d6fb18eee5fc5664e637c5649c89e581d972c76a83f60d0a62 2025-08-14T22:01:20.8227127Z deleted: sha256:d0d2786658af9907d8c4ecfa84fa9e2bd07131257264395b804deef744a5c39c 2025-08-14T22:01:20.8227581Z deleted: sha256:d46baf72d8e570e6004c6f95131cea6ede27eb01c213d8c1e8b263ab95fdfe95 2025-08-14T22:01:20.8227989Z deleted: sha256:0219ea0bd0e38d169ed596ed80807b0f70b609ec5f886d671c249d10575dff2c 2025-08-14T22:01:20.8228381Z deleted: sha256:77d1a1f15cf8ae85a4c5495d800378c307967004360814810fd13b07a74aee5e 2025-08-14T22:01:20.8228764Z deleted: sha256:47c77d89ce8782a94a6f5435b1611a76b47f830153ba4b462d3e08dcbdaa40f7 2025-08-14T22:01:20.8229169Z deleted: sha256:d5120b2e61fb0ccc32a2ad02fc0b2b908bc69f1f174268bde3d26d79ce46f046 2025-08-14T22:01:20.8229562Z deleted: sha256:65626052fd7e03a8e90c72072a54f0eaa43788cfcb0835ffb98b700be89b0567 2025-08-14T22:01:20.8229957Z deleted: sha256:05c09c0832c35f0128e0258b1d3069d7bb4b94ce58239faba5d585e49c34e904 2025-08-14T22:01:20.8230348Z deleted: sha256:2d6749fb2c30585eebb1d97e99318434ec34e0f7a4414e552fd4a44175f86839 2025-08-14T22:01:20.8230743Z deleted: sha256:2d65e2932810021e5b3cfedd89cfd851dd47fce63fbe5dc6959e59f3d8a98499 2025-08-14T22:01:20.8231143Z deleted: sha256:b2e71ddacad35b6caa3a77429bab51b654f6acaccc9e9263f1cb43edb8c53ac3 2025-08-14T22:01:20.8231557Z deleted: sha256:632a43100a629c40972b4da95fbbb581f29fe8b073a96386c72931d27ffbbefa 2025-08-14T22:01:20.8231964Z deleted: sha256:11964e5f5833fdf2bcc61c52f33d5aebf9b5504c6792baf58beb96b90398d10a 2025-08-14T22:01:20.8232361Z deleted: sha256:f0c1cb4c9e4655464b9b62b6589ac5005c2392213765ab4175bd61e3f6462643 2025-08-14T22:01:20.8232770Z deleted: sha256:5113aaee4b4d5ee45b58bcee467ac314112b02e4c4e5e9c3cc7a236dd308e9de 2025-08-14T22:01:20.8233175Z deleted: 
sha256:9cdc88c7b7fe728e15c72d0e8eef813ace31905b4b317a0a23f1334b6a22e604
2025-08-14T22:01:20.8233566Z deleted: sha256:8056a3da01752a91095e2d0afd80b625172f0915f22f7d998b9b926b9462dc5f
2025-08-14T22:01:20.8233946Z deleted: sha256:8a99968112e0edd39c242f3452b05d167911724468fdd9b18d11a8f5fa9c3ac8
2025-08-14T22:01:20.8234343Z deleted: sha256:6f70653bcfea9c1dd39aba76713adac0ac8f6f4c202387ff86a3ffe45d2079f2
2025-08-14T22:01:20.8234755Z deleted: sha256:9a0ed45f26188ecbfcf7658f46e29922b441969b2aded64d1d6b287b6de2e49c
2025-08-14T22:01:20.8235152Z deleted: sha256:f84c75780b110e68f7593fe9592456387118761b365a954a105aee72016adeac
2025-08-14T22:01:20.8235545Z deleted: sha256:1a5a81f8cbb945eee96e25ee8b4958d7140bb6751b86bc2e4a6aa9e18a16846c
2025-08-14T22:01:20.8235947Z deleted: sha256:7e072dc6aa8c1831ddc97ba8229235081976cb8036c06ee1320b33606e03f9a4
2025-08-14T22:01:20.8236379Z deleted: sha256:369af3627df8ecb48c51ea4fd3267e561b2f6821075ddce314e9485494447f16
2025-08-14T22:01:20.8236777Z deleted: sha256:4d49b99f2eee0f82788e33a9c771f75b1411b0b70ce47771fc1b3bc160f23961
2025-08-14T22:01:20.8237193Z deleted: sha256:fe04dcb9c711f36f9ed1df5b2d0854d30dc5abaa6e6cd493b85d4c2e2d2c3e1b
2025-08-14T22:01:20.8237589Z deleted: sha256:4800771a0435c52d6e480540ffa8a65ecc51fdc82a91302c1a373e6021bc37ca
2025-08-14T22:01:20.8237973Z deleted: sha256:90a2bf02e851326fc70d05470553ed33e578342d6e06bfa0cfaf331c4079b7e4
2025-08-14T22:01:20.8238206Z 
2025-08-14T22:01:20.8238303Z Total reclaimed space: 51.8GB
2025-08-14T22:01:20.8324239Z Post job cleanup.
2025-08-14T22:01:20.8357593Z Post job cleanup.
2025-08-14T22:01:20.9192927Z [command]/usr/bin/git version
2025-08-14T22:01:20.9236193Z git version 2.47.1
2025-08-14T22:01:20.9268130Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/49bf630b-f066-4780-ba1a-e55e511b94ef/.gitconfig'
2025-08-14T22:01:20.9277770Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/49bf630b-f066-4780-ba1a-e55e511b94ef' before making global git config changes
2025-08-14T22:01:20.9278522Z Adding repository directory to the temporary git global config as a safe directory
2025-08-14T22:01:20.9282471Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-08-14T22:01:20.9339399Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2025-08-14T22:01:20.9378816Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2025-08-14T22:01:20.9709779Z Entering 'android/libs/fbjni'
2025-08-14T22:01:20.9766731Z Entering 'third_party/FP16'
2025-08-14T22:01:20.9823708Z Entering 'third_party/FXdiv'
2025-08-14T22:01:20.9882043Z Entering 'third_party/NNPACK'
2025-08-14T22:01:20.9935011Z Entering 'third_party/NVTX'
2025-08-14T22:01:20.9991484Z Entering 'third_party/VulkanMemoryAllocator'
2025-08-14T22:01:21.0050330Z Entering 'third_party/XNNPACK'
2025-08-14T22:01:21.0120725Z Entering 'third_party/aiter'
2025-08-14T22:01:21.0177154Z Entering 'third_party/aiter/3rdparty/composable_kernel'
2025-08-14T22:01:21.0245802Z Entering 'third_party/benchmark'
2025-08-14T22:01:21.0296340Z Entering 'third_party/composable_kernel'
2025-08-14T22:01:21.0360596Z Entering 'third_party/cpp-httplib'
2025-08-14T22:01:21.0422771Z Entering 'third_party/cpuinfo'
2025-08-14T22:01:21.0484417Z Entering 'third_party/cudnn_frontend'
2025-08-14T22:01:21.0547421Z Entering 'third_party/cutlass'
2025-08-14T22:01:21.0609141Z Entering 'third_party/fbgemm' 2025-08-14T22:01:21.0665038Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T22:01:21.0723405Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T22:01:21.0783212Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T22:01:21.0843411Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T22:01:21.0910630Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T22:01:21.0966151Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T22:01:21.1023524Z Entering 'third_party/fbgemm/external/json' 2025-08-14T22:01:21.1079637Z Entering 'third_party/flash-attention' 2025-08-14T22:01:21.1149506Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T22:01:21.1207230Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T22:01:21.1294368Z Entering 'third_party/flatbuffers' 2025-08-14T22:01:21.1328000Z Entering 'third_party/fmt' 2025-08-14T22:01:21.1384558Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T22:01:21.1439979Z Entering 'third_party/gloo' 2025-08-14T22:01:21.1495181Z Entering 'third_party/googletest' 2025-08-14T22:01:21.1553315Z Entering 'third_party/ideep' 2025-08-14T22:01:21.1607704Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T22:01:21.1675414Z Entering 'third_party/ittapi' 2025-08-14T22:01:21.1733118Z Entering 'third_party/kineto' 2025-08-14T22:01:21.1785109Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T22:01:21.1842276Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-08-14T22:01:21.1897789Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T22:01:21.1957304Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T22:01:21.2012884Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T22:01:21.2066886Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T22:01:21.2123241Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T22:01:21.2181266Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T22:01:21.2239604Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T22:01:21.2296292Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T22:01:21.2352530Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T22:01:21.2406596Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T22:01:21.2465912Z Entering 'third_party/kleidiai' 2025-08-14T22:01:21.2521289Z Entering 'third_party/mimalloc' 2025-08-14T22:01:21.2581620Z Entering 'third_party/nlohmann' 2025-08-14T22:01:21.2644354Z Entering 'third_party/onnx' 2025-08-14T22:01:21.2713406Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T22:01:21.2777275Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T22:01:21.2836802Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T22:01:21.2892718Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T22:01:21.2952161Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T22:01:21.3008329Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T22:01:21.3066635Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T22:01:21.3121961Z Entering 
'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T22:01:21.3183034Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-08-14T22:01:21.3240956Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-08-14T22:01:21.3298786Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-08-14T22:01:21.3355440Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-08-14T22:01:21.3431375Z Entering 'third_party/pocketfft' 2025-08-14T22:01:21.3486731Z Entering 'third_party/protobuf' 2025-08-14T22:01:21.3546691Z Entering 'third_party/protobuf/third_party/benchmark' 2025-08-14T22:01:21.3602497Z Entering 'third_party/protobuf/third_party/googletest' 2025-08-14T22:01:21.3657615Z Entering 'third_party/psimd' 2025-08-14T22:01:21.3715620Z Entering 'third_party/pthreadpool' 2025-08-14T22:01:21.3770848Z Entering 'third_party/pybind11' 2025-08-14T22:01:21.3826915Z Entering 'third_party/python-peachpy' 2025-08-14T22:01:21.3882052Z Entering 'third_party/sleef' 2025-08-14T22:01:21.3938452Z Entering 'third_party/tensorpipe' 2025-08-14T22:01:21.3996347Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-08-14T22:01:21.4050225Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-08-14T22:01:21.4112135Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-08-14T22:01:21.4169356Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-08-14T22:01:21.4228362Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-08-14T22:01:21.4305215Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader 2025-08-14T22:01:21.4323323Z http.https://github.com/.extraheader 2025-08-14T22:01:21.4332900Z [command]/usr/bin/git config --local --unset-all http.https://github.com/.extraheader 2025-08-14T22:01:21.4363928Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :" 2025-08-14T22:01:21.4681158Z Entering 'android/libs/fbjni' 2025-08-14T22:01:21.4723030Z http.https://github.com/.extraheader 2025-08-14T22:01:21.4755721Z Entering 'third_party/FP16' 2025-08-14T22:01:21.4792942Z http.https://github.com/.extraheader 2025-08-14T22:01:21.4831045Z Entering 'third_party/FXdiv' 2025-08-14T22:01:21.4869338Z http.https://github.com/.extraheader 2025-08-14T22:01:21.4904852Z Entering 'third_party/NNPACK' 2025-08-14T22:01:21.4943637Z http.https://github.com/.extraheader 2025-08-14T22:01:21.4977282Z Entering 'third_party/NVTX' 2025-08-14T22:01:21.5020131Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5059127Z Entering 'third_party/VulkanMemoryAllocator' 2025-08-14T22:01:21.5097997Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5140440Z Entering 'third_party/XNNPACK' 2025-08-14T22:01:21.5176908Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5231139Z Entering 'third_party/aiter' 2025-08-14T22:01:21.5268308Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5301979Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-08-14T22:01:21.5339138Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5384910Z Entering 'third_party/benchmark' 2025-08-14T22:01:21.5422943Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5462396Z Entering 'third_party/composable_kernel' 2025-08-14T22:01:21.5499407Z 
http.https://github.com/.extraheader 2025-08-14T22:01:21.5542298Z Entering 'third_party/cpp-httplib' 2025-08-14T22:01:21.5581133Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5618237Z Entering 'third_party/cpuinfo' 2025-08-14T22:01:21.5658179Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5695182Z Entering 'third_party/cudnn_frontend' 2025-08-14T22:01:21.5732723Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5768623Z Entering 'third_party/cutlass' 2025-08-14T22:01:21.5807015Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5857160Z Entering 'third_party/fbgemm' 2025-08-14T22:01:21.5894228Z http.https://github.com/.extraheader 2025-08-14T22:01:21.5932450Z Entering 'third_party/fbgemm/external/asmjit' 2025-08-14T22:01:21.5971410Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6004319Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-08-14T22:01:21.6042161Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6083772Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-08-14T22:01:21.6120291Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6159961Z Entering 'third_party/fbgemm/external/cutlass' 2025-08-14T22:01:21.6197117Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6241378Z Entering 'third_party/fbgemm/external/googletest' 2025-08-14T22:01:21.6278644Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6318214Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-08-14T22:01:21.6355974Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6388787Z Entering 'third_party/fbgemm/external/json' 2025-08-14T22:01:21.6427845Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6463028Z Entering 'third_party/flash-attention' 2025-08-14T22:01:21.6500998Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6543112Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-08-14T22:01:21.6579394Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6624271Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-08-14T22:01:21.6660660Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6705063Z Entering 'third_party/flatbuffers' 2025-08-14T22:01:21.6748959Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6784845Z Entering 'third_party/fmt' 2025-08-14T22:01:21.6825095Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6864129Z Entering 'third_party/gemmlowp/gemmlowp' 2025-08-14T22:01:21.6899183Z http.https://github.com/.extraheader 2025-08-14T22:01:21.6937019Z Entering 'third_party/gloo' 2025-08-14T22:01:21.6975203Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7010169Z Entering 'third_party/googletest' 2025-08-14T22:01:21.7047626Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7084308Z Entering 'third_party/ideep' 2025-08-14T22:01:21.7123193Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7154887Z Entering 'third_party/ideep/mkl-dnn' 2025-08-14T22:01:21.7191759Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7235799Z Entering 'third_party/ittapi' 2025-08-14T22:01:21.7273334Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7311905Z Entering 'third_party/kineto' 2025-08-14T22:01:21.7347241Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7384288Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-08-14T22:01:21.7422119Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7458358Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 
2025-08-14T22:01:21.7494151Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7539472Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-08-14T22:01:21.7577680Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7610099Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-08-14T22:01:21.7647436Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7682550Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-08-14T22:01:21.7718888Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7754151Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-08-14T22:01:21.7790364Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7829979Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-08-14T22:01:21.7866202Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7899911Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-08-14T22:01:21.7940647Z http.https://github.com/.extraheader 2025-08-14T22:01:21.7979363Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-08-14T22:01:21.8017619Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8054787Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-08-14T22:01:21.8091920Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8132173Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-08-14T22:01:21.8168761Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8206716Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-08-14T22:01:21.8240461Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8280289Z Entering 'third_party/kleidiai' 2025-08-14T22:01:21.8318324Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8355607Z Entering 'third_party/mimalloc' 2025-08-14T22:01:21.8392768Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8430222Z Entering 'third_party/nlohmann' 2025-08-14T22:01:21.8466874Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8511643Z Entering 'third_party/onnx' 2025-08-14T22:01:21.8551440Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8598172Z Entering 'third_party/onnx/third_party/pybind11' 2025-08-14T22:01:21.8634090Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8679395Z Entering 'third_party/opentelemetry-cpp' 2025-08-14T22:01:21.8720904Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8756024Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-08-14T22:01:21.8792395Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8825407Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-08-14T22:01:21.8864151Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8896851Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-08-14T22:01:21.8934359Z http.https://github.com/.extraheader 2025-08-14T22:01:21.8969726Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-08-14T22:01:21.9006509Z http.https://github.com/.extraheader 2025-08-14T22:01:21.9043261Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-08-14T22:01:21.9080790Z http.https://github.com/.extraheader 2025-08-14T22:01:21.9113878Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-08-14T22:01:21.9151340Z http.https://github.com/.extraheader 2025-08-14T22:01:21.9183643Z Entering 
'third_party/opentelemetry-cpp/third_party/prometheus-cpp'
2025-08-14T22:01:21.9227341Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9257378Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'
2025-08-14T22:01:21.9293861Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9330495Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'
2025-08-14T22:01:21.9366944Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9405684Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg'
2025-08-14T22:01:21.9444860Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9495589Z Entering 'third_party/pocketfft'
2025-08-14T22:01:21.9532397Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9573534Z Entering 'third_party/protobuf'
2025-08-14T22:01:21.9609222Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9645162Z Entering 'third_party/protobuf/third_party/benchmark'
2025-08-14T22:01:21.9683514Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9721789Z Entering 'third_party/protobuf/third_party/googletest'
2025-08-14T22:01:21.9760994Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9790557Z Entering 'third_party/psimd'
2025-08-14T22:01:21.9832162Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9868184Z Entering 'third_party/pthreadpool'
2025-08-14T22:01:21.9908547Z http.https://github.com/.extraheader
2025-08-14T22:01:21.9943987Z Entering 'third_party/pybind11'
2025-08-14T22:01:21.9978661Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0042249Z Entering 'third_party/python-peachpy'
2025-08-14T22:01:22.0081815Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0118072Z Entering 'third_party/sleef'
2025-08-14T22:01:22.0156490Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0193861Z Entering 'third_party/tensorpipe'
2025-08-14T22:01:22.0231589Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0269674Z Entering 'third_party/tensorpipe/third_party/googletest'
2025-08-14T22:01:22.0309178Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0341710Z Entering 'third_party/tensorpipe/third_party/libnop'
2025-08-14T22:01:22.0376904Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0415077Z Entering 'third_party/tensorpipe/third_party/libuv'
2025-08-14T22:01:22.0449105Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0484727Z Entering 'third_party/tensorpipe/third_party/pybind11'
2025-08-14T22:01:22.0522378Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0554299Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang'
2025-08-14T22:01:22.0592365Z http.https://github.com/.extraheader
2025-08-14T22:01:22.0730851Z A job completed hook has been configured by the self-hosted runner administrator
2025-08-14T22:01:22.0743129Z ##[group]Run '/home/ec2-user/runner-scripts/after_job.sh'
2025-08-14T22:01:22.0746648Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-08-14T22:01:22.0746909Z ##[endgroup]
2025-08-14T22:01:22.0830780Z [!ALERT!] Swap in detected! [!ALERT!]
2025-08-14T22:01:32.0894997Z [!ALERT!] Swap out detected [!ALERT!]
2025-08-14T22:01:48.8521238Z Cleaning up orphan processes