2025-03-14T05:30:22.3830211Z Current runner version: '2.322.0'
2025-03-14T05:30:22.3836817Z Runner name: 'i-0995e781c94ad14d3'
2025-03-14T05:30:22.3837722Z Runner group name: 'Default'
2025-03-14T05:30:22.3838742Z Machine name: 'ip-10-0-78-218'
2025-03-14T05:30:22.3843828Z ##[group]GITHUB_TOKEN Permissions
2025-03-14T05:30:22.3846402Z Actions: read
2025-03-14T05:30:22.3847074Z Attestations: read
2025-03-14T05:30:22.3847732Z Checks: read
2025-03-14T05:30:22.3848366Z Contents: read
2025-03-14T05:30:22.3849015Z Deployments: read
2025-03-14T05:30:22.3849675Z Discussions: read
2025-03-14T05:30:22.3850322Z Issues: read
2025-03-14T05:30:22.3850946Z Metadata: read
2025-03-14T05:30:22.3851635Z Packages: read
2025-03-14T05:30:22.3852274Z Pages: read
2025-03-14T05:30:22.3852907Z PullRequests: read
2025-03-14T05:30:22.3853594Z RepositoryProjects: read
2025-03-14T05:30:22.3854332Z SecurityEvents: read
2025-03-14T05:30:22.3855019Z Statuses: read
2025-03-14T05:30:22.3855653Z ##[endgroup]
2025-03-14T05:30:22.3858533Z Secret source: Actions
2025-03-14T05:30:22.3859460Z Prepare workflow directory
2025-03-14T05:30:22.4326773Z Prepare all required actions
2025-03-14T05:30:22.4363403Z Getting action download info
2025-03-14T05:30:22.6059760Z Download action repository 'pytorch/test-infra@main' (SHA:de00dac6adc071cb2f9861380a0ed3947b93e5cc)
2025-03-14T05:30:23.4994399Z Download action repository 'pytorch/pytorch@main' (SHA:d4496346b901e9a4c3993bf6b2054014c7c0b731)
2025-03-14T05:30:37.9641595Z Download action repository 'aws-actions/configure-aws-credentials@v3' (SHA:50ac8dd1e1b10d09dac7b8727528b91bed831ac0)
2025-03-14T05:30:38.1744977Z Download action repository 'seemethere/upload-artifact-s3@v5' (SHA:baba72d0712b404f646cebe0730933554ebce96a)
2025-03-14T05:30:38.4378757Z Getting action download info
2025-03-14T05:30:38.5494877Z Download action repository 'actions/checkout@v4' (SHA:11bd71901bbe5b1630ceea73d27597364c9af683)
2025-03-14T05:30:38.8017158Z Getting action download info
2025-03-14T05:30:38.9162190Z Download action repository 'nick-fields/retry@v3.0.0' (SHA:7152eba30c6575329ac0576536151aca5a72780e)
2025-03-14T05:30:39.1357471Z Getting action download info
2025-03-14T05:30:39.2401928Z Download action repository 'nick-fields/retry@3e91a01664abd3c5cd539100d10d33b9c5b68482' (SHA:3e91a01664abd3c5cd539100d10d33b9c5b68482)
2025-03-14T05:30:39.3915615Z Getting action download info
2025-03-14T05:30:39.5130122Z Uses: pytorch/pytorch/.github/workflows/_linux-test.yml@refs/heads/main (aed0b7a742a2d7b7901790622829cbd2135049a4)
2025-03-14T05:30:39.5131964Z ##[group] Inputs
2025-03-14T05:30:39.5132348Z build-environment: linux-focal-cuda12.6-py3.10-gcc9-sm86
2025-03-14T05:30:39.5134220Z test-matrix: {"include": [{"config": "inductor_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}]}
2025-03-14T05:30:39.5136491Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849
2025-03-14T05:30:39.5137400Z sync-tag:
2025-03-14T05:30:39.5138118Z timeout-minutes: 240
2025-03-14T05:30:39.5138396Z use-gha:
2025-03-14T05:30:39.5138636Z dashboard-tag:
2025-03-14T05:30:39.5138901Z s3-bucket: gha-artifacts
2025-03-14T05:30:39.5139193Z aws-role-to-assume:
2025-03-14T05:30:39.5139722Z disable-monitor: false
2025-03-14T05:30:39.5140023Z ##[endgroup]
2025-03-14T05:30:39.5140587Z Complete job name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)
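The test-matrix input above is plain JSON, and the complete job name shows which entry this particular job picked up: config inductor_huggingface, shard 1 of 1, on linux.g5.4xlarge.nvidia.gpu. As a minimal sketch (not part of the workflow; it only assumes jq is available locally), the same lookup can be reproduced against the matrix value from the inputs:

# Print the matrix entry this job corresponds to (config=inductor_huggingface, shard 1 of 1).
jq '.include[] | select(.config == "inductor_huggingface" and .shard == 1)' <<'EOF'
{"include": [
  {"config": "inductor_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.g5.4xlarge.nvidia.gpu"},
  {"config": "inductor_timm", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"},
  {"config": "inductor_timm", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"},
  {"config": "inductor_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"},
  {"config": "inductor_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}
]}
EOF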
2025-03-14T05:30:39.5731973Z A job started hook has been configured by the self-hosted runner administrator
2025-03-14T05:30:39.5853963Z ##[group]Run '/home/ec2-user/runner-scripts/before_job.sh'
2025-03-14T05:30:39.5865673Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-03-14T05:30:39.5866284Z ##[endgroup]
2025-03-14T05:30:40.8207056Z Runner Type: linux.g5.4xlarge.nvidia.gpu
2025-03-14T05:30:40.8207544Z Instance Type: g5.4xlarge
2025-03-14T05:30:40.8207854Z AMI Name: unknown
2025-03-14T05:30:40.8245615Z AMI ID: ami-08b5b3a93ed654d19
2025-03-14T05:30:46.1805004Z ##[group]Run pytorch/test-infra/.github/actions/setup-ssh@main
2025-03-14T05:30:46.1805445Z with:
2025-03-14T05:30:46.1806092Z github-secret: ***
2025-03-14T05:30:46.1806798Z instructions: All testing is done inside the container, to start an interactive session run: docker exec -it $(docker container ps --format '{{.ID}}') bash
2025-03-14T05:30:46.1807531Z activate-with-label: false
2025-03-14T05:30:46.1807832Z label: with-ssh
2025-03-14T05:30:46.1808105Z remove-existing-keys: true
2025-03-14T05:30:46.1808408Z fail-silently: true
2025-03-14T05:30:46.1808664Z env:
2025-03-14T05:30:46.1808918Z GIT_DEFAULT_BRANCH: main
2025-03-14T05:30:46.1809201Z ##[endgroup]
2025-03-14T05:30:46.3003789Z Please see https://github.com/pytorch/pytorch/wiki/Debugging-using-with-ssh-for-Github-Actions for more info.
2025-03-14T05:30:46.3005286Z Not on pull request and ciflow reference could not be extracted, skipping adding ssh keys
2025-03-14T05:30:46.3171746Z ##[group]Run pytorch/pytorch/.github/actions/checkout-pytorch@main
2025-03-14T05:30:46.3172184Z with:
2025-03-14T05:30:46.3172426Z no-sudo: true
2025-03-14T05:30:46.3172706Z submodules: recursive
2025-03-14T05:30:46.3172987Z fetch-depth: 0
2025-03-14T05:30:46.3173228Z env:
2025-03-14T05:30:46.3173463Z GIT_DEFAULT_BRANCH: main
2025-03-14T05:30:46.3173745Z ##[endgroup]
2025-03-14T05:30:46.3275153Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-03-14T05:30:46.3276095Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-03-14T05:30:46.3290387Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-03-14T05:30:46.3290819Z env:
2025-03-14T05:30:46.3291060Z GIT_DEFAULT_BRANCH: main
2025-03-14T05:30:46.3291343Z ##[endgroup]
2025-03-14T05:30:46.3390799Z ##[group]Run # Use all available CPUs for fetching
2025-03-14T05:30:46.3391268Z # Use all available CPUs for fetching
2025-03-14T05:30:46.3391669Z cd "${GITHUB_WORKSPACE}"
2025-03-14T05:30:46.3392076Z git config --global fetch.parallel 0
2025-03-14T05:30:46.3392498Z git config --global submodule.fetchJobs 0
2025-03-14T05:30:46.3392863Z 
2025-03-14T05:30:46.3393306Z # Clean workspace. The default checkout action should also do this, but
2025-03-14T05:30:46.3393813Z # do it here as well just in case
2025-03-14T05:30:46.3394257Z if [[ -d .git ]]; then
2025-03-14T05:30:46.3394590Z   if [ -z "${NO_SUDO}" ]; then
2025-03-14T05:30:46.3394954Z     sudo git clean -ffdx
2025-03-14T05:30:46.3395258Z   else
2025-03-14T05:30:46.3395520Z     git clean -ffdx
2025-03-14T05:30:46.3395807Z   fi
2025-03-14T05:30:46.3396057Z fi
2025-03-14T05:30:46.3404812Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-03-14T05:30:46.3405201Z env:
2025-03-14T05:30:46.3405453Z GIT_DEFAULT_BRANCH: main
2025-03-14T05:30:46.3405752Z NO_SUDO: true
2025-03-14T05:30:46.3406005Z ##[endgroup]
2025-03-14T05:30:46.3542107Z ##[group]Run actions/checkout@v4
2025-03-14T05:30:46.3542438Z with:
2025-03-14T05:30:46.3542721Z ref: aed0b7a742a2d7b7901790622829cbd2135049a4
2025-03-14T05:30:46.3543075Z fetch-depth: 0
2025-03-14T05:30:46.3543347Z submodules: recursive
2025-03-14T05:30:46.3543640Z show-progress: false
2025-03-14T05:30:46.3543944Z repository: pytorch/pytorch
2025-03-14T05:30:46.3544349Z token: ***
2025-03-14T05:30:46.3544597Z ssh-strict: true
2025-03-14T05:30:46.3545087Z ssh-user: git
2025-03-14T05:30:46.3545356Z persist-credentials: true
2025-03-14T05:30:46.3545652Z clean: true
2025-03-14T05:30:46.3545924Z sparse-checkout-cone-mode: true
2025-03-14T05:30:46.3546237Z fetch-tags: false
2025-03-14T05:30:46.3546491Z lfs: false
2025-03-14T05:30:46.3546747Z set-safe-directory: true
2025-03-14T05:30:46.3547045Z env:
2025-03-14T05:30:46.3547290Z GIT_DEFAULT_BRANCH: main
2025-03-14T05:30:46.3547577Z ##[endgroup]
2025-03-14T05:30:46.4669474Z Syncing repository: pytorch/pytorch
2025-03-14T05:30:46.4670701Z ##[group]Getting Git version info
2025-03-14T05:30:46.4671179Z Working directory is '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-03-14T05:30:46.4671823Z [command]/usr/bin/git version
2025-03-14T05:30:46.4672841Z git version 2.47.1
2025-03-14T05:30:46.4696752Z ##[endgroup]
2025-03-14T05:30:46.4706808Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/29dc027a-ba1a-49d2-91b3-d3d9ae7582bb/.gitconfig'
2025-03-14T05:30:46.4726328Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/29dc027a-ba1a-49d2-91b3-d3d9ae7582bb' before making global git config changes
2025-03-14T05:30:46.4727282Z Adding repository directory to the temporary git global config as a safe directory
2025-03-14T05:30:46.4731653Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-03-14T05:30:46.4771246Z Deleting the contents of '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-03-14T05:30:46.4774392Z ##[group]Initializing the repository
2025-03-14T05:30:46.4778423Z [command]/usr/bin/git init /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-03-14T05:30:46.4821436Z hint: Using 'master' as the name for the initial branch. This default branch name
2025-03-14T05:30:46.4822026Z hint: is subject to change. To configure the initial branch name to use in all
2025-03-14T05:30:46.4822590Z hint: of your new repositories, which will suppress this warning, call:
2025-03-14T05:30:46.4823019Z hint:
2025-03-14T05:30:46.4823362Z hint:   git config --global init.defaultBranch <name>
2025-03-14T05:30:46.4823730Z hint:
2025-03-14T05:30:46.4824085Z hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
2025-03-14T05:30:46.4824657Z hint: 'development'. The just-created branch can be renamed via this command:
2025-03-14T05:30:46.4825097Z hint:
2025-03-14T05:30:46.4825348Z hint:   git branch -m <name>
2025-03-14T05:30:46.4825872Z Initialized empty Git repository in /home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/
2025-03-14T05:30:46.4834324Z [command]/usr/bin/git remote add origin https://github.com/pytorch/pytorch
2025-03-14T05:30:46.4867598Z ##[endgroup]
2025-03-14T05:30:46.4868314Z ##[group]Disabling automatic garbage collection
2025-03-14T05:30:46.4871968Z [command]/usr/bin/git config --local gc.auto 0
2025-03-14T05:30:46.4903462Z ##[endgroup]
2025-03-14T05:30:46.4903899Z ##[group]Setting up auth
2025-03-14T05:30:46.4910015Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2025-03-14T05:30:46.4941976Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2025-03-14T05:30:46.5305185Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader
2025-03-14T05:30:46.5336383Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :"
2025-03-14T05:30:46.5679930Z [command]/usr/bin/git config --local http.https://github.com/.extraheader AUTHORIZATION: basic ***
2025-03-14T05:30:46.5730653Z ##[endgroup]
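The masked extraheader set above is how actions/checkout authenticates the HTTPS fetches that follow, and because the job runs with fetch-depth: 0 the fetch below pulls every branch and tag. A rough sketch of the same two steps outside of CI follows; GH_TOKEN is a placeholder, and the basic-auth encoding of "x-access-token:<token>" is an assumption about how the action builds the masked credential:

# Attach a basic-auth header to every github.com fetch in this clone (hypothetical GH_TOKEN).
AUTH_B64="$(printf 'x-access-token:%s' "$GH_TOKEN" | base64 | tr -d '\n')"
git config --local http.https://github.com/.extraheader "AUTHORIZATION: basic ${AUTH_B64}"
# Mirror the un-shallow fetch issued below: all branches and tags, submodules deferred.
git -c protocol.version=2 fetch --prune --no-recurse-submodules origin \
  '+refs/heads/*:refs/remotes/origin/*' '+refs/tags/*:refs/tags/*'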
2025-03-14T05:30:46.5731123Z ##[group]Fetching the repository
2025-03-14T05:30:46.5738522Z [command]/usr/bin/git -c protocol.version=2 fetch --prune --no-recurse-submodules origin +refs/heads/*:refs/remotes/origin/* +refs/tags/*:refs/tags/*
2025-03-14T05:31:37.1032202Z From https://github.com/pytorch/pytorch
2025-03-14T05:31:37.1034464Z * [new branch] 2.1-dynamic-doc -> origin/2.1-dynamic-doc
2025-03-14T05:31:37.1035058Z * [new branch] 2.6.0.dev20241004+ -> origin/2.6.0.dev20241004+
2025-03-14T05:31:37.1035640Z * [new branch] 20250219_e8m0_intermediate -> origin/20250219_e8m0_intermediate
2025-03-14T05:31:37.1038976Z * [new branch] 20250219_test -> origin/20250219_test
2025-03-14T05:31:37.1039781Z * [new branch] Adjust-Description-for-linux-binary-test-Workflow -> origin/Adjust-Description-for-linux-binary-test-Workflow
2025-03-14T05:31:37.1041753Z * [new branch] Chillee-patch-5 -> origin/Chillee-patch-5
2025-03-14T05:31:37.1044307Z * [new branch] Flamefire-patch-1 -> origin/Flamefire-patch-1
2025-03-14T05:31:37.1046659Z * [new branch] HDCharles-2.6.0-release-notes -> origin/HDCharles-2.6.0-release-notes
2025-03-14T05:31:37.1049663Z * [new branch] JackCaoG/add_new_lazy_counter_macro -> origin/JackCaoG/add_new_lazy_counter_macro
2025-03-14T05:31:37.1052429Z * [new branch] JackCaoG/dynamo_make_fx_non_core_aten_ops -> origin/JackCaoG/dynamo_make_fx_non_core_aten_ops
2025-03-14T05:31:37.1053809Z * [new branch] JackCaoG/update_dynamo_doc -> origin/JackCaoG/update_dynamo_doc
2025-03-14T05:31:37.1056268Z * [new branch] JackCaoG/update_xla_pin_to_skip_test -> origin/JackCaoG/update_xla_pin_to_skip_test
2025-03-14T05:31:37.1058815Z * [new branch] JackCaoG/update_xla_pin_to_skip_test2 -> origin/JackCaoG/update_xla_pin_to_skip_test2
2025-03-14T05:31:37.1061027Z * [new branch] NicolasHug-patch-2 -> origin/NicolasHug-patch-2
2025-03-14T05:31:37.1063422Z * [new branch] PR-AOTInductorNoneBug -> origin/PR-AOTInductorNoneBug
2025-03-14T05:31:37.1065832Z * [new branch] PR-AOTInductorNoneBugFix -> 
origin/PR-AOTInductorNoneBugFix 2025-03-14T05:31:37.1068189Z * [new branch] PR-FixConfigsIssue -> origin/PR-FixConfigsIssue 2025-03-14T05:31:37.1070921Z * [new branch] PR-NoneBugFix-viable -> origin/PR-NoneBugFix-viable 2025-03-14T05:31:37.1073091Z * [new branch] PR-ResetToZero -> origin/PR-ResetToZero 2025-03-14T05:31:37.1075500Z * [new branch] Remove-linux_t4g_2xlarge-Usage -> origin/Remove-linux_t4g_2xlarge-Usage 2025-03-14T05:31:37.1077570Z * [new branch] Revert-PR-110949 -> origin/Revert-PR-110949 2025-03-14T05:31:37.1079833Z * [new branch] Update-Flash-Packaging -> origin/Update-Flash-Packaging 2025-03-14T05:31:37.1082701Z * [new branch] Valentine/flash_attention_bf16 -> origin/Valentine/flash_attention_bf16 2025-03-14T05:31:37.1085464Z * [new branch] abock/onnx-1.15.0-validation -> origin/abock/onnx-1.15.0-validation 2025-03-14T05:31:37.1087719Z * [new branch] abock/ort-nightly==1.16.0.dev20230908001 -> origin/abock/ort-nightly==1.16.0.dev20230908001 2025-03-14T05:31:37.1089821Z * [new branch] add-android-build-workflow -> origin/add-android-build-workflow 2025-03-14T05:31:37.1091849Z * [new branch] add-assign -> origin/add-assign 2025-03-14T05:31:37.1094117Z * [new branch] add_broadcast_functional_collective -> origin/add_broadcast_functional_collective 2025-03-14T05:31:37.1096522Z * [new branch] add_from_group_doc_and_test -> origin/add_from_group_doc_and_test 2025-03-14T05:31:37.1098422Z * [new branch] add_mha_to_autocast_policy -> origin/add_mha_to_autocast_policy 2025-03-14T05:31:37.1100614Z * [new branch] add_non_parallel_model_comparison -> origin/add_non_parallel_model_comparison 2025-03-14T05:31:37.1102961Z * [new branch] add_test_to_show_view_gap -> origin/add_test_to_show_view_gap 2025-03-14T05:31:37.1104494Z * [new branch] add_windows_testing_back -> origin/add_windows_testing_back 2025-03-14T05:31:37.1106936Z * [new branch] addmm-heuristic -> origin/addmm-heuristic 2025-03-14T05:31:37.1109034Z * [new branch] addsimde -> origin/addsimde 2025-03-14T05:31:37.1112230Z * [new branch] adi/gemm_bf16f32 -> origin/adi/gemm_bf16f32 2025-03-14T05:31:37.1114547Z * [new branch] ah-globalfeedback-hook -> origin/ah-globalfeedback-hook 2025-03-14T05:31:37.1117394Z * [new branch] alanwaketan/pin2 -> origin/alanwaketan/pin2 2025-03-14T05:31:37.1119641Z * [new branch] albanD-patch-1 -> origin/albanD-patch-1 2025-03-14T05:31:37.1121821Z * [new branch] albanD-patch-2 -> origin/albanD-patch-2 2025-03-14T05:31:37.1123970Z * [new branch] alt-disable -> origin/alt-disable 2025-03-14T05:31:37.1126917Z * [new branch] angelayi/aot_inductor_bench_comp_time -> origin/angelayi/aot_inductor_bench_comp_time 2025-03-14T05:31:37.1129008Z * [new branch] angelayi/aot_inductor_benchmark -> origin/angelayi/aot_inductor_benchmark 2025-03-14T05:31:37.1131581Z * [new branch] angelayi/aot_inductor_torch -> origin/angelayi/aot_inductor_torch 2025-03-14T05:31:37.1133402Z * [new branch] angelayi/aoti_additional_files -> origin/angelayi/aoti_additional_files 2025-03-14T05:31:37.1135459Z * [new branch] angelayi/aotinductor_const -> origin/angelayi/aotinductor_const 2025-03-14T05:31:37.1137974Z * [new branch] angelayi/aotinductor_const_name -> origin/angelayi/aotinductor_const_name 2025-03-14T05:31:37.1140386Z * [new branch] angelayi/attr_proxy -> origin/angelayi/attr_proxy 2025-03-14T05:31:37.1142650Z * [new branch] angelayi/benchmark_skip -> origin/angelayi/benchmark_skip 2025-03-14T05:31:37.1144780Z * [new branch] angelayi/bincount -> origin/angelayi/bincount 2025-03-14T05:31:37.1146965Z * [new branch] 
angelayi/change_pytree_serialization -> origin/angelayi/change_pytree_serialization 2025-03-14T05:31:37.1149025Z * [new branch] angelayi/constraint -> origin/angelayi/constraint 2025-03-14T05:31:37.1151191Z * [new branch] angelayi/cp107981 -> origin/angelayi/cp107981 2025-03-14T05:31:37.1153379Z * [new branch] angelayi/cpp_loader -> origin/angelayi/cpp_loader 2025-03-14T05:31:37.1155869Z * [new branch] angelayi/customop -> origin/angelayi/customop 2025-03-14T05:31:37.1158045Z * [new branch] angelayi/default_serialized -> origin/angelayi/default_serialized 2025-03-14T05:31:37.1160090Z * [new branch] angelayi/distribby -> origin/angelayi/distribby 2025-03-14T05:31:37.1162355Z * [new branch] angelayi/distribution -> origin/angelayi/distribution 2025-03-14T05:31:37.1164451Z * [new branch] angelayi/docs -> origin/angelayi/docs 2025-03-14T05:31:37.1166743Z * [new branch] angelayi/draft_logger -> origin/angelayi/draft_logger 2025-03-14T05:31:37.1169181Z * [new branch] angelayi/embed_constants -> origin/angelayi/embed_constants 2025-03-14T05:31:37.1171206Z * [new branch] angelayi/export_custom_op_rst -> origin/angelayi/export_custom_op_rst 2025-03-14T05:31:37.1173411Z * [new branch] angelayi/fail_models_temp -> origin/angelayi/fail_models_temp 2025-03-14T05:31:37.1175478Z * [new branch] angelayi/fake -> origin/angelayi/fake 2025-03-14T05:31:37.1177649Z * [new branch] angelayi/fix3 -> origin/angelayi/fix3 2025-03-14T05:31:37.1179982Z * [new branch] angelayi/kwarg_input -> origin/angelayi/kwarg_input 2025-03-14T05:31:37.1182246Z * [new branch] angelayi/logging.bak -> origin/angelayi/logging.bak 2025-03-14T05:31:37.1184424Z * [new branch] angelayi/logging2 -> origin/angelayi/logging2 2025-03-14T05:31:37.1186601Z * [new branch] angelayi/no_so_weight -> origin/angelayi/no_so_weight 2025-03-14T05:31:37.1188855Z * [new branch] angelayi/provenance_id -> origin/angelayi/provenance_id 2025-03-14T05:31:37.1190973Z * [new branch] angelayi/pytree2 -> origin/angelayi/pytree2 2025-03-14T05:31:37.1193147Z * [new branch] angelayi/pytree_namedtuple -> origin/angelayi/pytree_namedtuple 2025-03-14T05:31:37.1195629Z * [new branch] angelayi/register_dataclass -> origin/angelayi/register_dataclass 2025-03-14T05:31:37.1197719Z * [new branch] angelayi/remove_aoti_unlift -> origin/angelayi/remove_aoti_unlift 2025-03-14T05:31:37.1199994Z * [new branch] angelayi/shape -> origin/angelayi/shape 2025-03-14T05:31:37.1202138Z * [new branch] angelayi/symint_input -> origin/angelayi/symint_input 2025-03-14T05:31:37.1204371Z * [new branch] angelayi/test113041 -> origin/angelayi/test113041 2025-03-14T05:31:37.1206957Z * [new branch] angelayi/torch_size -> origin/angelayi/torch_size 2025-03-14T05:31:37.1208785Z * [new branch] angelayi/transpose_ -> origin/angelayi/transpose_ 2025-03-14T05:31:37.1210970Z * [new branch] angelayi/update_schema_msg -> origin/angelayi/update_schema_msg 2025-03-14T05:31:37.1213213Z * [new branch] atalman-inductor-perf-cu124 -> origin/atalman-inductor-perf-cu124 2025-03-14T05:31:37.1215487Z * [new branch] atalman-inductor-perf-cu124.1 -> origin/atalman-inductor-perf-cu124.1 2025-03-14T05:31:37.1217635Z * [new branch] atalman-patch-1 -> origin/atalman-patch-1 2025-03-14T05:31:37.1219901Z * [new branch] atalman-patch-10 -> origin/atalman-patch-10 2025-03-14T05:31:37.1222100Z * [new branch] atalman-patch-2 -> origin/atalman-patch-2 2025-03-14T05:31:37.1224395Z * [new branch] atalman-patch-3 -> origin/atalman-patch-3 2025-03-14T05:31:37.1226500Z * [new branch] atalman-patch-5 -> origin/atalman-patch-5 
2025-03-14T05:31:37.1228780Z * [new branch] atalman-patch-6 -> origin/atalman-patch-6 2025-03-14T05:31:37.1231047Z * [new branch] atalman-patch-7 -> origin/atalman-patch-7 2025-03-14T05:31:37.1233157Z * [new branch] atalman-patch-8 -> origin/atalman-patch-8 2025-03-14T05:31:37.1235500Z * [new branch] atalman-patch-9 -> origin/atalman-patch-9 2025-03-14T05:31:37.1237772Z * [new branch] atalman_inductor_2.3.0 -> origin/atalman_inductor_2.3.0 2025-03-14T05:31:37.1239877Z * [new branch] atalman_inductor_2.3.1 -> origin/atalman_inductor_2.3.1 2025-03-14T05:31:37.1241917Z * [new branch] atalman_inductor_2.4.0 -> origin/atalman_inductor_2.4.0 2025-03-14T05:31:37.1244149Z * [new branch] atalman_inductor_2.4.x -> origin/atalman_inductor_2.4.x 2025-03-14T05:31:37.1246255Z * [new branch] avoid_record_ag_rs -> origin/avoid_record_ag_rs 2025-03-14T05:31:37.1249157Z * [new branch] bahuang/make_fallback -> origin/bahuang/make_fallback 2025-03-14T05:31:37.1251992Z * [new branch] base/1.5 -> origin/base/1.5 2025-03-14T05:31:37.1254163Z * [new branch] base_inductor_opt_flag -> origin/base_inductor_opt_flag 2025-03-14T05:31:37.1256479Z * [new branch] batching_sdpa_efficient_attention -> origin/batching_sdpa_efficient_attention 2025-03-14T05:31:37.1258548Z * [new branch] benchmark-updates -> origin/benchmark-updates 2025-03-14T05:31:37.1261322Z * [new branch] bertmaher/pinbump26 -> origin/bertmaher/pinbump26 2025-03-14T05:31:37.1264057Z * [new branch] bertrand/cutlass -> origin/bertrand/cutlass 2025-03-14T05:31:37.1266911Z * [new branch] bf/cg-disable-tts-angular -> origin/bf/cg-disable-tts-angular 2025-03-14T05:31:37.1269236Z * [new branch] bf/cg-partition -> origin/bf/cg-partition 2025-03-14T05:31:37.1271290Z * [new branch] bf/cg-prototype -> origin/bf/cg-prototype 2025-03-14T05:31:37.1273393Z * [new branch] bf/cg-remove-check -> origin/bf/cg-remove-check 2025-03-14T05:31:37.1275625Z * [new branch] bf/cg-skip-unbacked-symint-msg -> origin/bf/cg-skip-unbacked-symint-msg 2025-03-14T05:31:37.1277515Z * [new branch] bf/cudagraph -> origin/bf/cudagraph 2025-03-14T05:31:37.1280113Z * [new branch] bf/cudagraph-disable-input-mutation -> origin/bf/cudagraph-disable-input-mutation 2025-03-14T05:31:37.1282809Z * [new branch] bf/cudagraph-enable-input-mutation-support-benchmark -> origin/bf/cudagraph-enable-input-mutation-support-benchmark 2025-03-14T05:31:37.1284426Z * [new branch] bf/cudagraph-partition -> origin/bf/cudagraph-partition 2025-03-14T05:31:37.1286990Z * [new branch] bf/donated-buffer-bench -> origin/bf/donated-buffer-bench 2025-03-14T05:31:37.1289022Z * [new branch] bf/fa-embedding-16 -> origin/bf/fa-embedding-16 2025-03-14T05:31:37.1291218Z * [new branch] bf/fd-non-one-num-head -> origin/bf/fd-non-one-num-head 2025-03-14T05:31:37.1293402Z * [new branch] bf/reduce-scatter-copy-in -> origin/bf/reduce-scatter-copy-in 2025-03-14T05:31:37.1295516Z * [new branch] bf/remove-check-55b0c39d -> origin/bf/remove-check-55b0c39d 2025-03-14T05:31:37.1297623Z * [new branch] bisect_perf_hf_T5_3acc6eac492 -> origin/bisect_perf_hf_T5_3acc6eac492 2025-03-14T05:31:37.1299796Z * [new branch] bisect_perf_hf_T5_3fcf66f61fb -> origin/bisect_perf_hf_T5_3fcf66f61fb 2025-03-14T05:31:37.1301852Z * [new branch] bisect_perf_hf_T5_4009d154129 -> origin/bisect_perf_hf_T5_4009d154129 2025-03-14T05:31:37.1303883Z * [new branch] bisect_perf_hf_T5_40d0740e73d -> origin/bisect_perf_hf_T5_40d0740e73d 2025-03-14T05:31:37.1306065Z * [new branch] bisect_perf_hf_T5_5268754e -> origin/bisect_perf_hf_T5_5268754e 2025-03-14T05:31:37.1308114Z * 
[new branch] bisect_perf_hf_T5_7d89a8d385c -> origin/bisect_perf_hf_T5_7d89a8d385c 2025-03-14T05:31:37.1310162Z * [new branch] bisect_perf_hf_T5_b7a25c1ee7c -> origin/bisect_perf_hf_T5_b7a25c1ee7c 2025-03-14T05:31:37.1312249Z * [new branch] bisect_perf_hf_T5_c25b201583f -> origin/bisect_perf_hf_T5_c25b201583f 2025-03-14T05:31:37.1314393Z * [new branch] bisect_perf_hf_T5_c93e57efac0 -> origin/bisect_perf_hf_T5_c93e57efac0 2025-03-14T05:31:37.1316527Z * [new branch] bisect_perf_hf_T5_ca9813ea149 -> origin/bisect_perf_hf_T5_ca9813ea149 2025-03-14T05:31:37.1318527Z * [new branch] bisect_perf_hf_T5_d65f194a -> origin/bisect_perf_hf_T5_d65f194a 2025-03-14T05:31:37.1320642Z * [new branch] bisect_perf_hf_T5_da94ab0b -> origin/bisect_perf_hf_T5_da94ab0b 2025-03-14T05:31:37.1322718Z * [new branch] bisect_perf_hf_T5_da94ab0b_new -> origin/bisect_perf_hf_T5_da94ab0b_new 2025-03-14T05:31:37.1324923Z * [new branch] bisect_perf_hf_T5_db4e8a1d8a8 -> origin/bisect_perf_hf_T5_db4e8a1d8a8 2025-03-14T05:31:37.1327012Z * [new branch] bisect_perf_hf_T5_e0d97e936a2 -> origin/bisect_perf_hf_T5_e0d97e936a2 2025-03-14T05:31:37.1329069Z * [new branch] bisect_perf_hf_T5_f23621ec563 -> origin/bisect_perf_hf_T5_f23621ec563 2025-03-14T05:31:37.1331862Z * [new branch] bowbao/beartype_fix_2.1.1 -> origin/bowbao/beartype_fix_2.1.1 2025-03-14T05:31:37.1333833Z * [new branch] bowbao/bench_updates -> origin/bowbao/bench_updates 2025-03-14T05:31:37.1336016Z * [new branch] bowbao/bench_updates_stage -> origin/bowbao/bench_updates_stage 2025-03-14T05:31:37.1338025Z * [new branch] bowbao/benchmark_test_data -> origin/bowbao/benchmark_test_data 2025-03-14T05:31:37.1340108Z * [new branch] bowbao/dort_rewriter -> origin/bowbao/dort_rewriter 2025-03-14T05:31:37.1342192Z * [new branch] bowbao/skip_decomp -> origin/bowbao/skip_decomp 2025-03-14T05:31:37.1344143Z * [new branch] bowbao/wip_prs -> origin/bowbao/wip_prs 2025-03-14T05:31:37.1346962Z * [new branch] brenocfg/fix-meta-opinfo -> origin/brenocfg/fix-meta-opinfo 2025-03-14T05:31:37.1349688Z * [new branch] brister/3d_permute_block_ptr -> origin/brister/3d_permute_block_ptr 2025-03-14T05:31:37.1351746Z * [new branch] brister/always_tiled_reduction -> origin/brister/always_tiled_reduction 2025-03-14T05:31:37.1353707Z * [new branch] brister/doc_bucketize -> origin/brister/doc_bucketize 2025-03-14T05:31:37.1355862Z * [new branch] brister/loop_order -> origin/brister/loop_order 2025-03-14T05:31:37.1358205Z * [new branch] brister/tiled_reduction_no_numel_check -> origin/brister/tiled_reduction_no_numel_check 2025-03-14T05:31:37.1359985Z * [new branch] brister/wrapper_ir -> origin/brister/wrapper_ir 2025-03-14T05:31:37.1362050Z * [new branch] ca_0431d47eaa -> origin/ca_0431d47eaa 2025-03-14T05:31:37.1364133Z * [new branch] ca_fix_0431d47eaa -> origin/ca_fix_0431d47eaa 2025-03-14T05:31:37.1366381Z * [new branch] cherry-pick-111576 -> origin/cherry-pick-111576 2025-03-14T05:31:37.1368881Z * [new branch] cherry-pick-148663-by-pytorch_bot_bot_ -> origin/cherry-pick-148663-by-pytorch_bot_bot_ 2025-03-14T05:31:37.1370896Z * [new branch] cherry-pick-149025-by-pytorch_bot_bot_ -> origin/cherry-pick-149025-by-pytorch_bot_bot_ 2025-03-14T05:31:37.1373239Z * [new branch] cherry-pick-149092-by-pytorch_bot_bot_ -> origin/cherry-pick-149092-by-pytorch_bot_bot_ 2025-03-14T05:31:37.1375245Z * [new branch] ci_pin -> origin/ci_pin 2025-03-14T05:31:37.1377453Z * [new branch] ckluk2-compileThread-1 -> origin/ckluk2-compileThread-1 2025-03-14T05:31:37.1379623Z * [new branch] ckluk2-compileThread-2 -> 
origin/ckluk2-compileThread-2 2025-03-14T05:31:37.1381702Z * [new branch] ckluk2-compileThread-64 -> origin/ckluk2-compileThread-64 2025-03-14T05:31:37.1384000Z * [new branch] ckluk2-test-1 -> origin/ckluk2-test-1 2025-03-14T05:31:37.1386288Z * [new branch] compile_fsdp2_disable_stream_and_event -> origin/compile_fsdp2_disable_stream_and_event 2025-03-14T05:31:37.1388517Z * [new branch] condition-branch-in-debug-handler -> origin/condition-branch-in-debug-handler 2025-03-14T05:31:37.1390581Z * [new branch] consolidate-is-qat -> origin/consolidate-is-qat 2025-03-14T05:31:37.1392654Z * [new branch] copy_graph -> origin/copy_graph 2025-03-14T05:31:37.1395518Z * [new branch] cpio/fix_new_ami_tests -> origin/cpio/fix_new_ami_tests 2025-03-14T05:31:37.1397513Z * [new branch] cpio/fix_unit_test -> origin/cpio/fix_unit_test 2025-03-14T05:31:37.1399562Z * [new branch] cse-source -> origin/cse-source 2025-03-14T05:31:37.1402278Z * [new branch] csl/3proc -> origin/csl/3proc 2025-03-14T05:31:37.1404432Z * [new branch] csl/always_produce_xml -> origin/csl/always_produce_xml 2025-03-14T05:31:37.1406589Z * [new branch] csl/build_experiment_max_jobs -> origin/csl/build_experiment_max_jobs 2025-03-14T05:31:37.1408637Z * [new branch] csl/build_test_more_procs -> origin/csl/build_test_more_procs 2025-03-14T05:31:37.1410642Z * [new branch] csl/build_test_more_procs2 -> origin/csl/build_test_more_procs2 2025-03-14T05:31:37.1413041Z * [new branch] csl/checkout_more_procs -> origin/csl/checkout_more_procs 2025-03-14T05:31:37.1415347Z * [new branch] csl/cutlass_bazel -> origin/csl/cutlass_bazel 2025-03-14T05:31:37.1417870Z * [new branch] csl/disableautotune -> origin/csl/disableautotune 2025-03-14T05:31:37.1420181Z * [new branch] csl/fix_rerun_disabled_tests_upload -> origin/csl/fix_rerun_disabled_tests_upload 2025-03-14T05:31:37.1422302Z * [new branch] csl/inductortest_max_autotune -> origin/csl/inductortest_max_autotune 2025-03-14T05:31:37.1424306Z * [new branch] csl/lint_dockerimg -> origin/csl/lint_dockerimg 2025-03-14T05:31:37.1426324Z * [new branch] csl/logchanges -> origin/csl/logchanges 2025-03-14T05:31:37.1428523Z * [new branch] csl/mps_sharding -> origin/csl/mps_sharding 2025-03-14T05:31:37.1430644Z * [new branch] csl/multigpufix -> origin/csl/multigpufix 2025-03-14T05:31:37.1433023Z * [new branch] csl/no_clean_workspace -> origin/csl/no_clean_workspace 2025-03-14T05:31:37.1434955Z * [new branch] csl/no_conda_cmake -> origin/csl/no_conda_cmake 2025-03-14T05:31:37.1437140Z * [new branch] csl/numpy222 -> origin/csl/numpy222 2025-03-14T05:31:37.1439256Z * [new branch] csl/pytest_timeout -> origin/csl/pytest_timeout 2025-03-14T05:31:37.1441440Z * [new branch] csl/rerun_disabled_tests_print_log -> origin/csl/rerun_disabled_tests_print_log 2025-03-14T05:31:37.1443444Z * [new branch] csl/revert -> origin/csl/revert 2025-03-14T05:31:37.1445694Z * [new branch] csl/sharding_build_env -> origin/csl/sharding_build_env 2025-03-14T05:31:37.1447790Z * [new branch] csl/slowtesttimeout -> origin/csl/slowtesttimeout 2025-03-14T05:31:37.1449956Z * [new branch] csl/some_super_setup -> origin/csl/some_super_setup 2025-03-14T05:31:37.1452000Z * [new branch] csl/stdmakeunique -> origin/csl/stdmakeunique 2025-03-14T05:31:37.1454147Z * [new branch] csl/sudo_clean_workspace -> origin/csl/sudo_clean_workspace 2025-03-14T05:31:37.1456270Z * [new branch] csl/td_test_cpp_extensions -> origin/csl/td_test_cpp_extensions 2025-03-14T05:31:37.1458322Z * [new branch] csl/tensoboardpip -> origin/csl/tensoboardpip 
2025-03-14T05:31:37.1460505Z * [new branch] csl/test_checkout_git_changes -> origin/csl/test_checkout_git_changes 2025-03-14T05:31:37.1462551Z * [new branch] csl/trymerge_flush -> origin/csl/trymerge_flush 2025-03-14T05:31:37.1464857Z * [new branch] csl/trymerge_initial_comment_stack -> origin/csl/trymerge_initial_comment_stack 2025-03-14T05:31:37.1466955Z * [new branch] csl/update_gh_runners_ubuntu2004 -> origin/csl/update_gh_runners_ubuntu2004 2025-03-14T05:31:37.1469073Z * [new branch] csl/use_ninja -> origin/csl/use_ninja 2025-03-14T05:31:37.1471515Z * [new branch] csl/windowsbat -> origin/csl/windowsbat 2025-03-14T05:31:37.1473279Z * [new branch] cutlass-template-fix-rocm -> origin/cutlass-template-fix-rocm 2025-03-14T05:31:37.1475338Z * [new branch] d4l3k/fsdp_wait -> origin/d4l3k/fsdp_wait 2025-03-14T05:31:37.1477208Z * [new branch] danthe3rd-patch-1 -> origin/danthe3rd-patch-1 2025-03-14T05:31:37.1479998Z * [new branch] daxia6/fix/doc_string -> origin/daxia6/fix/doc_string 2025-03-14T05:31:37.1482374Z * [new branch] desertfire/test_cpp_wrapper -> origin/desertfire/test_cpp_wrapper 2025-03-14T05:31:37.1484133Z * [new branch] desertfire/torchgen_support_default_arg -> origin/desertfire/torchgen_support_default_arg 2025-03-14T05:31:37.1485579Z * [new branch] desertfire/triton-cpu-for-aarch64 -> origin/desertfire/triton-cpu-for-aarch64 2025-03-14T05:31:37.1487255Z * [new branch] desertfire/update_hf_pin -> origin/desertfire/update_hf_pin 2025-03-14T05:31:37.1490390Z * [new branch] dev/joona/MPSNDArrayAdd -> origin/dev/joona/MPSNDArrayAdd 2025-03-14T05:31:37.1492236Z * [new branch] dev/joona/Unranked -> origin/dev/joona/Unranked 2025-03-14T05:31:37.1493967Z * [new branch] dev/joona/embeddingbag -> origin/dev/joona/embeddingbag 2025-03-14T05:31:37.1495783Z * [new branch] dev/joona/sdpa -> origin/dev/joona/sdpa 2025-03-14T05:31:37.1497676Z * [new branch] dev/joona/unique_leak -> origin/dev/joona/unique_leak 2025-03-14T05:31:37.1499492Z * [new branch] dev/joona/upsize3d -> origin/dev/joona/upsize3d 2025-03-14T05:31:37.1501274Z * [new branch] disable -> origin/disable 2025-03-14T05:31:37.1503276Z * [new branch] disable_fp_contract_baseline -> origin/disable_fp_contract_baseline 2025-03-14T05:31:37.1504696Z * [new branch] distributed_checkpointing_e2e_tests -> origin/distributed_checkpointing_e2e_tests 2025-03-14T05:31:37.1506466Z * [new branch] doc_change -> origin/doc_change 2025-03-14T05:31:37.1508189Z * [new branch] docs_numpy -> origin/docs_numpy 2025-03-14T05:31:37.1509955Z * [new branch] dropout-eval -> origin/dropout-eval 2025-03-14T05:31:37.1512358Z * [new branch] dt_alltoall -> origin/dt_alltoall 2025-03-14T05:31:37.1514108Z * [new branch] dynamorunner_mp -> origin/dynamorunner_mp 2025-03-14T05:31:37.1516000Z * [new branch] e2e-baseline -> origin/e2e-baseline 2025-03-14T05:31:37.1518365Z * [new branch] eikanwang/eager_torch_compile -> origin/eikanwang/eager_torch_compile 2025-03-14T05:31:37.1520753Z * [new branch] embg/test_inductor_ci_128B -> origin/embg/test_inductor_ci_128B 2025-03-14T05:31:37.1522442Z * [new branch] embg/test_inductor_ci_base -> origin/embg/test_inductor_ci_base 2025-03-14T05:31:37.1524211Z * [new branch] embg/test_inductor_ci_control -> origin/embg/test_inductor_ci_control 2025-03-14T05:31:37.1525858Z * [new branch] embg/triton_l2_prefetch_128B -> origin/embg/triton_l2_prefetch_128B 2025-03-14T05:31:37.1527766Z * [new branch] embg/triton_l2_prefetch_256B -> origin/embg/triton_l2_prefetch_256B 2025-03-14T05:31:37.1529550Z * [new branch] eqy-patch-1 -> 
origin/eqy-patch-1 2025-03-14T05:31:37.1531348Z * [new branch] eqy-patch-2 -> origin/eqy-patch-2 2025-03-14T05:31:37.1533148Z * [new branch] eqy-patch-20 -> origin/eqy-patch-20 2025-03-14T05:31:37.1534819Z * [new branch] eqy-patch-21 -> origin/eqy-patch-21 2025-03-14T05:31:37.1536540Z * [new branch] eqy-patch-26 -> origin/eqy-patch-26 2025-03-14T05:31:37.1538300Z * [new branch] eqy-patch-4 -> origin/eqy-patch-4 2025-03-14T05:31:37.1539999Z * [new branch] eqy-patch-5 -> origin/eqy-patch-5 2025-03-14T05:31:37.1541695Z * [new branch] eqy-patch-6 -> origin/eqy-patch-6 2025-03-14T05:31:37.1543536Z * [new branch] error-when-setattr-over-cls-attr -> origin/error-when-setattr-over-cls-attr 2025-03-14T05:31:37.1545169Z * [new branch] et_pin_bump -> origin/et_pin_bump 2025-03-14T05:31:37.1547785Z * [new branch] exclamaforte/aot-inductor-debug -> origin/exclamaforte/aot-inductor-debug 2025-03-14T05:31:37.1549438Z * [new branch] exclamaforte/aten-convolution-out -> origin/exclamaforte/aten-convolution-out 2025-03-14T05:31:37.1551242Z * [new branch] exclamaforte/combo-kernels-perf-run -> origin/exclamaforte/combo-kernels-perf-run 2025-03-14T05:31:37.1552520Z * [new branch] exclamaforte/delta -> origin/exclamaforte/delta 2025-03-14T05:31:37.1554701Z * [new branch] exclamaforte/dont-remove-feedback-functions -> origin/exclamaforte/dont-remove-feedback-functions 2025-03-14T05:31:37.1556138Z * [new branch] exclamaforte/dynamo-types -> origin/exclamaforte/dynamo-types 2025-03-14T05:31:37.1558451Z * [new branch] exclamaforte/enable-mem-dep-fusion -> origin/exclamaforte/enable-mem-dep-fusion 2025-03-14T05:31:37.1560527Z * [new branch] exclamaforte/fix-orig-svg -> origin/exclamaforte/fix-orig-svg 2025-03-14T05:31:37.1562428Z * [new branch] exclamaforte/fix-trace-parsing-fx-svg -> origin/exclamaforte/fix-trace-parsing-fx-svg 2025-03-14T05:31:37.1563983Z * [new branch] exclamaforte/force-pointwise-cat-perf-run -> origin/exclamaforte/force-pointwise-cat-perf-run 2025-03-14T05:31:37.1565903Z * [new branch] exclamaforte/fusion-data -> origin/exclamaforte/fusion-data 2025-03-14T05:31:37.1567318Z * [new branch] exclamaforte/heuristic-choices -> origin/exclamaforte/heuristic-choices 2025-03-14T05:31:37.1570122Z * [new branch] exclamaforte/heuristic-choices-2 -> origin/exclamaforte/heuristic-choices-2 2025-03-14T05:31:37.1571891Z * [new branch] exclamaforte/max-autotune-dtype-test -> origin/exclamaforte/max-autotune-dtype-test 2025-03-14T05:31:37.1573610Z * [new branch] exclamaforte/remove-desc-names -> origin/exclamaforte/remove-desc-names 2025-03-14T05:31:37.1575342Z * [new branch] exclamaforte/scheduler-refactor -> origin/exclamaforte/scheduler-refactor 2025-03-14T05:31:37.1577212Z * [new branch] exclamaforte/test_cpp_wrapper_mode -> origin/exclamaforte/test_cpp_wrapper_mode 2025-03-14T05:31:37.1578697Z * [new branch] exclamaforte/testing_only -> origin/exclamaforte/testing_only 2025-03-14T05:31:37.1580506Z * [new branch] exec -> origin/exec 2025-03-14T05:31:37.1582385Z * [new branch] experimental-mosaic -> origin/experimental-mosaic 2025-03-14T05:31:37.1584201Z * [new branch] export-D50544876 -> origin/export-D50544876 2025-03-14T05:31:37.1585937Z * [new branch] export-D51032385 -> origin/export-D51032385 2025-03-14T05:31:37.1588233Z * [new branch] export-D52434604 -> origin/export-D52434604 2025-03-14T05:31:37.1590132Z * [new branch] export-D58091437 -> origin/export-D58091437 2025-03-14T05:31:37.1591922Z * [new branch] export-D61047529 -> origin/export-D61047529 2025-03-14T05:31:37.1593710Z * [new branch] 
export-D61557220 -> origin/export-D61557220 2025-03-14T05:31:37.1595704Z * [new branch] export-D65638757 -> origin/export-D65638757 2025-03-14T05:31:37.1597413Z * [new branch] export-D66529288 -> origin/export-D66529288 2025-03-14T05:31:37.1599107Z * [new branch] export-D66690419 -> origin/export-D66690419 2025-03-14T05:31:37.1600820Z * [new branch] export-D66717302 -> origin/export-D66717302 2025-03-14T05:31:37.1602642Z * [new branch] export-D66908884 -> origin/export-D66908884 2025-03-14T05:31:37.1604373Z * [new branch] export-D68245292 -> origin/export-D68245292 2025-03-14T05:31:37.1606079Z * [new branch] export-D68909278 -> origin/export-D68909278 2025-03-14T05:31:37.1607828Z * [new branch] export-D69034578 -> origin/export-D69034578 2025-03-14T05:31:37.1609522Z * [new branch] export-D69355332 -> origin/export-D69355332 2025-03-14T05:31:37.1611282Z * [new branch] export-D69361235 -> origin/export-D69361235 2025-03-14T05:31:37.1613104Z * [new branch] export-D69592025 -> origin/export-D69592025 2025-03-14T05:31:37.1614886Z * [new branch] export-D69595327 -> origin/export-D69595327 2025-03-14T05:31:37.1616596Z * [new branch] export-D69994481 -> origin/export-D69994481 2025-03-14T05:31:37.1618319Z * [new branch] export-D70132269 -> origin/export-D70132269 2025-03-14T05:31:37.1620025Z * [new branch] export-D70141808 -> origin/export-D70141808 2025-03-14T05:31:37.1621810Z * [new branch] export-D70193972 -> origin/export-D70193972 2025-03-14T05:31:37.1623667Z * [new branch] export-D70454149 -> origin/export-D70454149 2025-03-14T05:31:37.1625356Z * [new branch] export-D71081192 -> origin/export-D71081192 2025-03-14T05:31:37.1627272Z * [new branch] exported-model-train-idempotent -> origin/exported-model-train-idempotent 2025-03-14T05:31:37.1628999Z * [new branch] fa_u8_brgemm -> origin/fa_u8_brgemm 2025-03-14T05:31:37.1630600Z * [new branch] fastmath_baseline -> origin/fastmath_baseline 2025-03-14T05:31:37.1633001Z * [new branch] fbcode/warm -> origin/fbcode/warm 2025-03-14T05:31:37.1634938Z * [new branch] fca -> origin/fca 2025-03-14T05:31:37.1636708Z * [new branch] fca2_ca5984c -> origin/fca2_ca5984c 2025-03-14T05:31:37.1651422Z * [new branch] fca3 -> origin/fca3 2025-03-14T05:31:37.1652052Z * [new branch] fca5 -> origin/fca5 2025-03-14T05:31:37.1652603Z * [new branch] fengyuan/external-proj -> origin/fengyuan/external-proj 2025-03-14T05:31:37.1653332Z * [new branch] fengyuan/out-of-tree-xpu-ops-improve-test -> origin/fengyuan/out-of-tree-xpu-ops-improve-test 2025-03-14T05:31:37.1654215Z * [new branch] fengyuan/out-of-tree-xpu-ops-remove-dtype -> origin/fengyuan/out-of-tree-xpu-ops-remove-dtype 2025-03-14T05:31:37.1654895Z * [new branch] fengyuan/test-xpu -> origin/fengyuan/test-xpu 2025-03-14T05:31:37.1655421Z * [new branch] ffast_math_baseline -> origin/ffast_math_baseline 2025-03-14T05:31:37.1655938Z * [new branch] ffast_math_target -> origin/ffast_math_target 2025-03-14T05:31:37.1656464Z * [new branch] findhao/base_commit -> origin/findhao/base_commit 2025-03-14T05:31:37.1657010Z * [new branch] findhao/base_commit1 -> origin/findhao/base_commit1 2025-03-14T05:31:37.1657606Z * [new branch] findhao/fix-indirect-access -> origin/findhao/fix-indirect-access 2025-03-14T05:31:37.1658770Z * [new branch] findhao/fix-triton-constexpr -> origin/findhao/fix-triton-constexpr 2025-03-14T05:31:37.1660504Z * [new branch] findhao/multistream2 -> origin/findhao/multistream2 2025-03-14T05:31:37.1662063Z * [new branch] findhao/multistream5 -> origin/findhao/multistream5 2025-03-14T05:31:37.1663675Z * [new 
branch] findhao/operatorbench3 -> origin/findhao/operatorbench3 2025-03-14T05:31:37.1665720Z * [new branch] findhao/operatorbench5 -> origin/findhao/operatorbench5 2025-03-14T05:31:37.1667382Z * [new branch] fix -> origin/fix 2025-03-14T05:31:37.1669527Z * [new branch] fix-benchmark-config-h100 -> origin/fix-benchmark-config-h100 2025-03-14T05:31:37.1671314Z * [new branch] fix-cat-lowering-uint8-hack -> origin/fix-cat-lowering-uint8-hack 2025-03-14T05:31:37.1672801Z * [new branch] fix-config-ignore -> origin/fix-config-ignore 2025-03-14T05:31:37.1674795Z * [new branch] fix-dict-guard -> origin/fix-dict-guard 2025-03-14T05:31:37.1676581Z * [new branch] fix-ios-upload-credentials -> origin/fix-ios-upload-credentials 2025-03-14T05:31:37.1678205Z * [new branch] fix-mem-leak -> origin/fix-mem-leak 2025-03-14T05:31:37.1680323Z * [new branch] fix-qat-derived-qspec -> origin/fix-qat-derived-qspec 2025-03-14T05:31:37.1682148Z * [new branch] fix-test-stat-upload-failures -> origin/fix-test-stat-upload-failures 2025-03-14T05:31:37.1683879Z * [new branch] fix_allow_train_eval_msg -> origin/fix_allow_train_eval_msg 2025-03-14T05:31:37.1685608Z * [new branch] fix_autotune_inplace_test -> origin/fix_autotune_inplace_test 2025-03-14T05:31:37.1687302Z * [new branch] fix_avoid_record_stream -> origin/fix_avoid_record_stream 2025-03-14T05:31:37.1689015Z * [new branch] fix_e2e_fsdp_tp_pairwise -> origin/fix_e2e_fsdp_tp_pairwise 2025-03-14T05:31:37.1690913Z * [new branch] fix_partial -> origin/fix_partial 2025-03-14T05:31:37.1692537Z * [new branch] fix_xpu_content_store -> origin/fix_xpu_content_store 2025-03-14T05:31:37.1694393Z * [new branch] fixes-triage -> origin/fixes-triage 2025-03-14T05:31:37.1696533Z * [new branch] flat_apply -> origin/flat_apply 2025-03-14T05:31:37.1698287Z * [new branch] flex_attention_functorch_grad -> origin/flex_attention_functorch_grad 2025-03-14T05:31:37.1700612Z * [new branch] fmassa/fix_all_gather_cost -> origin/fmassa/fix_all_gather_cost 2025-03-14T05:31:37.1702252Z * [new branch] fmassa/fix_memeff_sharding_rule -> origin/fmassa/fix_memeff_sharding_rule 2025-03-14T05:31:37.1704012Z * [new branch] fmassa/partitioner_knapsack_checkpoint -> origin/fmassa/partitioner_knapsack_checkpoint 2025-03-14T05:31:37.1705317Z * [new branch] fp8_fix -> origin/fp8_fix 2025-03-14T05:31:37.1707356Z * [new branch] fsdp2_trace_rules -> origin/fsdp2_trace_rules 2025-03-14T05:31:37.1709125Z * [new branch] fsdpv2_3d -> origin/fsdpv2_3d 2025-03-14T05:31:37.1710939Z * [new branch] fsdpv2_3d_m1 -> origin/fsdpv2_3d_m1 2025-03-14T05:31:37.1712658Z * [new branch] func-attr -> origin/func-attr 2025-03-14T05:31:37.1714548Z * [new branch] functorch_scan -> origin/functorch_scan 2025-03-14T05:31:37.1716356Z * [new branch] fx_cpp -> origin/fx_cpp 2025-03-14T05:31:37.1718686Z * [new branch] fy/fix-win -> origin/fy/fix-win 2025-03-14T05:31:37.1720362Z * [new branch] gelu-3 -> origin/gelu-3 2025-03-14T05:31:37.1722291Z * [new branch] get_state_dict_forward_fix -> origin/get_state_dict_forward_fix 2025-03-14T05:31:37.1725855Z * [new branch] gh/AlnisM/1/base -> origin/gh/AlnisM/1/base 2025-03-14T05:31:37.1727561Z * [new branch] gh/AlnisM/1/head -> origin/gh/AlnisM/1/head 2025-03-14T05:31:37.1730374Z * [new branch] gh/BowenBao/296/base -> origin/gh/BowenBao/296/base 2025-03-14T05:31:37.1732104Z * [new branch] gh/BowenBao/296/head -> origin/gh/BowenBao/296/head 2025-03-14T05:31:37.1733834Z * [new branch] gh/BowenBao/296/orig -> origin/gh/BowenBao/296/orig 2025-03-14T05:31:37.1736522Z * [new branch] gh/CaoE/46/base -> 
origin/gh/CaoE/46/base 2025-03-14T05:31:37.1738212Z * [new branch] gh/CaoE/46/head -> origin/gh/CaoE/46/head 2025-03-14T05:31:37.1739964Z * [new branch] gh/CaoE/46/orig -> origin/gh/CaoE/46/orig 2025-03-14T05:31:37.1742351Z * [new branch] gh/CaoE/47/base -> origin/gh/CaoE/47/base 2025-03-14T05:31:37.1744065Z * [new branch] gh/CaoE/47/head -> origin/gh/CaoE/47/head 2025-03-14T05:31:37.1745768Z * [new branch] gh/CaoE/47/orig -> origin/gh/CaoE/47/orig 2025-03-14T05:31:37.1748090Z * [new branch] gh/CaoE/48/base -> origin/gh/CaoE/48/base 2025-03-14T05:31:37.1749739Z * [new branch] gh/CaoE/48/head -> origin/gh/CaoE/48/head 2025-03-14T05:31:37.1751416Z * [new branch] gh/CaoE/48/orig -> origin/gh/CaoE/48/orig 2025-03-14T05:31:37.1753816Z * [new branch] gh/CaoE/49/base -> origin/gh/CaoE/49/base 2025-03-14T05:31:37.1755812Z * [new branch] gh/CaoE/49/head -> origin/gh/CaoE/49/head 2025-03-14T05:31:37.1757531Z * [new branch] gh/CaoE/49/orig -> origin/gh/CaoE/49/orig 2025-03-14T05:31:37.1759798Z * [new branch] gh/CaoE/50/base -> origin/gh/CaoE/50/base 2025-03-14T05:31:37.1761461Z * [new branch] gh/CaoE/50/head -> origin/gh/CaoE/50/head 2025-03-14T05:31:37.1763341Z * [new branch] gh/CaoE/50/orig -> origin/gh/CaoE/50/orig 2025-03-14T05:31:37.1765654Z * [new branch] gh/CaoE/51/base -> origin/gh/CaoE/51/base 2025-03-14T05:31:37.1767453Z * [new branch] gh/CaoE/51/head -> origin/gh/CaoE/51/head 2025-03-14T05:31:37.1769405Z * [new branch] gh/CaoE/51/orig -> origin/gh/CaoE/51/orig 2025-03-14T05:31:37.1772285Z * [new branch] gh/ColinPeppler/62/base -> origin/gh/ColinPeppler/62/base 2025-03-14T05:31:37.1774097Z * [new branch] gh/ColinPeppler/62/head -> origin/gh/ColinPeppler/62/head 2025-03-14T05:31:37.1775752Z * [new branch] gh/ColinPeppler/62/orig -> origin/gh/ColinPeppler/62/orig 2025-03-14T05:31:37.1778567Z * [new branch] gh/EikanWang/67/base -> origin/gh/EikanWang/67/base 2025-03-14T05:31:37.1780253Z * [new branch] gh/EikanWang/67/head -> origin/gh/EikanWang/67/head 2025-03-14T05:31:37.1782604Z * [new branch] gh/EikanWang/75/base -> origin/gh/EikanWang/75/base 2025-03-14T05:31:37.1784303Z * [new branch] gh/EikanWang/75/head -> origin/gh/EikanWang/75/head 2025-03-14T05:31:37.1786011Z * [new branch] gh/EikanWang/75/orig -> origin/gh/EikanWang/75/orig 2025-03-14T05:31:37.1788391Z * [new branch] gh/EikanWang/76/base -> origin/gh/EikanWang/76/base 2025-03-14T05:31:37.1790062Z * [new branch] gh/EikanWang/76/head -> origin/gh/EikanWang/76/head 2025-03-14T05:31:37.1791764Z * [new branch] gh/EikanWang/76/orig -> origin/gh/EikanWang/76/orig 2025-03-14T05:31:37.1794115Z * [new branch] gh/EikanWang/77/base -> origin/gh/EikanWang/77/base 2025-03-14T05:31:37.1795971Z * [new branch] gh/EikanWang/77/head -> origin/gh/EikanWang/77/head 2025-03-14T05:31:37.1797697Z * [new branch] gh/EikanWang/77/orig -> origin/gh/EikanWang/77/orig 2025-03-14T05:31:37.1799949Z * [new branch] gh/EikanWang/78/base -> origin/gh/EikanWang/78/base 2025-03-14T05:31:37.1801592Z * [new branch] gh/EikanWang/78/head -> origin/gh/EikanWang/78/head 2025-03-14T05:31:37.1803265Z * [new branch] gh/EikanWang/78/orig -> origin/gh/EikanWang/78/orig 2025-03-14T05:31:37.1806416Z * [new branch] gh/Gasoonjia/1/base -> origin/gh/Gasoonjia/1/base 2025-03-14T05:31:37.1808095Z * [new branch] gh/Gasoonjia/1/head -> origin/gh/Gasoonjia/1/head 2025-03-14T05:31:37.1811006Z * [new branch] gh/H-Huang/131/base -> origin/gh/H-Huang/131/base 2025-03-14T05:31:37.1812748Z * [new branch] gh/H-Huang/131/head -> origin/gh/H-Huang/131/head 2025-03-14T05:31:37.1814570Z * [new branch] 
gh/H-Huang/131/orig -> origin/gh/H-Huang/131/orig 2025-03-14T05:31:37.1816836Z * [new branch] gh/H-Huang/132/base -> origin/gh/H-Huang/132/base 2025-03-14T05:31:37.1818979Z * [new branch] gh/H-Huang/132/head -> origin/gh/H-Huang/132/head 2025-03-14T05:31:37.1820668Z * [new branch] gh/H-Huang/132/orig -> origin/gh/H-Huang/132/orig 2025-03-14T05:31:37.1823087Z * [new branch] gh/H-Huang/160/base -> origin/gh/H-Huang/160/base 2025-03-14T05:31:37.1824735Z * [new branch] gh/H-Huang/160/head -> origin/gh/H-Huang/160/head 2025-03-14T05:31:37.1826376Z * [new branch] gh/H-Huang/160/orig -> origin/gh/H-Huang/160/orig 2025-03-14T05:31:37.1828803Z * [new branch] gh/H-Huang/167/base -> origin/gh/H-Huang/167/base 2025-03-14T05:31:37.1830552Z * [new branch] gh/H-Huang/167/head -> origin/gh/H-Huang/167/head 2025-03-14T05:31:37.1832300Z * [new branch] gh/H-Huang/167/orig -> origin/gh/H-Huang/167/orig 2025-03-14T05:31:37.1834974Z * [new branch] gh/H-Huang/168/base -> origin/gh/H-Huang/168/base 2025-03-14T05:31:37.1836536Z * [new branch] gh/H-Huang/168/head -> origin/gh/H-Huang/168/head 2025-03-14T05:31:37.1838268Z * [new branch] gh/H-Huang/168/orig -> origin/gh/H-Huang/168/orig 2025-03-14T05:31:37.1840407Z * [new branch] gh/H-Huang/169/base -> origin/gh/H-Huang/169/base 2025-03-14T05:31:37.1842119Z * [new branch] gh/H-Huang/169/head -> origin/gh/H-Huang/169/head 2025-03-14T05:31:37.1843789Z * [new branch] gh/H-Huang/169/orig -> origin/gh/H-Huang/169/orig 2025-03-14T05:31:37.1846164Z * [new branch] gh/H-Huang/170/base -> origin/gh/H-Huang/170/base 2025-03-14T05:31:37.1847875Z * [new branch] gh/H-Huang/170/head -> origin/gh/H-Huang/170/head 2025-03-14T05:31:37.1849528Z * [new branch] gh/H-Huang/170/orig -> origin/gh/H-Huang/170/orig 2025-03-14T05:31:37.1851830Z * [new branch] gh/H-Huang/171/base -> origin/gh/H-Huang/171/base 2025-03-14T05:31:37.1853511Z * [new branch] gh/H-Huang/171/head -> origin/gh/H-Huang/171/head 2025-03-14T05:31:37.1855193Z * [new branch] gh/H-Huang/171/orig -> origin/gh/H-Huang/171/orig 2025-03-14T05:31:37.1858317Z * [new branch] gh/HDCharles/168/base -> origin/gh/HDCharles/168/base 2025-03-14T05:31:37.1860072Z * [new branch] gh/HDCharles/168/head -> origin/gh/HDCharles/168/head 2025-03-14T05:31:37.1861910Z * [new branch] gh/HDCharles/168/orig -> origin/gh/HDCharles/168/orig 2025-03-14T05:31:37.1864627Z * [new branch] gh/IvanKobzarev/100/base -> origin/gh/IvanKobzarev/100/base 2025-03-14T05:31:37.1866304Z * [new branch] gh/IvanKobzarev/100/head -> origin/gh/IvanKobzarev/100/head 2025-03-14T05:31:37.1868252Z * [new branch] gh/IvanKobzarev/100/orig -> origin/gh/IvanKobzarev/100/orig 2025-03-14T05:31:37.1870796Z * [new branch] gh/IvanKobzarev/101/base -> origin/gh/IvanKobzarev/101/base 2025-03-14T05:31:37.1872501Z * [new branch] gh/IvanKobzarev/101/head -> origin/gh/IvanKobzarev/101/head 2025-03-14T05:31:37.1874268Z * [new branch] gh/IvanKobzarev/101/orig -> origin/gh/IvanKobzarev/101/orig 2025-03-14T05:31:37.1877046Z * [new branch] gh/IvanKobzarev/102/base -> origin/gh/IvanKobzarev/102/base 2025-03-14T05:31:37.1878746Z * [new branch] gh/IvanKobzarev/102/head -> origin/gh/IvanKobzarev/102/head 2025-03-14T05:31:37.1880501Z * [new branch] gh/IvanKobzarev/102/orig -> origin/gh/IvanKobzarev/102/orig 2025-03-14T05:31:37.1882780Z * [new branch] gh/IvanKobzarev/103/base -> origin/gh/IvanKobzarev/103/base 2025-03-14T05:31:37.1884620Z * [new branch] gh/IvanKobzarev/103/head -> origin/gh/IvanKobzarev/103/head 2025-03-14T05:31:37.1886361Z * [new branch] gh/IvanKobzarev/103/orig -> 
origin/gh/IvanKobzarev/103/orig 2025-03-14T05:31:37.1888656Z * [new branch] gh/IvanKobzarev/104/base -> origin/gh/IvanKobzarev/104/base 2025-03-14T05:31:37.1890447Z * [new branch] gh/IvanKobzarev/104/head -> origin/gh/IvanKobzarev/104/head 2025-03-14T05:31:37.1892114Z * [new branch] gh/IvanKobzarev/104/orig -> origin/gh/IvanKobzarev/104/orig 2025-03-14T05:31:37.1894664Z * [new branch] gh/IvanKobzarev/105/base -> origin/gh/IvanKobzarev/105/base 2025-03-14T05:31:37.1896357Z * [new branch] gh/IvanKobzarev/105/head -> origin/gh/IvanKobzarev/105/head 2025-03-14T05:31:37.1897962Z * [new branch] gh/IvanKobzarev/105/orig -> origin/gh/IvanKobzarev/105/orig 2025-03-14T05:31:37.1900458Z * [new branch] gh/IvanKobzarev/56/base -> origin/gh/IvanKobzarev/56/base 2025-03-14T05:31:37.1902533Z * [new branch] gh/IvanKobzarev/56/head -> origin/gh/IvanKobzarev/56/head 2025-03-14T05:31:37.1904109Z * [new branch] gh/IvanKobzarev/56/orig -> origin/gh/IvanKobzarev/56/orig 2025-03-14T05:31:37.1906459Z * [new branch] gh/IvanKobzarev/64/base -> origin/gh/IvanKobzarev/64/base 2025-03-14T05:31:37.1908163Z * [new branch] gh/IvanKobzarev/64/head -> origin/gh/IvanKobzarev/64/head 2025-03-14T05:31:37.1909877Z * [new branch] gh/IvanKobzarev/64/orig -> origin/gh/IvanKobzarev/64/orig 2025-03-14T05:31:37.1912211Z * [new branch] gh/IvanKobzarev/86/base -> origin/gh/IvanKobzarev/86/base 2025-03-14T05:31:37.1913930Z * [new branch] gh/IvanKobzarev/86/head -> origin/gh/IvanKobzarev/86/head 2025-03-14T05:31:37.1915810Z * [new branch] gh/IvanKobzarev/86/orig -> origin/gh/IvanKobzarev/86/orig 2025-03-14T05:31:37.1918157Z * [new branch] gh/IvanKobzarev/91/base -> origin/gh/IvanKobzarev/91/base 2025-03-14T05:31:37.1919842Z * [new branch] gh/IvanKobzarev/91/head -> origin/gh/IvanKobzarev/91/head 2025-03-14T05:31:37.1921502Z * [new branch] gh/IvanKobzarev/91/orig -> origin/gh/IvanKobzarev/91/orig 2025-03-14T05:31:37.1924021Z * [new branch] gh/IvanKobzarev/92/base -> origin/gh/IvanKobzarev/92/base 2025-03-14T05:31:37.1926063Z * [new branch] gh/IvanKobzarev/92/head -> origin/gh/IvanKobzarev/92/head 2025-03-14T05:31:37.1927747Z * [new branch] gh/IvanKobzarev/92/orig -> origin/gh/IvanKobzarev/92/orig 2025-03-14T05:31:37.1930282Z * [new branch] gh/IvanKobzarev/93/base -> origin/gh/IvanKobzarev/93/base 2025-03-14T05:31:37.1932028Z * [new branch] gh/IvanKobzarev/93/head -> origin/gh/IvanKobzarev/93/head 2025-03-14T05:31:37.1933754Z * [new branch] gh/IvanKobzarev/93/orig -> origin/gh/IvanKobzarev/93/orig 2025-03-14T05:31:37.1936286Z * [new branch] gh/IvanKobzarev/94/base -> origin/gh/IvanKobzarev/94/base 2025-03-14T05:31:37.1937928Z * [new branch] gh/IvanKobzarev/94/head -> origin/gh/IvanKobzarev/94/head 2025-03-14T05:31:37.1939650Z * [new branch] gh/IvanKobzarev/94/orig -> origin/gh/IvanKobzarev/94/orig 2025-03-14T05:31:37.1942131Z * [new branch] gh/IvanKobzarev/98/base -> origin/gh/IvanKobzarev/98/base 2025-03-14T05:31:37.1943782Z * [new branch] gh/IvanKobzarev/98/head -> origin/gh/IvanKobzarev/98/head 2025-03-14T05:31:37.1945516Z * [new branch] gh/IvanKobzarev/98/orig -> origin/gh/IvanKobzarev/98/orig 2025-03-14T05:31:37.1948374Z * [new branch] gh/Lezcano/243/base -> origin/gh/Lezcano/243/base 2025-03-14T05:31:37.1950161Z * [new branch] gh/Lezcano/243/head -> origin/gh/Lezcano/243/head 2025-03-14T05:31:37.1952028Z * [new branch] gh/Lezcano/243/orig -> origin/gh/Lezcano/243/orig 2025-03-14T05:31:37.1955042Z * [new branch] gh/SS-JIA/164/base -> origin/gh/SS-JIA/164/base 2025-03-14T05:31:37.1956703Z * [new branch] gh/SS-JIA/164/head -> 
origin/gh/SS-JIA/164/head 2025-03-14T05:31:37.1958973Z * [new branch] gh/SS-JIA/172/base -> origin/gh/SS-JIA/172/base 2025-03-14T05:31:37.1960775Z * [new branch] gh/SS-JIA/172/head -> origin/gh/SS-JIA/172/head 2025-03-14T05:31:37.1962482Z * [new branch] gh/SS-JIA/172/orig -> origin/gh/SS-JIA/172/orig 2025-03-14T05:31:37.1965886Z * [new branch] gh/SamGinzburg/11/base -> origin/gh/SamGinzburg/11/base 2025-03-14T05:31:37.1967590Z * [new branch] gh/SamGinzburg/11/head -> origin/gh/SamGinzburg/11/head 2025-03-14T05:31:37.1970929Z * [new branch] gh/StrongerXi/1/base -> origin/gh/StrongerXi/1/base 2025-03-14T05:31:37.1972610Z * [new branch] gh/StrongerXi/1/head -> origin/gh/StrongerXi/1/head 2025-03-14T05:31:37.1975154Z * [new branch] gh/StrongerXi/63/base -> origin/gh/StrongerXi/63/base 2025-03-14T05:31:37.1976555Z * [new branch] gh/StrongerXi/63/head -> origin/gh/StrongerXi/63/head 2025-03-14T05:31:37.1978385Z * [new branch] gh/StrongerXi/63/orig -> origin/gh/StrongerXi/63/orig 2025-03-14T05:31:37.1980664Z * [new branch] gh/StrongerXi/67/base -> origin/gh/StrongerXi/67/base 2025-03-14T05:31:37.1982367Z * [new branch] gh/StrongerXi/67/head -> origin/gh/StrongerXi/67/head 2025-03-14T05:31:37.1984057Z * [new branch] gh/StrongerXi/67/orig -> origin/gh/StrongerXi/67/orig 2025-03-14T05:31:37.1986693Z * [new branch] gh/StrongerXi/71/base -> origin/gh/StrongerXi/71/base 2025-03-14T05:31:37.1988339Z * [new branch] gh/StrongerXi/71/head -> origin/gh/StrongerXi/71/head 2025-03-14T05:31:37.1990561Z * [new branch] gh/StrongerXi/72/base -> origin/gh/StrongerXi/72/base 2025-03-14T05:31:37.1992291Z * [new branch] gh/StrongerXi/72/head -> origin/gh/StrongerXi/72/head 2025-03-14T05:31:37.1994725Z * [new branch] gh/StrongerXi/81/base -> origin/gh/StrongerXi/81/base 2025-03-14T05:31:37.1996420Z * [new branch] gh/StrongerXi/81/head -> origin/gh/StrongerXi/81/head 2025-03-14T05:31:37.1998293Z * [new branch] gh/StrongerXi/81/orig -> origin/gh/StrongerXi/81/orig 2025-03-14T05:31:37.2000547Z * [new branch] gh/StrongerXi/82/base -> origin/gh/StrongerXi/82/base 2025-03-14T05:31:37.2002228Z * [new branch] gh/StrongerXi/82/head -> origin/gh/StrongerXi/82/head 2025-03-14T05:31:37.2003962Z * [new branch] gh/StrongerXi/82/orig -> origin/gh/StrongerXi/82/orig 2025-03-14T05:31:37.2006264Z * [new branch] gh/StrongerXi/83/base -> origin/gh/StrongerXi/83/base 2025-03-14T05:31:37.2008008Z * [new branch] gh/StrongerXi/83/head -> origin/gh/StrongerXi/83/head 2025-03-14T05:31:37.2009744Z * [new branch] gh/StrongerXi/83/orig -> origin/gh/StrongerXi/83/orig 2025-03-14T05:31:37.2012034Z * [new branch] gh/StrongerXi/84/base -> origin/gh/StrongerXi/84/base 2025-03-14T05:31:37.2013734Z * [new branch] gh/StrongerXi/84/head -> origin/gh/StrongerXi/84/head 2025-03-14T05:31:37.2015595Z * [new branch] gh/StrongerXi/84/orig -> origin/gh/StrongerXi/84/orig 2025-03-14T05:31:37.2017955Z * [new branch] gh/StrongerXi/85/base -> origin/gh/StrongerXi/85/base 2025-03-14T05:31:37.2019443Z * [new branch] gh/StrongerXi/85/head -> origin/gh/StrongerXi/85/head 2025-03-14T05:31:37.2021376Z * [new branch] gh/StrongerXi/85/orig -> origin/gh/StrongerXi/85/orig 2025-03-14T05:31:37.2023657Z * [new branch] gh/StrongerXi/86/base -> origin/gh/StrongerXi/86/base 2025-03-14T05:31:37.2026420Z * [new branch] gh/StrongerXi/86/head -> origin/gh/StrongerXi/86/head 2025-03-14T05:31:37.2027049Z * [new branch] gh/StrongerXi/86/orig -> origin/gh/StrongerXi/86/orig 2025-03-14T05:31:37.2029508Z * [new branch] gh/StrongerXi/87/base -> origin/gh/StrongerXi/87/base 
2025-03-14T05:31:37.2031179Z * [new branch] gh/StrongerXi/87/head -> origin/gh/StrongerXi/87/head 2025-03-14T05:31:37.2032834Z * [new branch] gh/StrongerXi/87/orig -> origin/gh/StrongerXi/87/orig 2025-03-14T05:31:37.2035223Z * [new branch] gh/StrongerXi/88/base -> origin/gh/StrongerXi/88/base 2025-03-14T05:31:37.2036965Z * [new branch] gh/StrongerXi/88/head -> origin/gh/StrongerXi/88/head 2025-03-14T05:31:37.2038769Z * [new branch] gh/StrongerXi/88/orig -> origin/gh/StrongerXi/88/orig 2025-03-14T05:31:37.2041215Z * [new branch] gh/StrongerXi/89/base -> origin/gh/StrongerXi/89/base 2025-03-14T05:31:37.2042791Z * [new branch] gh/StrongerXi/89/head -> origin/gh/StrongerXi/89/head 2025-03-14T05:31:37.2044548Z * [new branch] gh/StrongerXi/89/orig -> origin/gh/StrongerXi/89/orig 2025-03-14T05:31:37.2046849Z * [new branch] gh/StrongerXi/90/base -> origin/gh/StrongerXi/90/base 2025-03-14T05:31:37.2048559Z * [new branch] gh/StrongerXi/90/head -> origin/gh/StrongerXi/90/head 2025-03-14T05:31:37.2050351Z * [new branch] gh/StrongerXi/90/orig -> origin/gh/StrongerXi/90/orig 2025-03-14T05:31:37.2052612Z * [new branch] gh/StrongerXi/91/base -> origin/gh/StrongerXi/91/base 2025-03-14T05:31:37.2054322Z * [new branch] gh/StrongerXi/91/head -> origin/gh/StrongerXi/91/head 2025-03-14T05:31:37.2055991Z * [new branch] gh/StrongerXi/91/orig -> origin/gh/StrongerXi/91/orig 2025-03-14T05:31:37.2058318Z * [new branch] gh/StrongerXi/92/base -> origin/gh/StrongerXi/92/base 2025-03-14T05:31:37.2060030Z * [new branch] gh/StrongerXi/92/head -> origin/gh/StrongerXi/92/head 2025-03-14T05:31:37.2061769Z * [new branch] gh/StrongerXi/92/orig -> origin/gh/StrongerXi/92/orig 2025-03-14T05:31:37.2064139Z * [new branch] gh/StrongerXi/93/base -> origin/gh/StrongerXi/93/base 2025-03-14T05:31:37.2065788Z * [new branch] gh/StrongerXi/93/head -> origin/gh/StrongerXi/93/head 2025-03-14T05:31:37.2067501Z * [new branch] gh/StrongerXi/93/orig -> origin/gh/StrongerXi/93/orig 2025-03-14T05:31:37.2072284Z * [new branch] gh/Xia-Weiwen/28/base -> origin/gh/Xia-Weiwen/28/base 2025-03-14T05:31:37.2074029Z * [new branch] gh/Xia-Weiwen/28/head -> origin/gh/Xia-Weiwen/28/head 2025-03-14T05:31:37.2075776Z * [new branch] gh/Xia-Weiwen/28/orig -> origin/gh/Xia-Weiwen/28/orig 2025-03-14T05:31:37.2078079Z * [new branch] gh/Xia-Weiwen/29/base -> origin/gh/Xia-Weiwen/29/base 2025-03-14T05:31:37.2079788Z * [new branch] gh/Xia-Weiwen/29/head -> origin/gh/Xia-Weiwen/29/head 2025-03-14T05:31:37.2081418Z * [new branch] gh/Xia-Weiwen/29/orig -> origin/gh/Xia-Weiwen/29/orig 2025-03-14T05:31:37.2083748Z * [new branch] gh/Xia-Weiwen/30/base -> origin/gh/Xia-Weiwen/30/base 2025-03-14T05:31:37.2085509Z * [new branch] gh/Xia-Weiwen/30/head -> origin/gh/Xia-Weiwen/30/head 2025-03-14T05:31:37.2087179Z * [new branch] gh/Xia-Weiwen/30/orig -> origin/gh/Xia-Weiwen/30/orig 2025-03-14T05:31:37.2089581Z * [new branch] gh/Xia-Weiwen/31/base -> origin/gh/Xia-Weiwen/31/base 2025-03-14T05:31:37.2091285Z * [new branch] gh/Xia-Weiwen/31/head -> origin/gh/Xia-Weiwen/31/head 2025-03-14T05:31:37.2093036Z * [new branch] gh/Xia-Weiwen/31/orig -> origin/gh/Xia-Weiwen/31/orig 2025-03-14T05:31:37.2096171Z * [new branch] gh/XilunWu/110/base -> origin/gh/XilunWu/110/base 2025-03-14T05:31:37.2097836Z * [new branch] gh/XilunWu/110/head -> origin/gh/XilunWu/110/head 2025-03-14T05:31:37.2099551Z * [new branch] gh/XilunWu/110/orig -> origin/gh/XilunWu/110/orig 2025-03-14T05:31:37.2101986Z * [new branch] gh/XilunWu/114/base -> origin/gh/XilunWu/114/base 2025-03-14T05:31:37.2103740Z * [new 
branch] gh/XilunWu/114/head -> origin/gh/XilunWu/114/head 2025-03-14T05:31:37.2105359Z * [new branch] gh/XilunWu/114/orig -> origin/gh/XilunWu/114/orig 2025-03-14T05:31:37.2107706Z * [new branch] gh/XilunWu/115/base -> origin/gh/XilunWu/115/base 2025-03-14T05:31:37.2109449Z * [new branch] gh/XilunWu/115/head -> origin/gh/XilunWu/115/head 2025-03-14T05:31:37.2111326Z * [new branch] gh/XilunWu/115/orig -> origin/gh/XilunWu/115/orig 2025-03-14T05:31:37.2113302Z * [new branch] gh/XilunWu/116/base -> origin/gh/XilunWu/116/base 2025-03-14T05:31:37.2115095Z * [new branch] gh/XilunWu/116/head -> origin/gh/XilunWu/116/head 2025-03-14T05:31:37.2116887Z * [new branch] gh/XilunWu/116/orig -> origin/gh/XilunWu/116/orig 2025-03-14T05:31:37.2119016Z * [new branch] gh/XilunWu/117/base -> origin/gh/XilunWu/117/base 2025-03-14T05:31:37.2120672Z * [new branch] gh/XilunWu/117/head -> origin/gh/XilunWu/117/head 2025-03-14T05:31:37.2122357Z * [new branch] gh/XilunWu/117/orig -> origin/gh/XilunWu/117/orig 2025-03-14T05:31:37.2124602Z * [new branch] gh/XilunWu/118/base -> origin/gh/XilunWu/118/base 2025-03-14T05:31:37.2126203Z * [new branch] gh/XilunWu/118/head -> origin/gh/XilunWu/118/head 2025-03-14T05:31:37.2127834Z * [new branch] gh/XilunWu/118/orig -> origin/gh/XilunWu/118/orig 2025-03-14T05:31:37.2130114Z * [new branch] gh/XilunWu/119/base -> origin/gh/XilunWu/119/base 2025-03-14T05:31:37.2131779Z * [new branch] gh/XilunWu/119/head -> origin/gh/XilunWu/119/head 2025-03-14T05:31:37.2133444Z * [new branch] gh/XilunWu/119/orig -> origin/gh/XilunWu/119/orig 2025-03-14T05:31:37.2135820Z * [new branch] gh/XilunWu/120/base -> origin/gh/XilunWu/120/base 2025-03-14T05:31:37.2137631Z * [new branch] gh/XilunWu/120/head -> origin/gh/XilunWu/120/head 2025-03-14T05:31:37.2139310Z * [new branch] gh/XilunWu/120/orig -> origin/gh/XilunWu/120/orig 2025-03-14T05:31:37.2141505Z * [new branch] gh/XilunWu/121/base -> origin/gh/XilunWu/121/base 2025-03-14T05:31:37.2143249Z * [new branch] gh/XilunWu/121/head -> origin/gh/XilunWu/121/head 2025-03-14T05:31:37.2145013Z * [new branch] gh/XilunWu/121/orig -> origin/gh/XilunWu/121/orig 2025-03-14T05:31:37.2147182Z * [new branch] gh/XilunWu/122/base -> origin/gh/XilunWu/122/base 2025-03-14T05:31:37.2148920Z * [new branch] gh/XilunWu/122/head -> origin/gh/XilunWu/122/head 2025-03-14T05:31:37.2150614Z * [new branch] gh/XilunWu/122/orig -> origin/gh/XilunWu/122/orig 2025-03-14T05:31:37.2153182Z * [new branch] gh/XilunWu/123/base -> origin/gh/XilunWu/123/base 2025-03-14T05:31:37.2154887Z * [new branch] gh/XilunWu/123/head -> origin/gh/XilunWu/123/head 2025-03-14T05:31:37.2156579Z * [new branch] gh/XilunWu/123/orig -> origin/gh/XilunWu/123/orig 2025-03-14T05:31:37.2158927Z * [new branch] gh/XilunWu/124/base -> origin/gh/XilunWu/124/base 2025-03-14T05:31:37.2160553Z * [new branch] gh/XilunWu/124/head -> origin/gh/XilunWu/124/head 2025-03-14T05:31:37.2162336Z * [new branch] gh/XilunWu/124/orig -> origin/gh/XilunWu/124/orig 2025-03-14T05:31:37.2164681Z * [new branch] gh/XilunWu/125/base -> origin/gh/XilunWu/125/base 2025-03-14T05:31:37.2166427Z * [new branch] gh/XilunWu/125/head -> origin/gh/XilunWu/125/head 2025-03-14T05:31:37.2168175Z * [new branch] gh/XilunWu/125/orig -> origin/gh/XilunWu/125/orig 2025-03-14T05:31:37.2171281Z * [new branch] gh/XuehaiPan/1/base -> origin/gh/XuehaiPan/1/base 2025-03-14T05:31:37.2172990Z * [new branch] gh/XuehaiPan/1/head -> origin/gh/XuehaiPan/1/head 2025-03-14T05:31:37.2174776Z * [new branch] gh/XuehaiPan/1/orig -> origin/gh/XuehaiPan/1/orig 
2025-03-14T05:31:37.2177201Z * [new branch] gh/XuehaiPan/105/base -> origin/gh/XuehaiPan/105/base 2025-03-14T05:31:37.2178868Z * [new branch] gh/XuehaiPan/105/head -> origin/gh/XuehaiPan/105/head 2025-03-14T05:31:37.2180826Z * [new branch] gh/XuehaiPan/105/orig -> origin/gh/XuehaiPan/105/orig 2025-03-14T05:31:37.2183032Z * [new branch] gh/XuehaiPan/108/base -> origin/gh/XuehaiPan/108/base 2025-03-14T05:31:37.2184744Z * [new branch] gh/XuehaiPan/108/head -> origin/gh/XuehaiPan/108/head 2025-03-14T05:31:37.2186474Z * [new branch] gh/XuehaiPan/108/orig -> origin/gh/XuehaiPan/108/orig 2025-03-14T05:31:37.2188788Z * [new branch] gh/XuehaiPan/109/base -> origin/gh/XuehaiPan/109/base 2025-03-14T05:31:37.2190524Z * [new branch] gh/XuehaiPan/109/head -> origin/gh/XuehaiPan/109/head 2025-03-14T05:31:37.2192219Z * [new branch] gh/XuehaiPan/109/orig -> origin/gh/XuehaiPan/109/orig 2025-03-14T05:31:37.2194762Z * [new branch] gh/XuehaiPan/13/base -> origin/gh/XuehaiPan/13/base 2025-03-14T05:31:37.2196570Z * [new branch] gh/XuehaiPan/13/head -> origin/gh/XuehaiPan/13/head 2025-03-14T05:31:37.2198331Z * [new branch] gh/XuehaiPan/13/orig -> origin/gh/XuehaiPan/13/orig 2025-03-14T05:31:37.2200682Z * [new branch] gh/XuehaiPan/14/base -> origin/gh/XuehaiPan/14/base 2025-03-14T05:31:37.2202305Z * [new branch] gh/XuehaiPan/14/head -> origin/gh/XuehaiPan/14/head 2025-03-14T05:31:37.2203979Z * [new branch] gh/XuehaiPan/14/orig -> origin/gh/XuehaiPan/14/orig 2025-03-14T05:31:37.2206223Z * [new branch] gh/XuehaiPan/179/base -> origin/gh/XuehaiPan/179/base 2025-03-14T05:31:37.2207847Z * [new branch] gh/XuehaiPan/179/head -> origin/gh/XuehaiPan/179/head 2025-03-14T05:31:37.2209583Z * [new branch] gh/XuehaiPan/179/orig -> origin/gh/XuehaiPan/179/orig 2025-03-14T05:31:37.2211846Z * [new branch] gh/XuehaiPan/180/base -> origin/gh/XuehaiPan/180/base 2025-03-14T05:31:37.2213496Z * [new branch] gh/XuehaiPan/180/head -> origin/gh/XuehaiPan/180/head 2025-03-14T05:31:37.2215130Z * [new branch] gh/XuehaiPan/180/orig -> origin/gh/XuehaiPan/180/orig 2025-03-14T05:31:37.2217404Z * [new branch] gh/XuehaiPan/182/base -> origin/gh/XuehaiPan/182/base 2025-03-14T05:31:37.2219006Z * [new branch] gh/XuehaiPan/182/head -> origin/gh/XuehaiPan/182/head 2025-03-14T05:31:37.2220655Z * [new branch] gh/XuehaiPan/182/orig -> origin/gh/XuehaiPan/182/orig 2025-03-14T05:31:37.2223020Z * [new branch] gh/XuehaiPan/183/base -> origin/gh/XuehaiPan/183/base 2025-03-14T05:31:37.2224705Z * [new branch] gh/XuehaiPan/183/head -> origin/gh/XuehaiPan/183/head 2025-03-14T05:31:37.2226360Z * [new branch] gh/XuehaiPan/183/orig -> origin/gh/XuehaiPan/183/orig 2025-03-14T05:31:37.2228672Z * [new branch] gh/XuehaiPan/185/base -> origin/gh/XuehaiPan/185/base 2025-03-14T05:31:37.2230300Z * [new branch] gh/XuehaiPan/185/head -> origin/gh/XuehaiPan/185/head 2025-03-14T05:31:37.2231936Z * [new branch] gh/XuehaiPan/185/orig -> origin/gh/XuehaiPan/185/orig 2025-03-14T05:31:37.2234395Z * [new branch] gh/XuehaiPan/188/base -> origin/gh/XuehaiPan/188/base 2025-03-14T05:31:37.2236068Z * [new branch] gh/XuehaiPan/188/head -> origin/gh/XuehaiPan/188/head 2025-03-14T05:31:37.2237797Z * [new branch] gh/XuehaiPan/188/orig -> origin/gh/XuehaiPan/188/orig 2025-03-14T05:31:37.2240017Z * [new branch] gh/XuehaiPan/189/base -> origin/gh/XuehaiPan/189/base 2025-03-14T05:31:37.2241623Z * [new branch] gh/XuehaiPan/189/head -> origin/gh/XuehaiPan/189/head 2025-03-14T05:31:37.2243299Z * [new branch] gh/XuehaiPan/189/orig -> origin/gh/XuehaiPan/189/orig 2025-03-14T05:31:37.2245845Z * [new 
branch] gh/XuehaiPan/210/base -> origin/gh/XuehaiPan/210/base 2025-03-14T05:31:37.2247391Z * [new branch] gh/XuehaiPan/210/head -> origin/gh/XuehaiPan/210/head 2025-03-14T05:31:37.2249047Z * [new branch] gh/XuehaiPan/210/orig -> origin/gh/XuehaiPan/210/orig 2025-03-14T05:31:37.2251351Z * [new branch] gh/XuehaiPan/211/base -> origin/gh/XuehaiPan/211/base 2025-03-14T05:31:37.2253059Z * [new branch] gh/XuehaiPan/211/head -> origin/gh/XuehaiPan/211/head 2025-03-14T05:31:37.2254803Z * [new branch] gh/XuehaiPan/211/orig -> origin/gh/XuehaiPan/211/orig 2025-03-14T05:31:37.2257051Z * [new branch] gh/XuehaiPan/217/base -> origin/gh/XuehaiPan/217/base 2025-03-14T05:31:37.2258707Z * [new branch] gh/XuehaiPan/217/head -> origin/gh/XuehaiPan/217/head 2025-03-14T05:31:37.2260340Z * [new branch] gh/XuehaiPan/217/orig -> origin/gh/XuehaiPan/217/orig 2025-03-14T05:31:37.2262568Z * [new branch] gh/XuehaiPan/218/base -> origin/gh/XuehaiPan/218/base 2025-03-14T05:31:37.2264211Z * [new branch] gh/XuehaiPan/218/head -> origin/gh/XuehaiPan/218/head 2025-03-14T05:31:37.2265950Z * [new branch] gh/XuehaiPan/218/orig -> origin/gh/XuehaiPan/218/orig 2025-03-14T05:31:37.2269100Z * [new branch] gh/XuehaiPan/219/base -> origin/gh/XuehaiPan/219/base 2025-03-14T05:31:37.2270861Z * [new branch] gh/XuehaiPan/219/head -> origin/gh/XuehaiPan/219/head 2025-03-14T05:31:37.2272523Z * [new branch] gh/XuehaiPan/219/orig -> origin/gh/XuehaiPan/219/orig 2025-03-14T05:31:37.2275050Z * [new branch] gh/XuehaiPan/221/base -> origin/gh/XuehaiPan/221/base 2025-03-14T05:31:37.2276674Z * [new branch] gh/XuehaiPan/221/head -> origin/gh/XuehaiPan/221/head 2025-03-14T05:31:37.2278390Z * [new branch] gh/XuehaiPan/221/orig -> origin/gh/XuehaiPan/221/orig 2025-03-14T05:31:37.2280746Z * [new branch] gh/XuehaiPan/222/base -> origin/gh/XuehaiPan/222/base 2025-03-14T05:31:37.2282355Z * [new branch] gh/XuehaiPan/222/head -> origin/gh/XuehaiPan/222/head 2025-03-14T05:31:37.2284269Z * [new branch] gh/XuehaiPan/222/orig -> origin/gh/XuehaiPan/222/orig 2025-03-14T05:31:37.2286643Z * [new branch] gh/XuehaiPan/223/base -> origin/gh/XuehaiPan/223/base 2025-03-14T05:31:37.2288296Z * [new branch] gh/XuehaiPan/223/head -> origin/gh/XuehaiPan/223/head 2025-03-14T05:31:37.2289945Z * [new branch] gh/XuehaiPan/223/orig -> origin/gh/XuehaiPan/223/orig 2025-03-14T05:31:37.2357461Z * [new branch] gh/XuehaiPan/224/base -> origin/gh/XuehaiPan/224/base 2025-03-14T05:31:37.2358096Z * [new branch] gh/XuehaiPan/224/head -> origin/gh/XuehaiPan/224/head 2025-03-14T05:31:37.2358670Z * [new branch] gh/XuehaiPan/224/orig -> origin/gh/XuehaiPan/224/orig 2025-03-14T05:31:37.2359217Z * [new branch] gh/XuehaiPan/225/base -> origin/gh/XuehaiPan/225/base 2025-03-14T05:31:37.2359762Z * [new branch] gh/XuehaiPan/225/head -> origin/gh/XuehaiPan/225/head 2025-03-14T05:31:37.2360309Z * [new branch] gh/XuehaiPan/225/orig -> origin/gh/XuehaiPan/225/orig 2025-03-14T05:31:37.2360847Z * [new branch] gh/XuehaiPan/226/base -> origin/gh/XuehaiPan/226/base 2025-03-14T05:31:37.2361388Z * [new branch] gh/XuehaiPan/226/head -> origin/gh/XuehaiPan/226/head 2025-03-14T05:31:37.2361926Z * [new branch] gh/XuehaiPan/226/orig -> origin/gh/XuehaiPan/226/orig 2025-03-14T05:31:37.2362469Z * [new branch] gh/XuehaiPan/227/base -> origin/gh/XuehaiPan/227/base 2025-03-14T05:31:37.2363004Z * [new branch] gh/XuehaiPan/227/head -> origin/gh/XuehaiPan/227/head 2025-03-14T05:31:37.2363833Z * [new branch] gh/XuehaiPan/227/orig -> origin/gh/XuehaiPan/227/orig 2025-03-14T05:31:37.2364428Z * [new branch] 
gh/XuehaiPan/228/base -> origin/gh/XuehaiPan/228/base 2025-03-14T05:31:37.2364965Z * [new branch] gh/XuehaiPan/228/head -> origin/gh/XuehaiPan/228/head 2025-03-14T05:31:37.2365498Z * [new branch] gh/XuehaiPan/228/orig -> origin/gh/XuehaiPan/228/orig 2025-03-14T05:31:37.2366038Z * [new branch] gh/XuehaiPan/229/base -> origin/gh/XuehaiPan/229/base 2025-03-14T05:31:37.2366576Z * [new branch] gh/XuehaiPan/229/head -> origin/gh/XuehaiPan/229/head 2025-03-14T05:31:37.2367119Z * [new branch] gh/XuehaiPan/229/orig -> origin/gh/XuehaiPan/229/orig 2025-03-14T05:31:37.2367670Z * [new branch] gh/XuehaiPan/230/base -> origin/gh/XuehaiPan/230/base 2025-03-14T05:31:37.2368416Z * [new branch] gh/XuehaiPan/230/head -> origin/gh/XuehaiPan/230/head 2025-03-14T05:31:37.2368964Z * [new branch] gh/XuehaiPan/230/orig -> origin/gh/XuehaiPan/230/orig 2025-03-14T05:31:37.2369498Z * [new branch] gh/XuehaiPan/231/base -> origin/gh/XuehaiPan/231/base 2025-03-14T05:31:37.2370035Z * [new branch] gh/XuehaiPan/231/head -> origin/gh/XuehaiPan/231/head 2025-03-14T05:31:37.2370584Z * [new branch] gh/XuehaiPan/231/orig -> origin/gh/XuehaiPan/231/orig 2025-03-14T05:31:37.2371121Z * [new branch] gh/XuehaiPan/232/base -> origin/gh/XuehaiPan/232/base 2025-03-14T05:31:37.2371657Z * [new branch] gh/XuehaiPan/232/head -> origin/gh/XuehaiPan/232/head 2025-03-14T05:31:37.2372200Z * [new branch] gh/XuehaiPan/232/orig -> origin/gh/XuehaiPan/232/orig 2025-03-14T05:31:37.2372734Z * [new branch] gh/XuehaiPan/233/base -> origin/gh/XuehaiPan/233/base 2025-03-14T05:31:37.2373277Z * [new branch] gh/XuehaiPan/233/head -> origin/gh/XuehaiPan/233/head 2025-03-14T05:31:37.2373817Z * [new branch] gh/XuehaiPan/233/orig -> origin/gh/XuehaiPan/233/orig 2025-03-14T05:31:37.2374356Z * [new branch] gh/XuehaiPan/234/base -> origin/gh/XuehaiPan/234/base 2025-03-14T05:31:37.2374893Z * [new branch] gh/XuehaiPan/234/head -> origin/gh/XuehaiPan/234/head 2025-03-14T05:31:37.2375426Z * [new branch] gh/XuehaiPan/234/orig -> origin/gh/XuehaiPan/234/orig 2025-03-14T05:31:37.2375962Z * [new branch] gh/XuehaiPan/236/base -> origin/gh/XuehaiPan/236/base 2025-03-14T05:31:37.2376496Z * [new branch] gh/XuehaiPan/236/head -> origin/gh/XuehaiPan/236/head 2025-03-14T05:31:37.2377219Z * [new branch] gh/XuehaiPan/236/orig -> origin/gh/XuehaiPan/236/orig 2025-03-14T05:31:37.2377755Z * [new branch] gh/XuehaiPan/237/base -> origin/gh/XuehaiPan/237/base 2025-03-14T05:31:37.2378295Z * [new branch] gh/XuehaiPan/237/head -> origin/gh/XuehaiPan/237/head 2025-03-14T05:31:37.2378838Z * [new branch] gh/XuehaiPan/237/orig -> origin/gh/XuehaiPan/237/orig 2025-03-14T05:31:37.2379381Z * [new branch] gh/XuehaiPan/238/base -> origin/gh/XuehaiPan/238/base 2025-03-14T05:31:37.2379921Z * [new branch] gh/XuehaiPan/238/head -> origin/gh/XuehaiPan/238/head 2025-03-14T05:31:37.2380460Z * [new branch] gh/XuehaiPan/238/orig -> origin/gh/XuehaiPan/238/orig 2025-03-14T05:31:37.2380998Z * [new branch] gh/XuehaiPan/239/base -> origin/gh/XuehaiPan/239/base 2025-03-14T05:31:37.2381542Z * [new branch] gh/XuehaiPan/239/head -> origin/gh/XuehaiPan/239/head 2025-03-14T05:31:37.2382086Z * [new branch] gh/XuehaiPan/239/orig -> origin/gh/XuehaiPan/239/orig 2025-03-14T05:31:37.2382640Z * [new branch] gh/XuehaiPan/240/base -> origin/gh/XuehaiPan/240/base 2025-03-14T05:31:37.2383302Z * [new branch] gh/XuehaiPan/240/head -> origin/gh/XuehaiPan/240/head 2025-03-14T05:31:37.2383840Z * [new branch] gh/XuehaiPan/240/orig -> origin/gh/XuehaiPan/240/orig 2025-03-14T05:31:37.2384636Z * [new branch] gh/XuehaiPan/241/base -> 
origin/gh/XuehaiPan/241/base 2025-03-14T05:31:37.2386507Z * [new branch] gh/XuehaiPan/241/head -> origin/gh/XuehaiPan/241/head 2025-03-14T05:31:37.2388189Z * [new branch] gh/XuehaiPan/241/orig -> origin/gh/XuehaiPan/241/orig 2025-03-14T05:31:37.2390569Z * [new branch] gh/XuehaiPan/242/base -> origin/gh/XuehaiPan/242/base 2025-03-14T05:31:37.2392191Z * [new branch] gh/XuehaiPan/242/head -> origin/gh/XuehaiPan/242/head 2025-03-14T05:31:37.2393965Z * [new branch] gh/XuehaiPan/242/orig -> origin/gh/XuehaiPan/242/orig 2025-03-14T05:31:37.2396395Z * [new branch] gh/XuehaiPan/243/base -> origin/gh/XuehaiPan/243/base 2025-03-14T05:31:37.2397976Z * [new branch] gh/XuehaiPan/243/head -> origin/gh/XuehaiPan/243/head 2025-03-14T05:31:37.2399571Z * [new branch] gh/XuehaiPan/243/orig -> origin/gh/XuehaiPan/243/orig 2025-03-14T05:31:37.2401882Z * [new branch] gh/XuehaiPan/244/base -> origin/gh/XuehaiPan/244/base 2025-03-14T05:31:37.2403509Z * [new branch] gh/XuehaiPan/244/head -> origin/gh/XuehaiPan/244/head 2025-03-14T05:31:37.2405157Z * [new branch] gh/XuehaiPan/244/orig -> origin/gh/XuehaiPan/244/orig 2025-03-14T05:31:37.2407612Z * [new branch] gh/XuehaiPan/245/base -> origin/gh/XuehaiPan/245/base 2025-03-14T05:31:37.2409254Z * [new branch] gh/XuehaiPan/245/head -> origin/gh/XuehaiPan/245/head 2025-03-14T05:31:37.2410933Z * [new branch] gh/XuehaiPan/245/orig -> origin/gh/XuehaiPan/245/orig 2025-03-14T05:31:37.2413255Z * [new branch] gh/XuehaiPan/246/base -> origin/gh/XuehaiPan/246/base 2025-03-14T05:31:37.2414893Z * [new branch] gh/XuehaiPan/246/head -> origin/gh/XuehaiPan/246/head 2025-03-14T05:31:37.2416557Z * [new branch] gh/XuehaiPan/246/orig -> origin/gh/XuehaiPan/246/orig 2025-03-14T05:31:37.2419011Z * [new branch] gh/XuehaiPan/247/base -> origin/gh/XuehaiPan/247/base 2025-03-14T05:31:37.2420893Z * [new branch] gh/XuehaiPan/247/head -> origin/gh/XuehaiPan/247/head 2025-03-14T05:31:37.2422422Z * [new branch] gh/XuehaiPan/247/orig -> origin/gh/XuehaiPan/247/orig 2025-03-14T05:31:37.2424711Z * [new branch] gh/XuehaiPan/248/base -> origin/gh/XuehaiPan/248/base 2025-03-14T05:31:37.2426339Z * [new branch] gh/XuehaiPan/248/head -> origin/gh/XuehaiPan/248/head 2025-03-14T05:31:37.2428053Z * [new branch] gh/XuehaiPan/248/orig -> origin/gh/XuehaiPan/248/orig 2025-03-14T05:31:37.2430418Z * [new branch] gh/XuehaiPan/249/base -> origin/gh/XuehaiPan/249/base 2025-03-14T05:31:37.2432064Z * [new branch] gh/XuehaiPan/249/head -> origin/gh/XuehaiPan/249/head 2025-03-14T05:31:37.2433769Z * [new branch] gh/XuehaiPan/249/orig -> origin/gh/XuehaiPan/249/orig 2025-03-14T05:31:37.2436424Z * [new branch] gh/XuehaiPan/250/base -> origin/gh/XuehaiPan/250/base 2025-03-14T05:31:37.2438162Z * [new branch] gh/XuehaiPan/250/head -> origin/gh/XuehaiPan/250/head 2025-03-14T05:31:37.2439712Z * [new branch] gh/XuehaiPan/250/orig -> origin/gh/XuehaiPan/250/orig 2025-03-14T05:31:37.2441985Z * [new branch] gh/XuehaiPan/251/base -> origin/gh/XuehaiPan/251/base 2025-03-14T05:31:37.2443569Z * [new branch] gh/XuehaiPan/251/head -> origin/gh/XuehaiPan/251/head 2025-03-14T05:31:37.2445371Z * [new branch] gh/XuehaiPan/251/orig -> origin/gh/XuehaiPan/251/orig 2025-03-14T05:31:37.2447644Z * [new branch] gh/XuehaiPan/252/base -> origin/gh/XuehaiPan/252/base 2025-03-14T05:31:37.2449255Z * [new branch] gh/XuehaiPan/252/head -> origin/gh/XuehaiPan/252/head 2025-03-14T05:31:37.2450951Z * [new branch] gh/XuehaiPan/252/orig -> origin/gh/XuehaiPan/252/orig 2025-03-14T05:31:37.2453330Z * [new branch] gh/XuehaiPan/253/base -> 
origin/gh/XuehaiPan/253/base 2025-03-14T05:31:37.2454981Z * [new branch] gh/XuehaiPan/253/head -> origin/gh/XuehaiPan/253/head 2025-03-14T05:31:37.2456651Z * [new branch] gh/XuehaiPan/253/orig -> origin/gh/XuehaiPan/253/orig 2025-03-14T05:31:37.2459094Z * [new branch] gh/XuehaiPan/254/base -> origin/gh/XuehaiPan/254/base 2025-03-14T05:31:37.2460756Z * [new branch] gh/XuehaiPan/254/head -> origin/gh/XuehaiPan/254/head 2025-03-14T05:31:37.2462346Z * [new branch] gh/XuehaiPan/254/orig -> origin/gh/XuehaiPan/254/orig 2025-03-14T05:31:37.2464721Z * [new branch] gh/XuehaiPan/255/base -> origin/gh/XuehaiPan/255/base 2025-03-14T05:31:37.2466336Z * [new branch] gh/XuehaiPan/255/head -> origin/gh/XuehaiPan/255/head 2025-03-14T05:31:37.2468315Z * [new branch] gh/XuehaiPan/255/orig -> origin/gh/XuehaiPan/255/orig 2025-03-14T05:31:37.2472101Z * [new branch] gh/XuehaiPan/256/base -> origin/gh/XuehaiPan/256/base 2025-03-14T05:31:37.2473699Z * [new branch] gh/XuehaiPan/256/head -> origin/gh/XuehaiPan/256/head 2025-03-14T05:31:37.2475525Z * [new branch] gh/XuehaiPan/256/orig -> origin/gh/XuehaiPan/256/orig 2025-03-14T05:31:37.2477953Z * [new branch] gh/XuehaiPan/257/base -> origin/gh/XuehaiPan/257/base 2025-03-14T05:31:37.2479556Z * [new branch] gh/XuehaiPan/257/head -> origin/gh/XuehaiPan/257/head 2025-03-14T05:31:37.2481224Z * [new branch] gh/XuehaiPan/257/orig -> origin/gh/XuehaiPan/257/orig 2025-03-14T05:31:37.2483580Z * [new branch] gh/XuehaiPan/258/base -> origin/gh/XuehaiPan/258/base 2025-03-14T05:31:37.2485179Z * [new branch] gh/XuehaiPan/258/head -> origin/gh/XuehaiPan/258/head 2025-03-14T05:31:37.2486785Z * [new branch] gh/XuehaiPan/258/orig -> origin/gh/XuehaiPan/258/orig 2025-03-14T05:31:37.2489144Z * [new branch] gh/XuehaiPan/259/base -> origin/gh/XuehaiPan/259/base 2025-03-14T05:31:37.2490729Z * [new branch] gh/XuehaiPan/259/head -> origin/gh/XuehaiPan/259/head 2025-03-14T05:31:37.2492459Z * [new branch] gh/XuehaiPan/259/orig -> origin/gh/XuehaiPan/259/orig 2025-03-14T05:31:37.2494842Z * [new branch] gh/XuehaiPan/260/base -> origin/gh/XuehaiPan/260/base 2025-03-14T05:31:37.2496486Z * [new branch] gh/XuehaiPan/260/head -> origin/gh/XuehaiPan/260/head 2025-03-14T05:31:37.2498264Z * [new branch] gh/XuehaiPan/260/orig -> origin/gh/XuehaiPan/260/orig 2025-03-14T05:31:37.2500623Z * [new branch] gh/XuehaiPan/261/base -> origin/gh/XuehaiPan/261/base 2025-03-14T05:31:37.2502276Z * [new branch] gh/XuehaiPan/261/head -> origin/gh/XuehaiPan/261/head 2025-03-14T05:31:37.2503931Z * [new branch] gh/XuehaiPan/261/orig -> origin/gh/XuehaiPan/261/orig 2025-03-14T05:31:37.2506394Z * [new branch] gh/XuehaiPan/30/base -> origin/gh/XuehaiPan/30/base 2025-03-14T05:31:37.2508051Z * [new branch] gh/XuehaiPan/30/head -> origin/gh/XuehaiPan/30/head 2025-03-14T05:31:37.2509713Z * [new branch] gh/XuehaiPan/30/orig -> origin/gh/XuehaiPan/30/orig 2025-03-14T05:31:37.2512100Z * [new branch] gh/XuehaiPan/72/base -> origin/gh/XuehaiPan/72/base 2025-03-14T05:31:37.2513957Z * [new branch] gh/XuehaiPan/72/head -> origin/gh/XuehaiPan/72/head 2025-03-14T05:31:37.2515344Z * [new branch] gh/XuehaiPan/72/orig -> origin/gh/XuehaiPan/72/orig 2025-03-14T05:31:37.2517870Z * [new branch] gh/XuehaiPan/9/base -> origin/gh/XuehaiPan/9/base 2025-03-14T05:31:37.2519553Z * [new branch] gh/XuehaiPan/9/orig -> origin/gh/XuehaiPan/9/orig 2025-03-14T05:31:37.2522009Z * [new branch] gh/XuehaiPan/97/base -> origin/gh/XuehaiPan/97/base 2025-03-14T05:31:37.2523666Z * [new branch] gh/XuehaiPan/97/head -> origin/gh/XuehaiPan/97/head 
2025-03-14T05:31:37.2525300Z * [new branch] gh/XuehaiPan/97/orig -> origin/gh/XuehaiPan/97/orig 2025-03-14T05:31:37.2527668Z * [new branch] gh/XuehaiPan/98/base -> origin/gh/XuehaiPan/98/base 2025-03-14T05:31:37.2529347Z * [new branch] gh/XuehaiPan/98/head -> origin/gh/XuehaiPan/98/head 2025-03-14T05:31:37.2530977Z * [new branch] gh/XuehaiPan/98/orig -> origin/gh/XuehaiPan/98/orig 2025-03-14T05:31:37.2533422Z * [new branch] gh/XuehaiPan/99/base -> origin/gh/XuehaiPan/99/base 2025-03-14T05:31:37.2535110Z * [new branch] gh/XuehaiPan/99/head -> origin/gh/XuehaiPan/99/head 2025-03-14T05:31:37.2536771Z * [new branch] gh/XuehaiPan/99/orig -> origin/gh/XuehaiPan/99/orig 2025-03-14T05:31:37.2539465Z * [new branch] gh/ZhiweiYan-96/23/base -> origin/gh/ZhiweiYan-96/23/base 2025-03-14T05:31:37.2541197Z * [new branch] gh/ZhiweiYan-96/23/head -> origin/gh/ZhiweiYan-96/23/head 2025-03-14T05:31:37.2542838Z * [new branch] gh/ZhiweiYan-96/23/orig -> origin/gh/ZhiweiYan-96/23/orig 2025-03-14T05:31:37.2545236Z * [new branch] gh/ZhiweiYan-96/27/base -> origin/gh/ZhiweiYan-96/27/base 2025-03-14T05:31:37.2546868Z * [new branch] gh/ZhiweiYan-96/27/head -> origin/gh/ZhiweiYan-96/27/head 2025-03-14T05:31:37.2548548Z * [new branch] gh/ZhiweiYan-96/27/orig -> origin/gh/ZhiweiYan-96/27/orig 2025-03-14T05:31:37.2550744Z * [new branch] gh/ZhiweiYan-96/29/base -> origin/gh/ZhiweiYan-96/29/base 2025-03-14T05:31:37.2552451Z * [new branch] gh/ZhiweiYan-96/29/head -> origin/gh/ZhiweiYan-96/29/head 2025-03-14T05:31:37.2554124Z * [new branch] gh/ZhiweiYan-96/29/orig -> origin/gh/ZhiweiYan-96/29/orig 2025-03-14T05:31:37.2556562Z * [new branch] gh/ZhiweiYan-96/30/base -> origin/gh/ZhiweiYan-96/30/base 2025-03-14T05:31:37.2558269Z * [new branch] gh/ZhiweiYan-96/30/head -> origin/gh/ZhiweiYan-96/30/head 2025-03-14T05:31:37.2559695Z * [new branch] gh/ZhiweiYan-96/30/orig -> origin/gh/ZhiweiYan-96/30/orig 2025-03-14T05:31:37.2562196Z * [new branch] gh/ZhiweiYan-96/31/base -> origin/gh/ZhiweiYan-96/31/base 2025-03-14T05:31:37.2563834Z * [new branch] gh/ZhiweiYan-96/31/head -> origin/gh/ZhiweiYan-96/31/head 2025-03-14T05:31:37.2565507Z * [new branch] gh/ZhiweiYan-96/31/orig -> origin/gh/ZhiweiYan-96/31/orig 2025-03-14T05:31:37.2568123Z * [new branch] gh/ZhiweiYan-96/32/base -> origin/gh/ZhiweiYan-96/32/base 2025-03-14T05:31:37.2569996Z * [new branch] gh/ZhiweiYan-96/32/head -> origin/gh/ZhiweiYan-96/32/head 2025-03-14T05:31:37.2571666Z * [new branch] gh/ZhiweiYan-96/32/orig -> origin/gh/ZhiweiYan-96/32/orig 2025-03-14T05:31:37.2573928Z * [new branch] gh/ZhiweiYan-96/33/base -> origin/gh/ZhiweiYan-96/33/base 2025-03-14T05:31:37.2575601Z * [new branch] gh/ZhiweiYan-96/33/head -> origin/gh/ZhiweiYan-96/33/head 2025-03-14T05:31:37.2577271Z * [new branch] gh/ZhiweiYan-96/33/orig -> origin/gh/ZhiweiYan-96/33/orig 2025-03-14T05:31:37.2579917Z * [new branch] gh/ZhiweiYan-96/38/base -> origin/gh/ZhiweiYan-96/38/base 2025-03-14T05:31:37.2581756Z * [new branch] gh/ZhiweiYan-96/38/head -> origin/gh/ZhiweiYan-96/38/head 2025-03-14T05:31:37.2583068Z * [new branch] gh/ZhiweiYan-96/38/orig -> origin/gh/ZhiweiYan-96/38/orig 2025-03-14T05:31:37.2585565Z * [new branch] gh/ZhiweiYan-96/39/base -> origin/gh/ZhiweiYan-96/39/base 2025-03-14T05:31:37.2587374Z * [new branch] gh/ZhiweiYan-96/39/head -> origin/gh/ZhiweiYan-96/39/head 2025-03-14T05:31:37.2588667Z * [new branch] gh/ZhiweiYan-96/39/orig -> origin/gh/ZhiweiYan-96/39/orig 2025-03-14T05:31:37.2591744Z * [new branch] gh/ZhiweiYan-96/40/base -> origin/gh/ZhiweiYan-96/40/base 
2025-03-14T05:31:37.2593392Z * [new branch] gh/ZhiweiYan-96/40/head -> origin/gh/ZhiweiYan-96/40/head 2025-03-14T05:31:37.2595191Z * [new branch] gh/ZhiweiYan-96/40/orig -> origin/gh/ZhiweiYan-96/40/orig 2025-03-14T05:31:37.2597617Z * [new branch] gh/ZhiweiYan-96/41/base -> origin/gh/ZhiweiYan-96/41/base 2025-03-14T05:31:37.2599286Z * [new branch] gh/ZhiweiYan-96/41/head -> origin/gh/ZhiweiYan-96/41/head 2025-03-14T05:31:37.2600908Z * [new branch] gh/ZhiweiYan-96/41/orig -> origin/gh/ZhiweiYan-96/41/orig 2025-03-14T05:31:37.2603212Z * [new branch] gh/ZhiweiYan-96/42/base -> origin/gh/ZhiweiYan-96/42/base 2025-03-14T05:31:37.2604866Z * [new branch] gh/ZhiweiYan-96/42/head -> origin/gh/ZhiweiYan-96/42/head 2025-03-14T05:31:37.2606499Z * [new branch] gh/ZhiweiYan-96/42/orig -> origin/gh/ZhiweiYan-96/42/orig 2025-03-14T05:31:37.2608690Z * [new branch] gh/ZhiweiYan-96/43/base -> origin/gh/ZhiweiYan-96/43/base 2025-03-14T05:31:37.2610366Z * [new branch] gh/ZhiweiYan-96/43/head -> origin/gh/ZhiweiYan-96/43/head 2025-03-14T05:31:37.2612077Z * [new branch] gh/ZhiweiYan-96/43/orig -> origin/gh/ZhiweiYan-96/43/orig 2025-03-14T05:31:37.2614422Z * [new branch] gh/ZhiweiYan-96/44/base -> origin/gh/ZhiweiYan-96/44/base 2025-03-14T05:31:37.2616075Z * [new branch] gh/ZhiweiYan-96/44/head -> origin/gh/ZhiweiYan-96/44/head 2025-03-14T05:31:37.2618380Z * [new branch] gh/ZhiweiYan-96/45/base -> origin/gh/ZhiweiYan-96/45/base 2025-03-14T05:31:37.2619990Z * [new branch] gh/ZhiweiYan-96/45/head -> origin/gh/ZhiweiYan-96/45/head 2025-03-14T05:31:37.2622262Z * [new branch] gh/ZhiweiYan-96/46/base -> origin/gh/ZhiweiYan-96/46/base 2025-03-14T05:31:37.2624011Z * [new branch] gh/ZhiweiYan-96/46/head -> origin/gh/ZhiweiYan-96/46/head 2025-03-14T05:31:37.2625458Z * [new branch] gh/ZhiweiYan-96/46/orig -> origin/gh/ZhiweiYan-96/46/orig 2025-03-14T05:31:37.2627919Z * [new branch] gh/ZhiweiYan-96/47/base -> origin/gh/ZhiweiYan-96/47/base 2025-03-14T05:31:37.2629541Z * [new branch] gh/ZhiweiYan-96/47/head -> origin/gh/ZhiweiYan-96/47/head 2025-03-14T05:31:37.2631216Z * [new branch] gh/ZhiweiYan-96/47/orig -> origin/gh/ZhiweiYan-96/47/orig 2025-03-14T05:31:37.2633438Z * [new branch] gh/ZhiweiYan-96/48/base -> origin/gh/ZhiweiYan-96/48/base 2025-03-14T05:31:37.2635334Z * [new branch] gh/ZhiweiYan-96/48/head -> origin/gh/ZhiweiYan-96/48/head 2025-03-14T05:31:37.2637047Z * [new branch] gh/ZhiweiYan-96/48/orig -> origin/gh/ZhiweiYan-96/48/orig 2025-03-14T05:31:37.2639861Z * [new branch] gh/ZhiweiYan-96/49/base -> origin/gh/ZhiweiYan-96/49/base 2025-03-14T05:31:37.2641697Z * [new branch] gh/ZhiweiYan-96/49/head -> origin/gh/ZhiweiYan-96/49/head 2025-03-14T05:31:37.2643889Z * [new branch] gh/ZhiweiYan-96/50/base -> origin/gh/ZhiweiYan-96/50/base 2025-03-14T05:31:37.2645540Z * [new branch] gh/ZhiweiYan-96/50/head -> origin/gh/ZhiweiYan-96/50/head 2025-03-14T05:31:37.2647366Z * [new branch] gh/ZhiweiYan-96/50/orig -> origin/gh/ZhiweiYan-96/50/orig 2025-03-14T05:31:37.2649531Z * [new branch] gh/ZhiweiYan-96/51/base -> origin/gh/ZhiweiYan-96/51/base 2025-03-14T05:31:37.2651190Z * [new branch] gh/ZhiweiYan-96/51/head -> origin/gh/ZhiweiYan-96/51/head 2025-03-14T05:31:37.2652641Z * [new branch] gh/ZhiweiYan-96/51/orig -> origin/gh/ZhiweiYan-96/51/orig 2025-03-14T05:31:37.2655117Z * [new branch] gh/ZhiweiYan-96/52/base -> origin/gh/ZhiweiYan-96/52/base 2025-03-14T05:31:37.2656819Z * [new branch] gh/ZhiweiYan-96/52/head -> origin/gh/ZhiweiYan-96/52/head 2025-03-14T05:31:37.2658563Z * [new branch] gh/ZhiweiYan-96/52/orig -> 
origin/gh/ZhiweiYan-96/52/orig 2025-03-14T05:31:37.2660891Z * [new branch] gh/ZhiweiYan-96/53/base -> origin/gh/ZhiweiYan-96/53/base 2025-03-14T05:31:37.2662631Z * [new branch] gh/ZhiweiYan-96/53/head -> origin/gh/ZhiweiYan-96/53/head 2025-03-14T05:31:37.2664032Z * [new branch] gh/ZhiweiYan-96/53/orig -> origin/gh/ZhiweiYan-96/53/orig 2025-03-14T05:31:37.2666569Z * [new branch] gh/ZhiweiYan-96/54/base -> origin/gh/ZhiweiYan-96/54/base 2025-03-14T05:31:37.2668266Z * [new branch] gh/ZhiweiYan-96/54/head -> origin/gh/ZhiweiYan-96/54/head 2025-03-14T05:31:37.2670261Z * [new branch] gh/ZhiweiYan-96/54/orig -> origin/gh/ZhiweiYan-96/54/orig 2025-03-14T05:31:37.2673448Z * [new branch] gh/aakhundov/1/base -> origin/gh/aakhundov/1/base 2025-03-14T05:31:37.2674881Z * [new branch] gh/aakhundov/1/head -> origin/gh/aakhundov/1/head 2025-03-14T05:31:37.2677197Z * [new branch] gh/aakhundov/2/base -> origin/gh/aakhundov/2/base 2025-03-14T05:31:37.2678618Z * [new branch] gh/aakhundov/2/head -> origin/gh/aakhundov/2/head 2025-03-14T05:31:37.2681211Z * [new branch] gh/aditew01/openblas -> origin/gh/aditew01/openblas 2025-03-14T05:31:37.2682979Z * [new branch] gh/aditew01/sbgemm -> origin/gh/aditew01/sbgemm 2025-03-14T05:31:37.2684988Z * [new branch] gh/aditew01/vecbf16 -> origin/gh/aditew01/vecbf16 2025-03-14T05:31:37.2687652Z * [new branch] gh/albanD/3/base -> origin/gh/albanD/3/base 2025-03-14T05:31:37.2689372Z * [new branch] gh/albanD/3/head -> origin/gh/albanD/3/head 2025-03-14T05:31:37.2691015Z * [new branch] gh/albanD/3/orig -> origin/gh/albanD/3/orig 2025-03-14T05:31:37.2693425Z * [new branch] gh/alexbrauckmann/paddedtensor_init -> origin/gh/alexbrauckmann/paddedtensor_init 2025-03-14T05:31:37.2696166Z * [new branch] gh/alexsamardzic/25/base -> origin/gh/alexsamardzic/25/base 2025-03-14T05:31:37.2697575Z * [new branch] gh/alexsamardzic/25/head -> origin/gh/alexsamardzic/25/head 2025-03-14T05:31:37.2699518Z * [new branch] gh/alexsamardzic/25/orig -> origin/gh/alexsamardzic/25/orig 2025-03-14T05:31:37.2701701Z * [new branch] gh/alexsamardzic/26/base -> origin/gh/alexsamardzic/26/base 2025-03-14T05:31:37.2703350Z * [new branch] gh/alexsamardzic/26/head -> origin/gh/alexsamardzic/26/head 2025-03-14T05:31:37.2704883Z * [new branch] gh/alexsamardzic/26/orig -> origin/gh/alexsamardzic/26/orig 2025-03-14T05:31:37.2707803Z * [new branch] gh/amjames/18/base -> origin/gh/amjames/18/base 2025-03-14T05:31:37.2709490Z * [new branch] gh/amjames/18/head -> origin/gh/amjames/18/head 2025-03-14T05:31:37.2711114Z * [new branch] gh/amjames/18/orig -> origin/gh/amjames/18/orig 2025-03-14T05:31:37.2713359Z * [new branch] gh/amjames/19/base -> origin/gh/amjames/19/base 2025-03-14T05:31:37.2715261Z * [new branch] gh/amjames/19/head -> origin/gh/amjames/19/head 2025-03-14T05:31:37.2716568Z * [new branch] gh/amjames/19/orig -> origin/gh/amjames/19/orig 2025-03-14T05:31:37.2719044Z * [new branch] gh/amjames/20/base -> origin/gh/amjames/20/base 2025-03-14T05:31:37.2720642Z * [new branch] gh/amjames/20/head -> origin/gh/amjames/20/head 2025-03-14T05:31:37.2722391Z * [new branch] gh/amjames/20/orig -> origin/gh/amjames/20/orig 2025-03-14T05:31:37.2725206Z * [new branch] gh/andrewlee302/1/base -> origin/gh/andrewlee302/1/base 2025-03-14T05:31:37.2726898Z * [new branch] gh/andrewlee302/1/head -> origin/gh/andrewlee302/1/head 2025-03-14T05:31:37.2729213Z * [new branch] gh/andrewlee302/3/base -> origin/gh/andrewlee302/3/base 2025-03-14T05:31:37.2730921Z * [new branch] gh/andrewlee302/3/head -> origin/gh/andrewlee302/3/head 
2025-03-14T05:31:37.2732707Z * [new branch] gh/andrewlee302/3/orig -> origin/gh/andrewlee302/3/orig 2025-03-14T05:31:37.2735628Z * [new branch] gh/andrewor14/35/base -> origin/gh/andrewor14/35/base 2025-03-14T05:31:37.2737109Z * [new branch] gh/andrewor14/35/head -> origin/gh/andrewor14/35/head 2025-03-14T05:31:37.2738971Z * [new branch] gh/andrewor14/35/orig -> origin/gh/andrewor14/35/orig 2025-03-14T05:31:37.2741464Z * [new branch] gh/andrewor14/36/base -> origin/gh/andrewor14/36/base 2025-03-14T05:31:37.2743382Z * [new branch] gh/andrewor14/36/head -> origin/gh/andrewor14/36/head 2025-03-14T05:31:37.2745035Z * [new branch] gh/andrewor14/36/orig -> origin/gh/andrewor14/36/orig 2025-03-14T05:31:37.2747480Z * [new branch] gh/andrewor14/37/base -> origin/gh/andrewor14/37/base 2025-03-14T05:31:37.2749145Z * [new branch] gh/andrewor14/37/head -> origin/gh/andrewor14/37/head 2025-03-14T05:31:37.2751143Z * [new branch] gh/andrewor14/37/orig -> origin/gh/andrewor14/37/orig 2025-03-14T05:31:37.2753283Z * [new branch] gh/andrewor14/50/base -> origin/gh/andrewor14/50/base 2025-03-14T05:31:37.2755146Z * [new branch] gh/andrewor14/50/head -> origin/gh/andrewor14/50/head 2025-03-14T05:31:37.2756996Z * [new branch] gh/andrewor14/50/orig -> origin/gh/andrewor14/50/orig 2025-03-14T05:31:37.2759649Z * [new branch] gh/angelayi/64/base -> origin/gh/angelayi/64/base 2025-03-14T05:31:37.2761301Z * [new branch] gh/angelayi/64/head -> origin/gh/angelayi/64/head 2025-03-14T05:31:37.2762670Z * [new branch] gh/angelayi/64/orig -> origin/gh/angelayi/64/orig 2025-03-14T05:31:37.2765634Z * [new branch] gh/angelayi/65/base -> origin/gh/angelayi/65/base 2025-03-14T05:31:37.2767601Z * [new branch] gh/angelayi/65/head -> origin/gh/angelayi/65/head 2025-03-14T05:31:37.2769564Z * [new branch] gh/angelayi/65/orig -> origin/gh/angelayi/65/orig 2025-03-14T05:31:37.2772180Z * [new branch] gh/angelayi/66/base -> origin/gh/angelayi/66/base 2025-03-14T05:31:37.2773424Z * [new branch] gh/angelayi/66/head -> origin/gh/angelayi/66/head 2025-03-14T05:31:37.2775295Z * [new branch] gh/angelayi/66/orig -> origin/gh/angelayi/66/orig 2025-03-14T05:31:37.2778165Z * [new branch] gh/angelayi/67/base -> origin/gh/angelayi/67/base 2025-03-14T05:31:37.2779249Z * [new branch] gh/angelayi/67/head -> origin/gh/angelayi/67/head 2025-03-14T05:31:37.2781102Z * [new branch] gh/angelayi/67/orig -> origin/gh/angelayi/67/orig 2025-03-14T05:31:37.2783426Z * [new branch] gh/angelayi/68/base -> origin/gh/angelayi/68/base 2025-03-14T05:31:37.2785270Z * [new branch] gh/angelayi/68/head -> origin/gh/angelayi/68/head 2025-03-14T05:31:37.2786593Z * [new branch] gh/angelayi/68/orig -> origin/gh/angelayi/68/orig 2025-03-14T05:31:37.2789238Z * [new branch] gh/angelayi/69/base -> origin/gh/angelayi/69/base 2025-03-14T05:31:37.2790617Z * [new branch] gh/angelayi/69/head -> origin/gh/angelayi/69/head 2025-03-14T05:31:37.2792478Z * [new branch] gh/angelayi/69/orig -> origin/gh/angelayi/69/orig 2025-03-14T05:31:37.2794895Z * [new branch] gh/angelayi/70/base -> origin/gh/angelayi/70/base 2025-03-14T05:31:37.2796540Z * [new branch] gh/angelayi/70/head -> origin/gh/angelayi/70/head 2025-03-14T05:31:37.2798008Z * [new branch] gh/angelayi/70/orig -> origin/gh/angelayi/70/orig 2025-03-14T05:31:37.2800707Z * [new branch] gh/angelayi/71/base -> origin/gh/angelayi/71/base 2025-03-14T05:31:37.2802743Z * [new branch] gh/angelayi/71/head -> origin/gh/angelayi/71/head 2025-03-14T05:31:37.2804079Z * [new branch] gh/angelayi/71/orig -> origin/gh/angelayi/71/orig 
2025-03-14T05:31:37.2806461Z * [new branch] gh/angelayi/72/base -> origin/gh/angelayi/72/base 2025-03-14T05:31:37.2808302Z * [new branch] gh/angelayi/72/head -> origin/gh/angelayi/72/head 2025-03-14T05:31:37.2809906Z * [new branch] gh/angelayi/72/orig -> origin/gh/angelayi/72/orig 2025-03-14T05:31:37.2812235Z * [new branch] gh/angelayi/73/base -> origin/gh/angelayi/73/base 2025-03-14T05:31:37.2813876Z * [new branch] gh/angelayi/73/head -> origin/gh/angelayi/73/head 2025-03-14T05:31:37.2815364Z * [new branch] gh/angelayi/73/orig -> origin/gh/angelayi/73/orig 2025-03-14T05:31:37.2817916Z * [new branch] gh/angelayi/74/base -> origin/gh/angelayi/74/base 2025-03-14T05:31:37.2819256Z * [new branch] gh/angelayi/74/head -> origin/gh/angelayi/74/head 2025-03-14T05:31:37.2821198Z * [new branch] gh/angelayi/74/orig -> origin/gh/angelayi/74/orig 2025-03-14T05:31:37.2823427Z * [new branch] gh/angelayi/75/base -> origin/gh/angelayi/75/base 2025-03-14T05:31:37.2825031Z * [new branch] gh/angelayi/75/head -> origin/gh/angelayi/75/head 2025-03-14T05:31:37.2826778Z * [new branch] gh/angelayi/75/orig -> origin/gh/angelayi/75/orig 2025-03-14T05:31:37.2828929Z * [new branch] gh/angelayi/76/base -> origin/gh/angelayi/76/base 2025-03-14T05:31:37.2830401Z * [new branch] gh/angelayi/76/head -> origin/gh/angelayi/76/head 2025-03-14T05:31:37.2832306Z * [new branch] gh/angelayi/76/orig -> origin/gh/angelayi/76/orig 2025-03-14T05:31:37.2835431Z * [new branch] gh/anijain2305/162/base -> origin/gh/anijain2305/162/base 2025-03-14T05:31:37.2837242Z * [new branch] gh/anijain2305/162/head -> origin/gh/anijain2305/162/head 2025-03-14T05:31:37.2839718Z * [new branch] gh/anijain2305/541/head -> origin/gh/anijain2305/541/head 2025-03-14T05:31:37.2841736Z * [new branch] gh/anijain2305/566/base -> origin/gh/anijain2305/566/base 2025-03-14T05:31:37.2843521Z * [new branch] gh/anijain2305/566/head -> origin/gh/anijain2305/566/head 2025-03-14T05:31:37.2844944Z * [new branch] gh/anijain2305/566/orig -> origin/gh/anijain2305/566/orig 2025-03-14T05:31:37.2847658Z * [new branch] gh/anijain2305/571/base -> origin/gh/anijain2305/571/base 2025-03-14T05:31:37.2848970Z * [new branch] gh/anijain2305/571/head -> origin/gh/anijain2305/571/head 2025-03-14T05:31:37.2850770Z * [new branch] gh/anijain2305/571/orig -> origin/gh/anijain2305/571/orig 2025-03-14T05:31:37.2853237Z * [new branch] gh/anijain2305/580/base -> origin/gh/anijain2305/580/base 2025-03-14T05:31:37.2854675Z * [new branch] gh/anijain2305/580/head -> origin/gh/anijain2305/580/head 2025-03-14T05:31:37.2856635Z * [new branch] gh/anijain2305/580/orig -> origin/gh/anijain2305/580/orig 2025-03-14T05:31:37.2858997Z * [new branch] gh/anijain2305/620/base -> origin/gh/anijain2305/620/base 2025-03-14T05:31:37.2860691Z * [new branch] gh/anijain2305/620/head -> origin/gh/anijain2305/620/head 2025-03-14T05:31:37.2862440Z * [new branch] gh/anijain2305/620/orig -> origin/gh/anijain2305/620/orig 2025-03-14T05:31:37.2864767Z * [new branch] gh/anijain2305/634/base -> origin/gh/anijain2305/634/base 2025-03-14T05:31:37.2866433Z * [new branch] gh/anijain2305/634/head -> origin/gh/anijain2305/634/head 2025-03-14T05:31:37.2868153Z * [new branch] gh/anijain2305/634/orig -> origin/gh/anijain2305/634/orig 2025-03-14T05:31:37.2873786Z * [new branch] gh/anijain2305/668/base -> origin/gh/anijain2305/668/base 2025-03-14T05:31:37.2875566Z * [new branch] gh/anijain2305/668/head -> origin/gh/anijain2305/668/head 2025-03-14T05:31:37.2877055Z * [new branch] gh/anijain2305/668/orig -> origin/gh/anijain2305/668/orig 
2025-03-14T05:31:37.2879650Z * [new branch] gh/anijain2305/669/base -> origin/gh/anijain2305/669/base 2025-03-14T05:31:37.2881217Z * [new branch] gh/anijain2305/669/head -> origin/gh/anijain2305/669/head 2025-03-14T05:31:37.2883487Z * [new branch] gh/anijain2305/669/orig -> origin/gh/anijain2305/669/orig 2025-03-14T05:31:37.2885864Z * [new branch] gh/anijain2305/675/base -> origin/gh/anijain2305/675/base 2025-03-14T05:31:37.2887329Z * [new branch] gh/anijain2305/675/head -> origin/gh/anijain2305/675/head 2025-03-14T05:31:37.2889236Z * [new branch] gh/anijain2305/675/orig -> origin/gh/anijain2305/675/orig 2025-03-14T05:31:37.2891764Z * [new branch] gh/anijain2305/677/base -> origin/gh/anijain2305/677/base 2025-03-14T05:31:37.2893186Z * [new branch] gh/anijain2305/677/head -> origin/gh/anijain2305/677/head 2025-03-14T05:31:37.2895029Z * [new branch] gh/anijain2305/677/orig -> origin/gh/anijain2305/677/orig 2025-03-14T05:31:37.2897348Z * [new branch] gh/anijain2305/679/base -> origin/gh/anijain2305/679/base 2025-03-14T05:31:37.2898980Z * [new branch] gh/anijain2305/679/head -> origin/gh/anijain2305/679/head 2025-03-14T05:31:37.2900429Z * [new branch] gh/anijain2305/679/orig -> origin/gh/anijain2305/679/orig 2025-03-14T05:31:37.2902777Z * [new branch] gh/anijain2305/680/base -> origin/gh/anijain2305/680/base 2025-03-14T05:31:37.2904425Z * [new branch] gh/anijain2305/680/head -> origin/gh/anijain2305/680/head 2025-03-14T05:31:37.2906288Z * [new branch] gh/anijain2305/680/orig -> origin/gh/anijain2305/680/orig 2025-03-14T05:31:37.2908278Z * [new branch] gh/anijain2305/681/base -> origin/gh/anijain2305/681/base 2025-03-14T05:31:37.2910012Z * [new branch] gh/anijain2305/681/head -> origin/gh/anijain2305/681/head 2025-03-14T05:31:37.2911465Z * [new branch] gh/anijain2305/681/orig -> origin/gh/anijain2305/681/orig 2025-03-14T05:31:37.2914186Z * [new branch] gh/anijain2305/682/base -> origin/gh/anijain2305/682/base 2025-03-14T05:31:37.2915967Z * [new branch] gh/anijain2305/682/head -> origin/gh/anijain2305/682/head 2025-03-14T05:31:37.2917302Z * [new branch] gh/anijain2305/682/orig -> origin/gh/anijain2305/682/orig 2025-03-14T05:31:37.2919646Z * [new branch] gh/anijain2305/683/base -> origin/gh/anijain2305/683/base 2025-03-14T05:31:37.2921567Z * [new branch] gh/anijain2305/683/head -> origin/gh/anijain2305/683/head 2025-03-14T05:31:37.2922803Z * [new branch] gh/anijain2305/683/orig -> origin/gh/anijain2305/683/orig 2025-03-14T05:31:37.2925980Z * [new branch] gh/anijain2305/684/base -> origin/gh/anijain2305/684/base 2025-03-14T05:31:37.2927416Z * [new branch] gh/anijain2305/684/head -> origin/gh/anijain2305/684/head 2025-03-14T05:31:37.2929383Z * [new branch] gh/anijain2305/684/orig -> origin/gh/anijain2305/684/orig 2025-03-14T05:31:37.2932134Z * [new branch] gh/anijain2305/685/base -> origin/gh/anijain2305/685/base 2025-03-14T05:31:37.2933370Z * [new branch] gh/anijain2305/685/head -> origin/gh/anijain2305/685/head 2025-03-14T05:31:37.2935168Z * [new branch] gh/anijain2305/685/orig -> origin/gh/anijain2305/685/orig 2025-03-14T05:31:37.2937545Z * [new branch] gh/anijain2305/686/base -> origin/gh/anijain2305/686/base 2025-03-14T05:31:37.2939023Z * [new branch] gh/anijain2305/686/head -> origin/gh/anijain2305/686/head 2025-03-14T05:31:37.2950982Z * [new branch] gh/anijain2305/686/orig -> origin/gh/anijain2305/686/orig 2025-03-14T05:31:37.2951760Z * [new branch] gh/anijain2305/687/base -> origin/gh/anijain2305/687/base 2025-03-14T05:31:37.2952407Z * [new branch] gh/anijain2305/687/head -> 
origin/gh/anijain2305/687/head 2025-03-14T05:31:37.2952975Z * [new branch] gh/anijain2305/687/orig -> origin/gh/anijain2305/687/orig 2025-03-14T05:31:37.2953539Z * [new branch] gh/anijain2305/688/base -> origin/gh/anijain2305/688/base 2025-03-14T05:31:37.2954110Z * [new branch] gh/anijain2305/688/head -> origin/gh/anijain2305/688/head 2025-03-14T05:31:37.2954910Z * [new branch] gh/anijain2305/688/orig -> origin/gh/anijain2305/688/orig 2025-03-14T05:31:37.2955471Z * [new branch] gh/anijain2305/689/base -> origin/gh/anijain2305/689/base 2025-03-14T05:31:37.2956238Z * [new branch] gh/anijain2305/689/head -> origin/gh/anijain2305/689/head 2025-03-14T05:31:37.2957820Z * [new branch] gh/anijain2305/689/orig -> origin/gh/anijain2305/689/orig 2025-03-14T05:31:37.2960011Z * [new branch] gh/anijain2305/690/base -> origin/gh/anijain2305/690/base 2025-03-14T05:31:37.2961722Z * [new branch] gh/anijain2305/690/head -> origin/gh/anijain2305/690/head 2025-03-14T05:31:37.2963337Z * [new branch] gh/anijain2305/690/orig -> origin/gh/anijain2305/690/orig 2025-03-14T05:31:37.2965810Z * [new branch] gh/anijain2305/691/base -> origin/gh/anijain2305/691/base 2025-03-14T05:31:37.2967607Z * [new branch] gh/anijain2305/691/head -> origin/gh/anijain2305/691/head 2025-03-14T05:31:37.2969648Z * [new branch] gh/anijain2305/691/orig -> origin/gh/anijain2305/691/orig 2025-03-14T05:31:37.2972002Z * [new branch] gh/anijain2305/692/base -> origin/gh/anijain2305/692/base 2025-03-14T05:31:37.2973676Z * [new branch] gh/anijain2305/692/head -> origin/gh/anijain2305/692/head 2025-03-14T05:31:37.2975407Z * [new branch] gh/anijain2305/692/orig -> origin/gh/anijain2305/692/orig 2025-03-14T05:31:37.2977811Z * [new branch] gh/anijain2305/693/base -> origin/gh/anijain2305/693/base 2025-03-14T05:31:37.2979488Z * [new branch] gh/anijain2305/693/head -> origin/gh/anijain2305/693/head 2025-03-14T05:31:37.2981135Z * [new branch] gh/anijain2305/693/orig -> origin/gh/anijain2305/693/orig 2025-03-14T05:31:37.2983612Z * [new branch] gh/anijain2305/694/base -> origin/gh/anijain2305/694/base 2025-03-14T05:31:37.2985527Z * [new branch] gh/anijain2305/694/head -> origin/gh/anijain2305/694/head 2025-03-14T05:31:37.2987140Z * [new branch] gh/anijain2305/694/orig -> origin/gh/anijain2305/694/orig 2025-03-14T05:31:37.2989724Z * [new branch] gh/anijain2305/695/base -> origin/gh/anijain2305/695/base 2025-03-14T05:31:37.2990948Z * [new branch] gh/anijain2305/695/head -> origin/gh/anijain2305/695/head 2025-03-14T05:31:37.2992813Z * [new branch] gh/anijain2305/695/orig -> origin/gh/anijain2305/695/orig 2025-03-14T05:31:37.2995174Z * [new branch] gh/anijain2305/696/base -> origin/gh/anijain2305/696/base 2025-03-14T05:31:37.2996873Z * [new branch] gh/anijain2305/696/head -> origin/gh/anijain2305/696/head 2025-03-14T05:31:37.2998658Z * [new branch] gh/anijain2305/696/orig -> origin/gh/anijain2305/696/orig 2025-03-14T05:31:37.3001127Z * [new branch] gh/anijain2305/697/base -> origin/gh/anijain2305/697/base 2025-03-14T05:31:37.3002746Z * [new branch] gh/anijain2305/697/head -> origin/gh/anijain2305/697/head 2025-03-14T05:31:37.3004772Z * [new branch] gh/anijain2305/697/orig -> origin/gh/anijain2305/697/orig 2025-03-14T05:31:37.3006858Z * [new branch] gh/anijain2305/698/base -> origin/gh/anijain2305/698/base 2025-03-14T05:31:37.3008528Z * [new branch] gh/anijain2305/698/head -> origin/gh/anijain2305/698/head 2025-03-14T05:31:37.3010354Z * [new branch] gh/anijain2305/698/orig -> origin/gh/anijain2305/698/orig 2025-03-14T05:31:37.3012532Z * [new branch] 
gh/anijain2305/699/base -> origin/gh/anijain2305/699/base 2025-03-14T05:31:37.3014333Z * [new branch] gh/anijain2305/699/head -> origin/gh/anijain2305/699/head 2025-03-14T05:31:37.3015905Z * [new branch] gh/anijain2305/699/orig -> origin/gh/anijain2305/699/orig 2025-03-14T05:31:37.3018151Z * [new branch] gh/anijain2305/700/base -> origin/gh/anijain2305/700/base 2025-03-14T05:31:37.3019777Z * [new branch] gh/anijain2305/700/head -> origin/gh/anijain2305/700/head 2025-03-14T05:31:37.3021574Z * [new branch] gh/anijain2305/700/orig -> origin/gh/anijain2305/700/orig 2025-03-14T05:31:37.3024500Z * [new branch] gh/anjali411/216/base -> origin/gh/anjali411/216/base 2025-03-14T05:31:37.3026618Z * [new branch] gh/anjali411/216/head -> origin/gh/anjali411/216/head 2025-03-14T05:31:37.3028265Z * [new branch] gh/anjali411/216/orig -> origin/gh/anjali411/216/orig 2025-03-14T05:31:37.3031187Z * [new branch] gh/aorenste/132/base -> origin/gh/aorenste/132/base 2025-03-14T05:31:37.3032809Z * [new branch] gh/aorenste/132/head -> origin/gh/aorenste/132/head 2025-03-14T05:31:37.3035372Z * [new branch] gh/aorenste/141/base -> origin/gh/aorenste/141/base 2025-03-14T05:31:37.3037019Z * [new branch] gh/aorenste/141/head -> origin/gh/aorenste/141/head 2025-03-14T05:31:37.3038680Z * [new branch] gh/aorenste/141/orig -> origin/gh/aorenste/141/orig 2025-03-14T05:31:37.3041046Z * [new branch] gh/aorenste/213/base -> origin/gh/aorenste/213/base 2025-03-14T05:31:37.3042823Z * [new branch] gh/aorenste/213/head -> origin/gh/aorenste/213/head 2025-03-14T05:31:37.3044558Z * [new branch] gh/aorenste/213/orig -> origin/gh/aorenste/213/orig 2025-03-14T05:31:37.3047098Z * [new branch] gh/aorenste/214/base -> origin/gh/aorenste/214/base 2025-03-14T05:31:37.3048718Z * [new branch] gh/aorenste/214/head -> origin/gh/aorenste/214/head 2025-03-14T05:31:37.3050383Z * [new branch] gh/aorenste/214/orig -> origin/gh/aorenste/214/orig 2025-03-14T05:31:37.3052657Z * [new branch] gh/aorenste/215/base -> origin/gh/aorenste/215/base 2025-03-14T05:31:37.3054417Z * [new branch] gh/aorenste/215/head -> origin/gh/aorenste/215/head 2025-03-14T05:31:37.3056126Z * [new branch] gh/aorenste/215/orig -> origin/gh/aorenste/215/orig 2025-03-14T05:31:37.3058472Z * [new branch] gh/aorenste/216/base -> origin/gh/aorenste/216/base 2025-03-14T05:31:37.3060148Z * [new branch] gh/aorenste/216/head -> origin/gh/aorenste/216/head 2025-03-14T05:31:37.3061829Z * [new branch] gh/aorenste/216/orig -> origin/gh/aorenste/216/orig 2025-03-14T05:31:37.3064145Z * [new branch] gh/aorenste/217/base -> origin/gh/aorenste/217/base 2025-03-14T05:31:37.3065853Z * [new branch] gh/aorenste/217/head -> origin/gh/aorenste/217/head 2025-03-14T05:31:37.3067491Z * [new branch] gh/aorenste/217/orig -> origin/gh/aorenste/217/orig 2025-03-14T05:31:37.3070202Z * [new branch] gh/aorenste/218/base -> origin/gh/aorenste/218/base 2025-03-14T05:31:37.3071849Z * [new branch] gh/aorenste/218/head -> origin/gh/aorenste/218/head 2025-03-14T05:31:37.3073520Z * [new branch] gh/aorenste/218/orig -> origin/gh/aorenste/218/orig 2025-03-14T05:31:37.3075985Z * [new branch] gh/aorenste/219/base -> origin/gh/aorenste/219/base 2025-03-14T05:31:37.3077707Z * [new branch] gh/aorenste/219/head -> origin/gh/aorenste/219/head 2025-03-14T05:31:37.3079400Z * [new branch] gh/aorenste/219/orig -> origin/gh/aorenste/219/orig 2025-03-14T05:31:37.3081840Z * [new branch] gh/aorenste/220/base -> origin/gh/aorenste/220/base 2025-03-14T05:31:37.3083541Z * [new branch] gh/aorenste/220/head -> origin/gh/aorenste/220/head 
2025-03-14T05:31:37.3085271Z * [new branch] gh/aorenste/220/orig -> origin/gh/aorenste/220/orig 2025-03-14T05:31:37.3087566Z * [new branch] gh/aorenste/221/base -> origin/gh/aorenste/221/base 2025-03-14T05:31:37.3089302Z * [new branch] gh/aorenste/221/head -> origin/gh/aorenste/221/head 2025-03-14T05:31:37.3090983Z * [new branch] gh/aorenste/221/orig -> origin/gh/aorenste/221/orig 2025-03-14T05:31:37.3093372Z * [new branch] gh/aorenste/222/base -> origin/gh/aorenste/222/base 2025-03-14T05:31:37.3095110Z * [new branch] gh/aorenste/222/head -> origin/gh/aorenste/222/head 2025-03-14T05:31:37.3096713Z * [new branch] gh/aorenste/222/orig -> origin/gh/aorenste/222/orig 2025-03-14T05:31:37.3099570Z * [new branch] gh/avikchaudhuri/39/base -> origin/gh/avikchaudhuri/39/base 2025-03-14T05:31:37.3101252Z * [new branch] gh/avikchaudhuri/39/head -> origin/gh/avikchaudhuri/39/head 2025-03-14T05:31:37.3102979Z * [new branch] gh/avikchaudhuri/39/orig -> origin/gh/avikchaudhuri/39/orig 2025-03-14T05:31:37.3105318Z * [new branch] gh/avikchaudhuri/54/base -> origin/gh/avikchaudhuri/54/base 2025-03-14T05:31:37.3107079Z * [new branch] gh/avikchaudhuri/54/head -> origin/gh/avikchaudhuri/54/head 2025-03-14T05:31:37.3108714Z * [new branch] gh/avikchaudhuri/54/orig -> origin/gh/avikchaudhuri/54/orig 2025-03-14T05:31:37.3111058Z * [new branch] gh/avikchaudhuri/55/base -> origin/gh/avikchaudhuri/55/base 2025-03-14T05:31:37.3112722Z * [new branch] gh/avikchaudhuri/55/head -> origin/gh/avikchaudhuri/55/head 2025-03-14T05:31:37.3114441Z * [new branch] gh/avikchaudhuri/55/orig -> origin/gh/avikchaudhuri/55/orig 2025-03-14T05:31:37.3116809Z * [new branch] gh/avikchaudhuri/56/base -> origin/gh/avikchaudhuri/56/base 2025-03-14T05:31:37.3118485Z * [new branch] gh/avikchaudhuri/56/head -> origin/gh/avikchaudhuri/56/head 2025-03-14T05:31:37.3120141Z * [new branch] gh/avikchaudhuri/56/orig -> origin/gh/avikchaudhuri/56/orig 2025-03-14T05:31:37.3123060Z * [new branch] gh/bdhirsh/604/base -> origin/gh/bdhirsh/604/base 2025-03-14T05:31:37.3124942Z * [new branch] gh/bdhirsh/604/head -> origin/gh/bdhirsh/604/head 2025-03-14T05:31:37.3126456Z * [new branch] gh/bdhirsh/604/orig -> origin/gh/bdhirsh/604/orig 2025-03-14T05:31:37.3128760Z * [new branch] gh/bdhirsh/626/base -> origin/gh/bdhirsh/626/base 2025-03-14T05:31:37.3130444Z * [new branch] gh/bdhirsh/626/head -> origin/gh/bdhirsh/626/head 2025-03-14T05:31:37.3132104Z * [new branch] gh/bdhirsh/626/orig -> origin/gh/bdhirsh/626/orig 2025-03-14T05:31:37.3134770Z * [new branch] gh/bdhirsh/627/base -> origin/gh/bdhirsh/627/base 2025-03-14T05:31:37.3136552Z * [new branch] gh/bdhirsh/627/head -> origin/gh/bdhirsh/627/head 2025-03-14T05:31:37.3138220Z * [new branch] gh/bdhirsh/627/orig -> origin/gh/bdhirsh/627/orig 2025-03-14T05:31:37.3140517Z * [new branch] gh/bdhirsh/630/base -> origin/gh/bdhirsh/630/base 2025-03-14T05:31:37.3142200Z * [new branch] gh/bdhirsh/630/head -> origin/gh/bdhirsh/630/head 2025-03-14T05:31:37.3143890Z * [new branch] gh/bdhirsh/630/orig -> origin/gh/bdhirsh/630/orig 2025-03-14T05:31:37.3146325Z * [new branch] gh/bdhirsh/635/base -> origin/gh/bdhirsh/635/base 2025-03-14T05:31:37.3147985Z * [new branch] gh/bdhirsh/635/head -> origin/gh/bdhirsh/635/head 2025-03-14T05:31:37.3149696Z * [new branch] gh/bdhirsh/635/orig -> origin/gh/bdhirsh/635/orig 2025-03-14T05:31:37.3152237Z * [new branch] gh/bdhirsh/636/base -> origin/gh/bdhirsh/636/base 2025-03-14T05:31:37.3153877Z * [new branch] gh/bdhirsh/636/head -> origin/gh/bdhirsh/636/head 2025-03-14T05:31:37.3155685Z * [new 
branch] gh/bdhirsh/636/orig -> origin/gh/bdhirsh/636/orig 2025-03-14T05:31:37.3158170Z * [new branch] gh/bdhirsh/639/base -> origin/gh/bdhirsh/639/base 2025-03-14T05:31:37.3160021Z * [new branch] gh/bdhirsh/639/head -> origin/gh/bdhirsh/639/head 2025-03-14T05:31:37.3161617Z * [new branch] gh/bdhirsh/639/orig -> origin/gh/bdhirsh/639/orig 2025-03-14T05:31:37.3163965Z * [new branch] gh/bdhirsh/640/base -> origin/gh/bdhirsh/640/base 2025-03-14T05:31:37.3165727Z * [new branch] gh/bdhirsh/640/head -> origin/gh/bdhirsh/640/head 2025-03-14T05:31:37.3167484Z * [new branch] gh/bdhirsh/640/orig -> origin/gh/bdhirsh/640/orig 2025-03-14T05:31:37.3170229Z * [new branch] gh/bdhirsh/641/base -> origin/gh/bdhirsh/641/base 2025-03-14T05:31:37.3171977Z * [new branch] gh/bdhirsh/641/head -> origin/gh/bdhirsh/641/head 2025-03-14T05:31:37.3173598Z * [new branch] gh/bdhirsh/641/orig -> origin/gh/bdhirsh/641/orig 2025-03-14T05:31:37.3176217Z * [new branch] gh/bdhirsh/642/base -> origin/gh/bdhirsh/642/base 2025-03-14T05:31:37.3177989Z * [new branch] gh/bdhirsh/642/head -> origin/gh/bdhirsh/642/head 2025-03-14T05:31:37.3180004Z * [new branch] gh/bdhirsh/642/orig -> origin/gh/bdhirsh/642/orig 2025-03-14T05:31:37.3182413Z * [new branch] gh/bdhirsh/643/base -> origin/gh/bdhirsh/643/base 2025-03-14T05:31:37.3184274Z * [new branch] gh/bdhirsh/643/head -> origin/gh/bdhirsh/643/head 2025-03-14T05:31:37.3185854Z * [new branch] gh/bdhirsh/643/orig -> origin/gh/bdhirsh/643/orig 2025-03-14T05:31:37.3188162Z * [new branch] gh/bdhirsh/644/base -> origin/gh/bdhirsh/644/base 2025-03-14T05:31:37.3189873Z * [new branch] gh/bdhirsh/644/head -> origin/gh/bdhirsh/644/head 2025-03-14T05:31:37.3191557Z * [new branch] gh/bdhirsh/644/orig -> origin/gh/bdhirsh/644/orig 2025-03-14T05:31:37.3194095Z * [new branch] gh/bdhirsh/645/base -> origin/gh/bdhirsh/645/base 2025-03-14T05:31:37.3195874Z * [new branch] gh/bdhirsh/645/head -> origin/gh/bdhirsh/645/head 2025-03-14T05:31:37.3197505Z * [new branch] gh/bdhirsh/645/orig -> origin/gh/bdhirsh/645/orig 2025-03-14T05:31:37.3201445Z * [new branch] gh/bdhirsh/646/base -> origin/gh/bdhirsh/646/base 2025-03-14T05:31:37.3204952Z * [new branch] gh/bdhirsh/646/head -> origin/gh/bdhirsh/646/head 2025-03-14T05:31:37.3206633Z * [new branch] gh/bdhirsh/646/orig -> origin/gh/bdhirsh/646/orig 2025-03-14T05:31:37.3209588Z * [new branch] gh/benjaminglass1/51/base -> origin/gh/benjaminglass1/51/base 2025-03-14T05:31:37.3211277Z * [new branch] gh/benjaminglass1/51/head -> origin/gh/benjaminglass1/51/head 2025-03-14T05:31:37.3213033Z * [new branch] gh/benjaminglass1/51/orig -> origin/gh/benjaminglass1/51/orig 2025-03-14T05:31:37.3215291Z * [new branch] gh/benjaminglass1/52/base -> origin/gh/benjaminglass1/52/base 2025-03-14T05:31:37.3216953Z * [new branch] gh/benjaminglass1/52/head -> origin/gh/benjaminglass1/52/head 2025-03-14T05:31:37.3218609Z * [new branch] gh/benjaminglass1/52/orig -> origin/gh/benjaminglass1/52/orig 2025-03-14T05:31:37.3220850Z * [new branch] gh/benjaminglass1/63/base -> origin/gh/benjaminglass1/63/base 2025-03-14T05:31:37.3222512Z * [new branch] gh/benjaminglass1/63/head -> origin/gh/benjaminglass1/63/head 2025-03-14T05:31:37.3224185Z * [new branch] gh/benjaminglass1/63/orig -> origin/gh/benjaminglass1/63/orig 2025-03-14T05:31:37.3226495Z * [new branch] gh/benjaminglass1/64/base -> origin/gh/benjaminglass1/64/base 2025-03-14T05:31:37.3228160Z * [new branch] gh/benjaminglass1/64/head -> origin/gh/benjaminglass1/64/head 2025-03-14T05:31:37.3229903Z * [new branch] gh/benjaminglass1/64/orig -> 
origin/gh/benjaminglass1/64/orig 2025-03-14T05:31:37.3232210Z * [new branch] gh/benjaminglass1/65/base -> origin/gh/benjaminglass1/65/base 2025-03-14T05:31:37.3233940Z * [new branch] gh/benjaminglass1/65/head -> origin/gh/benjaminglass1/65/head 2025-03-14T05:31:37.3235779Z * [new branch] gh/benjaminglass1/65/orig -> origin/gh/benjaminglass1/65/orig 2025-03-14T05:31:37.3240511Z * [new branch] gh/benjaminglass1/66/base -> origin/gh/benjaminglass1/66/base 2025-03-14T05:31:37.3241312Z * [new branch] gh/benjaminglass1/66/head -> origin/gh/benjaminglass1/66/head 2025-03-14T05:31:37.3241682Z * [new branch] gh/benjaminglass1/66/orig -> origin/gh/benjaminglass1/66/orig 2025-03-14T05:31:37.3243831Z * [new branch] gh/benjaminglass1/67/base -> origin/gh/benjaminglass1/67/base 2025-03-14T05:31:37.3245419Z * [new branch] gh/benjaminglass1/67/head -> origin/gh/benjaminglass1/67/head 2025-03-14T05:31:37.3247105Z * [new branch] gh/benjaminglass1/67/orig -> origin/gh/benjaminglass1/67/orig 2025-03-14T05:31:37.3249386Z * [new branch] gh/benjaminglass1/68/base -> origin/gh/benjaminglass1/68/base 2025-03-14T05:31:37.3251015Z * [new branch] gh/benjaminglass1/68/head -> origin/gh/benjaminglass1/68/head 2025-03-14T05:31:37.3253048Z * [new branch] gh/benjaminglass1/68/orig -> origin/gh/benjaminglass1/68/orig 2025-03-14T05:31:37.3255088Z * [new branch] gh/benjaminglass1/69/base -> origin/gh/benjaminglass1/69/base 2025-03-14T05:31:37.3256747Z * [new branch] gh/benjaminglass1/69/head -> origin/gh/benjaminglass1/69/head 2025-03-14T05:31:37.3258368Z * [new branch] gh/benjaminglass1/69/orig -> origin/gh/benjaminglass1/69/orig 2025-03-14T05:31:37.3260723Z * [new branch] gh/benjaminglass1/70/base -> origin/gh/benjaminglass1/70/base 2025-03-14T05:31:37.3262457Z * [new branch] gh/benjaminglass1/70/head -> origin/gh/benjaminglass1/70/head 2025-03-14T05:31:37.3264036Z * [new branch] gh/benjaminglass1/70/orig -> origin/gh/benjaminglass1/70/orig 2025-03-14T05:31:37.3266388Z * [new branch] gh/benjaminglass1/71/base -> origin/gh/benjaminglass1/71/base 2025-03-14T05:31:37.3268398Z * [new branch] gh/benjaminglass1/71/head -> origin/gh/benjaminglass1/71/head 2025-03-14T05:31:37.3272623Z * [new branch] gh/benjaminglass1/71/orig -> origin/gh/benjaminglass1/71/orig 2025-03-14T05:31:37.3275049Z * [new branch] gh/benjaminglass1/72/base -> origin/gh/benjaminglass1/72/base 2025-03-14T05:31:37.3276707Z * [new branch] gh/benjaminglass1/72/head -> origin/gh/benjaminglass1/72/head 2025-03-14T05:31:37.3278396Z * [new branch] gh/benjaminglass1/72/orig -> origin/gh/benjaminglass1/72/orig 2025-03-14T05:31:37.3280706Z * [new branch] gh/benjaminglass1/73/base -> origin/gh/benjaminglass1/73/base 2025-03-14T05:31:37.3282299Z * [new branch] gh/benjaminglass1/73/head -> origin/gh/benjaminglass1/73/head 2025-03-14T05:31:37.3284021Z * [new branch] gh/benjaminglass1/73/orig -> origin/gh/benjaminglass1/73/orig 2025-03-14T05:31:37.3286292Z * [new branch] gh/benjaminglass1/74/base -> origin/gh/benjaminglass1/74/base 2025-03-14T05:31:37.3287922Z * [new branch] gh/benjaminglass1/74/head -> origin/gh/benjaminglass1/74/head 2025-03-14T05:31:37.3289580Z * [new branch] gh/benjaminglass1/74/orig -> origin/gh/benjaminglass1/74/orig 2025-03-14T05:31:37.3291863Z * [new branch] gh/benjaminglass1/75/base -> origin/gh/benjaminglass1/75/base 2025-03-14T05:31:37.3293503Z * [new branch] gh/benjaminglass1/75/head -> origin/gh/benjaminglass1/75/head 2025-03-14T05:31:37.3295213Z * [new branch] gh/benjaminglass1/75/orig -> origin/gh/benjaminglass1/75/orig 
2025-03-14T05:31:37.3297513Z * [new branch] gh/benjaminglass1/76/base -> origin/gh/benjaminglass1/76/base 2025-03-14T05:31:37.3299151Z * [new branch] gh/benjaminglass1/76/head -> origin/gh/benjaminglass1/76/head 2025-03-14T05:31:37.3300907Z * [new branch] gh/benjaminglass1/76/orig -> origin/gh/benjaminglass1/76/orig 2025-03-14T05:31:37.3303115Z * [new branch] gh/benjaminglass1/77/base -> origin/gh/benjaminglass1/77/base 2025-03-14T05:31:37.3304818Z * [new branch] gh/benjaminglass1/77/head -> origin/gh/benjaminglass1/77/head 2025-03-14T05:31:37.3306509Z * [new branch] gh/benjaminglass1/77/orig -> origin/gh/benjaminglass1/77/orig 2025-03-14T05:31:37.3309322Z * [new branch] gh/bertmaher/6/base -> origin/gh/bertmaher/6/base 2025-03-14T05:31:37.3310939Z * [new branch] gh/bertmaher/6/head -> origin/gh/bertmaher/6/head 2025-03-14T05:31:37.3312601Z * [new branch] gh/bertmaher/6/orig -> origin/gh/bertmaher/6/orig 2025-03-14T05:31:37.3315632Z * [new branch] gh/bobrenjc93/207/base -> origin/gh/bobrenjc93/207/base 2025-03-14T05:31:37.3317294Z * [new branch] gh/bobrenjc93/207/head -> origin/gh/bobrenjc93/207/head 2025-03-14T05:31:37.3318945Z * [new branch] gh/bobrenjc93/207/orig -> origin/gh/bobrenjc93/207/orig 2025-03-14T05:31:37.3321311Z * [new branch] gh/bobrenjc93/270/base -> origin/gh/bobrenjc93/270/base 2025-03-14T05:31:37.3323076Z * [new branch] gh/bobrenjc93/270/head -> origin/gh/bobrenjc93/270/head 2025-03-14T05:31:37.3324714Z * [new branch] gh/bobrenjc93/270/orig -> origin/gh/bobrenjc93/270/orig 2025-03-14T05:31:37.3327066Z * [new branch] gh/bobrenjc93/272/base -> origin/gh/bobrenjc93/272/base 2025-03-14T05:31:37.3328704Z * [new branch] gh/bobrenjc93/272/head -> origin/gh/bobrenjc93/272/head 2025-03-14T05:31:37.3330546Z * [new branch] gh/bobrenjc93/272/orig -> origin/gh/bobrenjc93/272/orig 2025-03-14T05:31:37.3332520Z * [new branch] gh/bobrenjc93/273/base -> origin/gh/bobrenjc93/273/base 2025-03-14T05:31:37.3334225Z * [new branch] gh/bobrenjc93/273/head -> origin/gh/bobrenjc93/273/head 2025-03-14T05:31:37.3335836Z * [new branch] gh/bobrenjc93/273/orig -> origin/gh/bobrenjc93/273/orig 2025-03-14T05:31:37.3337939Z * [new branch] gh/bobrenjc93/274/base -> origin/gh/bobrenjc93/274/base 2025-03-14T05:31:37.3339662Z * [new branch] gh/bobrenjc93/274/head -> origin/gh/bobrenjc93/274/head 2025-03-14T05:31:37.3341210Z * [new branch] gh/bobrenjc93/274/orig -> origin/gh/bobrenjc93/274/orig 2025-03-14T05:31:37.3343447Z * [new branch] gh/bobrenjc93/275/base -> origin/gh/bobrenjc93/275/base 2025-03-14T05:31:37.3345121Z * [new branch] gh/bobrenjc93/275/head -> origin/gh/bobrenjc93/275/head 2025-03-14T05:31:37.3346753Z * [new branch] gh/bobrenjc93/275/orig -> origin/gh/bobrenjc93/275/orig 2025-03-14T05:31:37.3348835Z * [new branch] gh/bobrenjc93/276/base -> origin/gh/bobrenjc93/276/base 2025-03-14T05:31:37.3350507Z * [new branch] gh/bobrenjc93/276/head -> origin/gh/bobrenjc93/276/head 2025-03-14T05:31:37.3352152Z * [new branch] gh/bobrenjc93/276/orig -> origin/gh/bobrenjc93/276/orig 2025-03-14T05:31:37.3354568Z * [new branch] gh/bobrenjc93/277/base -> origin/gh/bobrenjc93/277/base 2025-03-14T05:31:37.3356327Z * [new branch] gh/bobrenjc93/277/head -> origin/gh/bobrenjc93/277/head 2025-03-14T05:31:37.3358000Z * [new branch] gh/bobrenjc93/277/orig -> origin/gh/bobrenjc93/277/orig 2025-03-14T05:31:37.3360155Z * [new branch] gh/bobrenjc93/278/base -> origin/gh/bobrenjc93/278/base 2025-03-14T05:31:37.3361947Z * [new branch] gh/bobrenjc93/278/head -> origin/gh/bobrenjc93/278/head 2025-03-14T05:31:37.3363626Z * [new 
branch] gh/bobrenjc93/278/orig -> origin/gh/bobrenjc93/278/orig 2025-03-14T05:31:37.3366039Z * [new branch] gh/bobrenjc93/279/base -> origin/gh/bobrenjc93/279/base 2025-03-14T05:31:37.3367699Z * [new branch] gh/bobrenjc93/279/head -> origin/gh/bobrenjc93/279/head 2025-03-14T05:31:37.3369678Z * [new branch] gh/bobrenjc93/279/orig -> origin/gh/bobrenjc93/279/orig 2025-03-14T05:31:37.3372019Z * [new branch] gh/bobrenjc93/280/base -> origin/gh/bobrenjc93/280/base 2025-03-14T05:31:37.3373778Z * [new branch] gh/bobrenjc93/280/head -> origin/gh/bobrenjc93/280/head 2025-03-14T05:31:37.3375572Z * [new branch] gh/bobrenjc93/280/orig -> origin/gh/bobrenjc93/280/orig 2025-03-14T05:31:37.3377996Z * [new branch] gh/bobrenjc93/281/base -> origin/gh/bobrenjc93/281/base 2025-03-14T05:31:37.3379389Z * [new branch] gh/bobrenjc93/281/head -> origin/gh/bobrenjc93/281/head 2025-03-14T05:31:37.3381021Z * [new branch] gh/bobrenjc93/281/orig -> origin/gh/bobrenjc93/281/orig 2025-03-14T05:31:37.3383418Z * [new branch] gh/bobrenjc93/282/base -> origin/gh/bobrenjc93/282/base 2025-03-14T05:31:37.3385188Z * [new branch] gh/bobrenjc93/282/head -> origin/gh/bobrenjc93/282/head 2025-03-14T05:31:37.3386872Z * [new branch] gh/bobrenjc93/282/orig -> origin/gh/bobrenjc93/282/orig 2025-03-14T05:31:37.3389331Z * [new branch] gh/bobrenjc93/283/base -> origin/gh/bobrenjc93/283/base 2025-03-14T05:31:37.3391284Z * [new branch] gh/bobrenjc93/283/head -> origin/gh/bobrenjc93/283/head 2025-03-14T05:31:37.3392522Z * [new branch] gh/bobrenjc93/283/orig -> origin/gh/bobrenjc93/283/orig 2025-03-14T05:31:37.3395560Z * [new branch] gh/bobrenjc93/284/base -> origin/gh/bobrenjc93/284/base 2025-03-14T05:31:37.3397527Z * [new branch] gh/bobrenjc93/284/head -> origin/gh/bobrenjc93/284/head 2025-03-14T05:31:37.3398390Z * [new branch] gh/bobrenjc93/284/orig -> origin/gh/bobrenjc93/284/orig 2025-03-14T05:31:37.3401013Z * [new branch] gh/bobrenjc93/285/base -> origin/gh/bobrenjc93/285/base 2025-03-14T05:31:37.3402126Z * [new branch] gh/bobrenjc93/285/head -> origin/gh/bobrenjc93/285/head 2025-03-14T05:31:37.3404279Z * [new branch] gh/bobrenjc93/285/orig -> origin/gh/bobrenjc93/285/orig 2025-03-14T05:31:37.3407197Z * [new branch] gh/bobrenjc93/286/base -> origin/gh/bobrenjc93/286/base 2025-03-14T05:31:37.3408264Z * [new branch] gh/bobrenjc93/286/head -> origin/gh/bobrenjc93/286/head 2025-03-14T05:31:37.3410186Z * [new branch] gh/bobrenjc93/286/orig -> origin/gh/bobrenjc93/286/orig 2025-03-14T05:31:37.3412765Z * [new branch] gh/bobrenjc93/287/base -> origin/gh/bobrenjc93/287/base 2025-03-14T05:31:37.3414054Z * [new branch] gh/bobrenjc93/287/head -> origin/gh/bobrenjc93/287/head 2025-03-14T05:31:37.3416165Z * [new branch] gh/bobrenjc93/287/orig -> origin/gh/bobrenjc93/287/orig 2025-03-14T05:31:37.3418750Z * [new branch] gh/bobrenjc93/288/base -> origin/gh/bobrenjc93/288/base 2025-03-14T05:31:37.3419889Z * [new branch] gh/bobrenjc93/288/head -> origin/gh/bobrenjc93/288/head 2025-03-14T05:31:37.3422007Z * [new branch] gh/bobrenjc93/288/orig -> origin/gh/bobrenjc93/288/orig 2025-03-14T05:31:37.3424188Z * [new branch] gh/bobrenjc93/289/base -> origin/gh/bobrenjc93/289/base 2025-03-14T05:31:37.3426431Z * [new branch] gh/bobrenjc93/289/head -> origin/gh/bobrenjc93/289/head 2025-03-14T05:31:37.3427584Z * [new branch] gh/bobrenjc93/289/orig -> origin/gh/bobrenjc93/289/orig 2025-03-14T05:31:37.3430560Z * [new branch] gh/bobrenjc93/290/base -> origin/gh/bobrenjc93/290/base 2025-03-14T05:31:37.3431903Z * [new branch] gh/bobrenjc93/290/head -> 
origin/gh/bobrenjc93/290/head 2025-03-14T05:31:37.3434523Z * [new branch] gh/bobrenjc93/290/orig -> origin/gh/bobrenjc93/290/orig 2025-03-14T05:31:37.3436896Z * [new branch] gh/bobrenjc93/291/base -> origin/gh/bobrenjc93/291/base 2025-03-14T05:31:37.3438078Z * [new branch] gh/bobrenjc93/291/head -> origin/gh/bobrenjc93/291/head 2025-03-14T05:31:37.3440242Z * [new branch] gh/bobrenjc93/291/orig -> origin/gh/bobrenjc93/291/orig 2025-03-14T05:31:37.3442753Z * [new branch] gh/bobrenjc93/292/base -> origin/gh/bobrenjc93/292/base 2025-03-14T05:31:37.3443775Z * [new branch] gh/bobrenjc93/292/head -> origin/gh/bobrenjc93/292/head 2025-03-14T05:31:37.3445891Z * [new branch] gh/bobrenjc93/292/orig -> origin/gh/bobrenjc93/292/orig 2025-03-14T05:31:37.3448339Z * [new branch] gh/bobrenjc93/293/base -> origin/gh/bobrenjc93/293/base 2025-03-14T05:31:37.3449455Z * [new branch] gh/bobrenjc93/293/head -> origin/gh/bobrenjc93/293/head 2025-03-14T05:31:37.3451598Z * [new branch] gh/bobrenjc93/293/orig -> origin/gh/bobrenjc93/293/orig 2025-03-14T05:31:37.3454030Z * [new branch] gh/bobrenjc93/294/base -> origin/gh/bobrenjc93/294/base 2025-03-14T05:31:37.3455833Z * [new branch] gh/bobrenjc93/294/head -> origin/gh/bobrenjc93/294/head 2025-03-14T05:31:37.3457096Z * [new branch] gh/bobrenjc93/294/orig -> origin/gh/bobrenjc93/294/orig 2025-03-14T05:31:37.3459755Z * [new branch] gh/bobrenjc93/295/base -> origin/gh/bobrenjc93/295/base 2025-03-14T05:31:37.3461156Z * [new branch] gh/bobrenjc93/295/head -> origin/gh/bobrenjc93/295/head 2025-03-14T05:31:37.3462921Z * [new branch] gh/bobrenjc93/295/orig -> origin/gh/bobrenjc93/295/orig 2025-03-14T05:31:37.3465256Z * [new branch] gh/bobrenjc93/296/base -> origin/gh/bobrenjc93/296/base 2025-03-14T05:31:37.3466865Z * [new branch] gh/bobrenjc93/296/head -> origin/gh/bobrenjc93/296/head 2025-03-14T05:31:37.3468781Z * [new branch] gh/bobrenjc93/296/orig -> origin/gh/bobrenjc93/296/orig 2025-03-14T05:31:37.3471511Z * [new branch] gh/bobrenjc93/297/base -> origin/gh/bobrenjc93/297/base 2025-03-14T05:31:37.3473571Z * [new branch] gh/bobrenjc93/297/head -> origin/gh/bobrenjc93/297/head 2025-03-14T05:31:37.3475511Z * [new branch] gh/bobrenjc93/297/orig -> origin/gh/bobrenjc93/297/orig 2025-03-14T05:31:37.3477874Z * [new branch] gh/bobrenjc93/298/base -> origin/gh/bobrenjc93/298/base 2025-03-14T05:31:37.3479623Z * [new branch] gh/bobrenjc93/298/head -> origin/gh/bobrenjc93/298/head 2025-03-14T05:31:37.3481717Z * [new branch] gh/bobrenjc93/298/orig -> origin/gh/bobrenjc93/298/orig 2025-03-14T05:31:37.3484010Z * [new branch] gh/bobrenjc93/299/base -> origin/gh/bobrenjc93/299/base 2025-03-14T05:31:37.3485701Z * [new branch] gh/bobrenjc93/299/head -> origin/gh/bobrenjc93/299/head 2025-03-14T05:31:37.3487421Z * [new branch] gh/bobrenjc93/299/orig -> origin/gh/bobrenjc93/299/orig 2025-03-14T05:31:37.3490174Z * [new branch] gh/briancoutinho/2/base -> origin/gh/briancoutinho/2/base 2025-03-14T05:31:37.3491882Z * [new branch] gh/briancoutinho/2/head -> origin/gh/briancoutinho/2/head 2025-03-14T05:31:37.3494667Z * [new branch] gh/c00w/23/base -> origin/gh/c00w/23/base 2025-03-14T05:31:37.3496447Z * [new branch] gh/c00w/23/head -> origin/gh/c00w/23/head 2025-03-14T05:31:37.3498823Z * [new branch] gh/c00w/37/base -> origin/gh/c00w/37/base 2025-03-14T05:31:37.3500482Z * [new branch] gh/c00w/37/head -> origin/gh/c00w/37/head 2025-03-14T05:31:37.3502133Z * [new branch] gh/c00w/37/orig -> origin/gh/c00w/37/orig 2025-03-14T05:31:37.3504804Z * [new branch] gh/c00w/38/base -> origin/gh/c00w/38/base 
2025-03-14T05:31:37.3506352Z * [new branch] gh/c00w/38/head -> origin/gh/c00w/38/head 2025-03-14T05:31:37.3508045Z * [new branch] gh/c00w/38/orig -> origin/gh/c00w/38/orig 2025-03-14T05:31:37.3510505Z * [new branch] gh/c00w/39/base -> origin/gh/c00w/39/base 2025-03-14T05:31:37.3512148Z * [new branch] gh/c00w/39/head -> origin/gh/c00w/39/head 2025-03-14T05:31:37.3513853Z * [new branch] gh/c00w/39/orig -> origin/gh/c00w/39/orig 2025-03-14T05:31:37.3516394Z * [new branch] gh/c00w/40/base -> origin/gh/c00w/40/base 2025-03-14T05:31:37.3518062Z * [new branch] gh/c00w/40/head -> origin/gh/c00w/40/head 2025-03-14T05:31:37.3519924Z * [new branch] gh/c00w/40/orig -> origin/gh/c00w/40/orig 2025-03-14T05:31:37.3521980Z * [new branch] gh/c00w/41/base -> origin/gh/c00w/41/base 2025-03-14T05:31:37.3523646Z * [new branch] gh/c00w/41/head -> origin/gh/c00w/41/head 2025-03-14T05:31:37.3525361Z * [new branch] gh/c00w/41/orig -> origin/gh/c00w/41/orig 2025-03-14T05:31:37.3527722Z * [new branch] gh/c00w/42/base -> origin/gh/c00w/42/base 2025-03-14T05:31:37.3529557Z * [new branch] gh/c00w/42/head -> origin/gh/c00w/42/head 2025-03-14T05:31:37.3531233Z * [new branch] gh/c00w/42/orig -> origin/gh/c00w/42/orig 2025-03-14T05:31:37.3533756Z * [new branch] gh/c00w/43/base -> origin/gh/c00w/43/base 2025-03-14T05:31:37.3535359Z * [new branch] gh/c00w/43/head -> origin/gh/c00w/43/head 2025-03-14T05:31:37.3537031Z * [new branch] gh/c00w/43/orig -> origin/gh/c00w/43/orig 2025-03-14T05:31:37.3539156Z * [new branch] gh/c00w/44/base -> origin/gh/c00w/44/base 2025-03-14T05:31:37.3540833Z * [new branch] gh/c00w/44/head -> origin/gh/c00w/44/head 2025-03-14T05:31:37.3542534Z * [new branch] gh/c00w/44/orig -> origin/gh/c00w/44/orig 2025-03-14T05:31:37.3544899Z * [new branch] gh/c00w/45/base -> origin/gh/c00w/45/base 2025-03-14T05:31:37.3546508Z * [new branch] gh/c00w/45/head -> origin/gh/c00w/45/head 2025-03-14T05:31:37.3548154Z * [new branch] gh/c00w/45/orig -> origin/gh/c00w/45/orig 2025-03-14T05:31:37.3551073Z * [new branch] gh/chenyang78/1/base -> origin/gh/chenyang78/1/base 2025-03-14T05:31:37.3552753Z * [new branch] gh/chenyang78/1/head -> origin/gh/chenyang78/1/head 2025-03-14T05:31:37.3554612Z * [new branch] gh/chenyang78/1/orig -> origin/gh/chenyang78/1/orig 2025-03-14T05:31:37.3556891Z * [new branch] gh/chenyang78/2/base -> origin/gh/chenyang78/2/base 2025-03-14T05:31:37.3558513Z * [new branch] gh/chenyang78/2/head -> origin/gh/chenyang78/2/head 2025-03-14T05:31:37.3560189Z * [new branch] gh/chenyang78/2/orig -> origin/gh/chenyang78/2/orig 2025-03-14T05:31:37.3563059Z * [new branch] gh/chillee/220/base -> origin/gh/chillee/220/base 2025-03-14T05:31:37.3564775Z * [new branch] gh/chillee/220/head -> origin/gh/chillee/220/head 2025-03-14T05:31:37.3566417Z * [new branch] gh/chillee/220/orig -> origin/gh/chillee/220/orig 2025-03-14T05:31:37.3569081Z * [new branch] gh/chillee/376/base -> origin/gh/chillee/376/base 2025-03-14T05:31:37.3570736Z * [new branch] gh/chillee/376/head -> origin/gh/chillee/376/head 2025-03-14T05:31:37.3572484Z * [new branch] gh/chillee/376/orig -> origin/gh/chillee/376/orig 2025-03-14T05:31:37.3574761Z * [new branch] gh/chillee/377/base -> origin/gh/chillee/377/base 2025-03-14T05:31:37.3576414Z * [new branch] gh/chillee/377/head -> origin/gh/chillee/377/head 2025-03-14T05:31:37.3578100Z * [new branch] gh/chillee/377/orig -> origin/gh/chillee/377/orig 2025-03-14T05:31:37.3580865Z * [new branch] gh/chunyuan-w/1/base -> origin/gh/chunyuan-w/1/base 2025-03-14T05:31:37.3582524Z * [new branch] 
gh/chunyuan-w/1/head -> origin/gh/chunyuan-w/1/head 2025-03-14T05:31:37.3584166Z * [new branch] gh/chunyuan-w/1/orig -> origin/gh/chunyuan-w/1/orig 2025-03-14T05:31:37.3586443Z * [new branch] gh/chunyuan-w/3/base -> origin/gh/chunyuan-w/3/base 2025-03-14T05:31:37.3588144Z * [new branch] gh/chunyuan-w/3/head -> origin/gh/chunyuan-w/3/head 2025-03-14T05:31:37.3589836Z * [new branch] gh/chunyuan-w/3/orig -> origin/gh/chunyuan-w/3/orig 2025-03-14T05:31:37.3592644Z * [new branch] gh/clee2000/1/base -> origin/gh/clee2000/1/base 2025-03-14T05:31:37.3594451Z * [new branch] gh/clee2000/1/head -> origin/gh/clee2000/1/head 2025-03-14T05:31:37.3596130Z * [new branch] gh/clee2000/1/orig -> origin/gh/clee2000/1/orig 2025-03-14T05:31:37.3598428Z * [new branch] gh/clee2000/2/base -> origin/gh/clee2000/2/base 2025-03-14T05:31:37.3600022Z * [new branch] gh/clee2000/2/head -> origin/gh/clee2000/2/head 2025-03-14T05:31:37.3601611Z * [new branch] gh/clee2000/2/orig -> origin/gh/clee2000/2/orig 2025-03-14T05:31:37.3604307Z * [new branch] gh/clee2000/3/base -> origin/gh/clee2000/3/base 2025-03-14T05:31:37.3605534Z * [new branch] gh/clee2000/3/head -> origin/gh/clee2000/3/head 2025-03-14T05:31:37.3607159Z * [new branch] gh/clee2000/3/orig -> origin/gh/clee2000/3/orig 2025-03-14T05:31:37.3610173Z * [new branch] gh/davidberard98/230/base -> origin/gh/davidberard98/230/base 2025-03-14T05:31:37.3611867Z * [new branch] gh/davidberard98/230/head -> origin/gh/davidberard98/230/head 2025-03-14T05:31:37.3613561Z * [new branch] gh/davidberard98/230/orig -> origin/gh/davidberard98/230/orig 2025-03-14T05:31:37.3616002Z * [new branch] gh/davidberard98/335/base -> origin/gh/davidberard98/335/base 2025-03-14T05:31:37.3617691Z * [new branch] gh/davidberard98/335/head -> origin/gh/davidberard98/335/head 2025-03-14T05:31:37.3619369Z * [new branch] gh/davidberard98/335/orig -> origin/gh/davidberard98/335/orig 2025-03-14T05:31:37.3621592Z * [new branch] gh/davidberard98/338/base -> origin/gh/davidberard98/338/base 2025-03-14T05:31:37.3623283Z * [new branch] gh/davidberard98/338/head -> origin/gh/davidberard98/338/head 2025-03-14T05:31:37.3624997Z * [new branch] gh/davidberard98/338/orig -> origin/gh/davidberard98/338/orig 2025-03-14T05:31:37.3627289Z * [new branch] gh/davidberard98/339/base -> origin/gh/davidberard98/339/base 2025-03-14T05:31:37.3628981Z * [new branch] gh/davidberard98/339/head -> origin/gh/davidberard98/339/head 2025-03-14T05:31:37.3630687Z * [new branch] gh/davidberard98/339/orig -> origin/gh/davidberard98/339/orig 2025-03-14T05:31:37.3633036Z * [new branch] gh/davidberard98/340/base -> origin/gh/davidberard98/340/base 2025-03-14T05:31:37.3634860Z * [new branch] gh/davidberard98/340/head -> origin/gh/davidberard98/340/head 2025-03-14T05:31:37.3636618Z * [new branch] gh/davidberard98/340/orig -> origin/gh/davidberard98/340/orig 2025-03-14T05:31:37.3638957Z * [new branch] gh/davidberard98/341/base -> origin/gh/davidberard98/341/base 2025-03-14T05:31:37.3641215Z * [new branch] gh/davidberard98/341/head -> origin/gh/davidberard98/341/head 2025-03-14T05:31:37.3642927Z * [new branch] gh/davidberard98/341/orig -> origin/gh/davidberard98/341/orig 2025-03-14T05:31:37.3645239Z * [new branch] gh/davidberard98/342/base -> origin/gh/davidberard98/342/base 2025-03-14T05:31:37.3646927Z * [new branch] gh/davidberard98/342/head -> origin/gh/davidberard98/342/head 2025-03-14T05:31:37.3649084Z * [new branch] gh/davidberard98/342/orig -> origin/gh/davidberard98/342/orig 2025-03-14T05:31:37.3651298Z * [new branch] 
gh/davidberard98/343/base -> origin/gh/davidberard98/343/base 2025-03-14T05:31:37.3652537Z * [new branch] gh/davidberard98/343/head -> origin/gh/davidberard98/343/head 2025-03-14T05:31:37.3654596Z * [new branch] gh/davidberard98/343/orig -> origin/gh/davidberard98/343/orig 2025-03-14T05:31:37.3656839Z * [new branch] gh/davidberard98/344/base -> origin/gh/davidberard98/344/base 2025-03-14T05:31:37.3658320Z * [new branch] gh/davidberard98/344/head -> origin/gh/davidberard98/344/head 2025-03-14T05:31:37.3659949Z * [new branch] gh/davidberard98/344/orig -> origin/gh/davidberard98/344/orig 2025-03-14T05:31:37.3662381Z * [new branch] gh/davidberard98/345/base -> origin/gh/davidberard98/345/base 2025-03-14T05:31:37.3664118Z * [new branch] gh/davidberard98/345/head -> origin/gh/davidberard98/345/head 2025-03-14T05:31:37.3665899Z * [new branch] gh/davidberard98/345/orig -> origin/gh/davidberard98/345/orig 2025-03-14T05:31:37.3668528Z * [new branch] gh/davidberard98/346/base -> origin/gh/davidberard98/346/base 2025-03-14T05:31:37.3672707Z * [new branch] gh/davidberard98/346/head -> origin/gh/davidberard98/346/head 2025-03-14T05:31:37.3674547Z * [new branch] gh/davidberard98/346/orig -> origin/gh/davidberard98/346/orig 2025-03-14T05:31:37.3677381Z * [new branch] gh/desertfire/531/base -> origin/gh/desertfire/531/base 2025-03-14T05:31:37.3678995Z * [new branch] gh/desertfire/531/head -> origin/gh/desertfire/531/head 2025-03-14T05:31:37.3680794Z * [new branch] gh/desertfire/531/orig -> origin/gh/desertfire/531/orig 2025-03-14T05:31:37.3683013Z * [new branch] gh/desertfire/535/base -> origin/gh/desertfire/535/base 2025-03-14T05:31:37.3684792Z * [new branch] gh/desertfire/535/head -> origin/gh/desertfire/535/head 2025-03-14T05:31:37.3686546Z * [new branch] gh/desertfire/535/orig -> origin/gh/desertfire/535/orig 2025-03-14T05:31:37.3688897Z * [new branch] gh/desertfire/539/base -> origin/gh/desertfire/539/base 2025-03-14T05:31:37.3690583Z * [new branch] gh/desertfire/539/head -> origin/gh/desertfire/539/head 2025-03-14T05:31:37.3692211Z * [new branch] gh/desertfire/539/orig -> origin/gh/desertfire/539/orig 2025-03-14T05:31:37.3695033Z * [new branch] gh/desertfire/540/base -> origin/gh/desertfire/540/base 2025-03-14T05:31:37.3696726Z * [new branch] gh/desertfire/540/head -> origin/gh/desertfire/540/head 2025-03-14T05:31:37.3698382Z * [new branch] gh/desertfire/540/orig -> origin/gh/desertfire/540/orig 2025-03-14T05:31:37.3700527Z * [new branch] gh/desertfire/541/base -> origin/gh/desertfire/541/base 2025-03-14T05:31:37.3702199Z * [new branch] gh/desertfire/541/head -> origin/gh/desertfire/541/head 2025-03-14T05:31:37.3703874Z * [new branch] gh/desertfire/541/orig -> origin/gh/desertfire/541/orig 2025-03-14T05:31:37.3706146Z * [new branch] gh/desertfire/542/base -> origin/gh/desertfire/542/base 2025-03-14T05:31:37.3707778Z * [new branch] gh/desertfire/542/head -> origin/gh/desertfire/542/head 2025-03-14T05:31:37.3709545Z * [new branch] gh/desertfire/542/orig -> origin/gh/desertfire/542/orig 2025-03-14T05:31:37.3711809Z * [new branch] gh/desertfire/543/base -> origin/gh/desertfire/543/base 2025-03-14T05:31:37.3713436Z * [new branch] gh/desertfire/543/head -> origin/gh/desertfire/543/head 2025-03-14T05:31:37.3715245Z * [new branch] gh/desertfire/543/orig -> origin/gh/desertfire/543/orig 2025-03-14T05:31:37.3717480Z * [new branch] gh/desertfire/544/base -> origin/gh/desertfire/544/base 2025-03-14T05:31:37.3719606Z * [new branch] gh/desertfire/544/head -> origin/gh/desertfire/544/head 
2025-03-14T05:31:37.3721337Z * [new branch] gh/desertfire/544/orig -> origin/gh/desertfire/544/orig 2025-03-14T05:31:37.3723538Z * [new branch] gh/desertfire/545/base -> origin/gh/desertfire/545/base 2025-03-14T05:31:37.3725192Z * [new branch] gh/desertfire/545/head -> origin/gh/desertfire/545/head 2025-03-14T05:31:37.3726839Z * [new branch] gh/desertfire/545/orig -> origin/gh/desertfire/545/orig 2025-03-14T05:31:37.3729032Z * [new branch] gh/desertfire/546/base -> origin/gh/desertfire/546/base 2025-03-14T05:31:37.3730649Z * [new branch] gh/desertfire/546/head -> origin/gh/desertfire/546/head 2025-03-14T05:31:37.3732420Z * [new branch] gh/desertfire/546/orig -> origin/gh/desertfire/546/orig 2025-03-14T05:31:37.3735170Z * [new branch] gh/desertfire/547/base -> origin/gh/desertfire/547/base 2025-03-14T05:31:37.3736818Z * [new branch] gh/desertfire/547/head -> origin/gh/desertfire/547/head 2025-03-14T05:31:37.3738575Z * [new branch] gh/desertfire/547/orig -> origin/gh/desertfire/547/orig 2025-03-14T05:31:37.3740869Z * [new branch] gh/desertfire/548/base -> origin/gh/desertfire/548/base 2025-03-14T05:31:37.3742560Z * [new branch] gh/desertfire/548/head -> origin/gh/desertfire/548/head 2025-03-14T05:31:37.3744239Z * [new branch] gh/desertfire/548/orig -> origin/gh/desertfire/548/orig 2025-03-14T05:31:37.3746498Z * [new branch] gh/desertfire/549/base -> origin/gh/desertfire/549/base 2025-03-14T05:31:37.3748214Z * [new branch] gh/desertfire/549/head -> origin/gh/desertfire/549/head 2025-03-14T05:31:37.3749931Z * [new branch] gh/desertfire/549/orig -> origin/gh/desertfire/549/orig 2025-03-14T05:31:37.3752550Z * [new branch] gh/desertfire/550/base -> origin/gh/desertfire/550/base 2025-03-14T05:31:37.3754051Z * [new branch] gh/desertfire/550/head -> origin/gh/desertfire/550/head 2025-03-14T05:31:37.3755899Z * [new branch] gh/desertfire/550/orig -> origin/gh/desertfire/550/orig 2025-03-14T05:31:37.3758311Z * [new branch] gh/desertfire/551/base -> origin/gh/desertfire/551/base 2025-03-14T05:31:37.3759763Z * [new branch] gh/desertfire/551/head -> origin/gh/desertfire/551/head 2025-03-14T05:31:37.3761460Z * [new branch] gh/desertfire/551/orig -> origin/gh/desertfire/551/orig 2025-03-14T05:31:37.3763807Z * [new branch] gh/desertfire/552/base -> origin/gh/desertfire/552/base 2025-03-14T05:31:37.3765451Z * [new branch] gh/desertfire/552/head -> origin/gh/desertfire/552/head 2025-03-14T05:31:37.3767092Z * [new branch] gh/desertfire/552/orig -> origin/gh/desertfire/552/orig 2025-03-14T05:31:37.3770260Z * [new branch] gh/desertfire/553/base -> origin/gh/desertfire/553/base 2025-03-14T05:31:37.3771911Z * [new branch] gh/desertfire/553/head -> origin/gh/desertfire/553/head 2025-03-14T05:31:37.3773548Z * [new branch] gh/desertfire/553/orig -> origin/gh/desertfire/553/orig 2025-03-14T05:31:37.3775966Z * [new branch] gh/desertfire/554/base -> origin/gh/desertfire/554/base 2025-03-14T05:31:37.3777590Z * [new branch] gh/desertfire/554/head -> origin/gh/desertfire/554/head 2025-03-14T05:31:37.3779340Z * [new branch] gh/desertfire/554/orig -> origin/gh/desertfire/554/orig 2025-03-14T05:31:37.3782142Z * [new branch] gh/drisspg/100/base -> origin/gh/drisspg/100/base 2025-03-14T05:31:37.3784173Z * [new branch] gh/drisspg/100/head -> origin/gh/drisspg/100/head 2025-03-14T05:31:37.3785834Z * [new branch] gh/drisspg/100/orig -> origin/gh/drisspg/100/orig 2025-03-14T05:31:37.3788065Z * [new branch] gh/drisspg/103/base -> origin/gh/drisspg/103/base 2025-03-14T05:31:37.3789696Z * [new branch] gh/drisspg/103/head -> 
origin/gh/drisspg/103/head 2025-03-14T05:31:37.3791374Z * [new branch] gh/drisspg/103/orig -> origin/gh/drisspg/103/orig 2025-03-14T05:31:37.3793669Z * [new branch] gh/drisspg/104/base -> origin/gh/drisspg/104/base 2025-03-14T05:31:37.3795454Z * [new branch] gh/drisspg/104/head -> origin/gh/drisspg/104/head 2025-03-14T05:31:37.3797104Z * [new branch] gh/drisspg/104/orig -> origin/gh/drisspg/104/orig 2025-03-14T05:31:37.3799283Z * [new branch] gh/drisspg/111/base -> origin/gh/drisspg/111/base 2025-03-14T05:31:37.3801026Z * [new branch] gh/drisspg/111/head -> origin/gh/drisspg/111/head 2025-03-14T05:31:37.3802979Z * [new branch] gh/drisspg/111/orig -> origin/gh/drisspg/111/orig 2025-03-14T05:31:37.3805469Z * [new branch] gh/drisspg/115/base -> origin/gh/drisspg/115/base 2025-03-14T05:31:37.3806873Z * [new branch] gh/drisspg/115/head -> origin/gh/drisspg/115/head 2025-03-14T05:31:37.3808766Z * [new branch] gh/drisspg/115/orig -> origin/gh/drisspg/115/orig 2025-03-14T05:31:37.3810698Z * [new branch] gh/drisspg/119/base -> origin/gh/drisspg/119/base 2025-03-14T05:31:37.3812239Z * [new branch] gh/drisspg/119/head -> origin/gh/drisspg/119/head 2025-03-14T05:31:37.3813879Z * [new branch] gh/drisspg/119/orig -> origin/gh/drisspg/119/orig 2025-03-14T05:31:37.3816115Z * [new branch] gh/drisspg/123/base -> origin/gh/drisspg/123/base 2025-03-14T05:31:37.3817729Z * [new branch] gh/drisspg/123/head -> origin/gh/drisspg/123/head 2025-03-14T05:31:37.3819441Z * [new branch] gh/drisspg/123/orig -> origin/gh/drisspg/123/orig 2025-03-14T05:31:37.3821661Z * [new branch] gh/drisspg/125/base -> origin/gh/drisspg/125/base 2025-03-14T05:31:37.3823419Z * [new branch] gh/drisspg/125/head -> origin/gh/drisspg/125/head 2025-03-14T05:31:37.3825061Z * [new branch] gh/drisspg/125/orig -> origin/gh/drisspg/125/orig 2025-03-14T05:31:37.3827307Z * [new branch] gh/drisspg/126/base -> origin/gh/drisspg/126/base 2025-03-14T05:31:37.3828916Z * [new branch] gh/drisspg/126/head -> origin/gh/drisspg/126/head 2025-03-14T05:31:37.3830603Z * [new branch] gh/drisspg/126/orig -> origin/gh/drisspg/126/orig 2025-03-14T05:31:37.3832847Z * [new branch] gh/drisspg/127/base -> origin/gh/drisspg/127/base 2025-03-14T05:31:37.3834577Z * [new branch] gh/drisspg/127/head -> origin/gh/drisspg/127/head 2025-03-14T05:31:37.3836230Z * [new branch] gh/drisspg/127/orig -> origin/gh/drisspg/127/orig 2025-03-14T05:31:37.3838453Z * [new branch] gh/drisspg/128/base -> origin/gh/drisspg/128/base 2025-03-14T05:31:37.3840497Z * [new branch] gh/drisspg/128/head -> origin/gh/drisspg/128/head 2025-03-14T05:31:37.3841819Z * [new branch] gh/drisspg/128/orig -> origin/gh/drisspg/128/orig 2025-03-14T05:31:37.3844170Z * [new branch] gh/drisspg/129/base -> origin/gh/drisspg/129/base 2025-03-14T05:31:37.3845918Z * [new branch] gh/drisspg/129/head -> origin/gh/drisspg/129/head 2025-03-14T05:31:37.3847564Z * [new branch] gh/drisspg/129/orig -> origin/gh/drisspg/129/orig 2025-03-14T05:31:37.3849831Z * [new branch] gh/drisspg/130/base -> origin/gh/drisspg/130/base 2025-03-14T05:31:37.3851470Z * [new branch] gh/drisspg/130/head -> origin/gh/drisspg/130/head 2025-03-14T05:31:37.3853060Z * [new branch] gh/drisspg/130/orig -> origin/gh/drisspg/130/orig 2025-03-14T05:31:37.3855373Z * [new branch] gh/drisspg/131/base -> origin/gh/drisspg/131/base 2025-03-14T05:31:37.3857354Z * [new branch] gh/drisspg/131/head -> origin/gh/drisspg/131/head 2025-03-14T05:31:37.3858705Z * [new branch] gh/drisspg/131/orig -> origin/gh/drisspg/131/orig 2025-03-14T05:31:37.3861531Z * [new branch] 
gh/drisspg/132/base -> origin/gh/drisspg/132/base 2025-03-14T05:31:37.3863167Z * [new branch] gh/drisspg/132/head -> origin/gh/drisspg/132/head 2025-03-14T05:31:37.3864935Z * [new branch] gh/drisspg/132/orig -> origin/gh/drisspg/132/orig 2025-03-14T05:31:37.3867113Z * [new branch] gh/drisspg/133/base -> origin/gh/drisspg/133/base 2025-03-14T05:31:37.3869105Z * [new branch] gh/drisspg/133/head -> origin/gh/drisspg/133/head 2025-03-14T05:31:37.3873397Z * [new branch] gh/drisspg/133/orig -> origin/gh/drisspg/133/orig 2025-03-14T05:31:37.3876524Z * [new branch] gh/drisspg/134/base -> origin/gh/drisspg/134/base 2025-03-14T05:31:37.3878052Z * [new branch] gh/drisspg/134/head -> origin/gh/drisspg/134/head 2025-03-14T05:31:37.3879674Z * [new branch] gh/drisspg/134/orig -> origin/gh/drisspg/134/orig 2025-03-14T05:31:37.3882027Z * [new branch] gh/drisspg/135/base -> origin/gh/drisspg/135/base 2025-03-14T05:31:37.3883593Z * [new branch] gh/drisspg/135/head -> origin/gh/drisspg/135/head 2025-03-14T05:31:37.3885288Z * [new branch] gh/drisspg/135/orig -> origin/gh/drisspg/135/orig 2025-03-14T05:31:37.3887594Z * [new branch] gh/drisspg/136/base -> origin/gh/drisspg/136/base 2025-03-14T05:31:37.3889224Z * [new branch] gh/drisspg/136/head -> origin/gh/drisspg/136/head 2025-03-14T05:31:37.3890874Z * [new branch] gh/drisspg/136/orig -> origin/gh/drisspg/136/orig 2025-03-14T05:31:37.3893302Z * [new branch] gh/drisspg/66/base -> origin/gh/drisspg/66/base 2025-03-14T05:31:37.3895101Z * [new branch] gh/drisspg/66/head -> origin/gh/drisspg/66/head 2025-03-14T05:31:37.3896722Z * [new branch] gh/drisspg/66/orig -> origin/gh/drisspg/66/orig 2025-03-14T05:31:37.3899065Z * [new branch] gh/drisspg/98/base -> origin/gh/drisspg/98/base 2025-03-14T05:31:37.3901001Z * [new branch] gh/drisspg/98/head -> origin/gh/drisspg/98/head 2025-03-14T05:31:37.3902439Z * [new branch] gh/drisspg/98/orig -> origin/gh/drisspg/98/orig 2025-03-14T05:31:37.3905211Z * [new branch] gh/eellison/554/base -> origin/gh/eellison/554/base 2025-03-14T05:31:37.3907006Z * [new branch] gh/eellison/554/head -> origin/gh/eellison/554/head 2025-03-14T05:31:37.3908625Z * [new branch] gh/eellison/554/orig -> origin/gh/eellison/554/orig 2025-03-14T05:31:37.3910945Z * [new branch] gh/eellison/555/base -> origin/gh/eellison/555/base 2025-03-14T05:31:37.3912607Z * [new branch] gh/eellison/555/head -> origin/gh/eellison/555/head 2025-03-14T05:31:37.3914261Z * [new branch] gh/eellison/555/orig -> origin/gh/eellison/555/orig 2025-03-14T05:31:37.3916646Z * [new branch] gh/eellison/691/base -> origin/gh/eellison/691/base 2025-03-14T05:31:37.3918285Z * [new branch] gh/eellison/691/head -> origin/gh/eellison/691/head 2025-03-14T05:31:37.3920033Z * [new branch] gh/eellison/691/orig -> origin/gh/eellison/691/orig 2025-03-14T05:31:37.3922276Z * [new branch] gh/eellison/709/base -> origin/gh/eellison/709/base 2025-03-14T05:31:37.3923902Z * [new branch] gh/eellison/709/head -> origin/gh/eellison/709/head 2025-03-14T05:31:37.3925567Z * [new branch] gh/eellison/709/orig -> origin/gh/eellison/709/orig 2025-03-14T05:31:37.3927940Z * [new branch] gh/eellison/735/base -> origin/gh/eellison/735/base 2025-03-14T05:31:37.3929556Z * [new branch] gh/eellison/735/head -> origin/gh/eellison/735/head 2025-03-14T05:31:37.3931174Z * [new branch] gh/eellison/735/orig -> origin/gh/eellison/735/orig 2025-03-14T05:31:37.3933388Z * [new branch] gh/eellison/747/base -> origin/gh/eellison/747/base 2025-03-14T05:31:37.3935140Z * [new branch] gh/eellison/747/head -> origin/gh/eellison/747/head 
2025-03-14T05:31:37.3936765Z * [new branch] gh/eellison/747/orig -> origin/gh/eellison/747/orig 2025-03-14T05:31:37.3939545Z * [new branch] gh/eellison/759/base -> origin/gh/eellison/759/base 2025-03-14T05:31:37.3942015Z * [new branch] gh/eellison/759/head -> origin/gh/eellison/759/head 2025-03-14T05:31:37.3943871Z * [new branch] gh/eellison/759/orig -> origin/gh/eellison/759/orig 2025-03-14T05:31:37.3946044Z * [new branch] gh/eellison/760/base -> origin/gh/eellison/760/base 2025-03-14T05:31:37.3947767Z * [new branch] gh/eellison/760/head -> origin/gh/eellison/760/head 2025-03-14T05:31:37.3949428Z * [new branch] gh/eellison/760/orig -> origin/gh/eellison/760/orig 2025-03-14T05:31:37.3951707Z * [new branch] gh/eellison/761/base -> origin/gh/eellison/761/base 2025-03-14T05:31:37.3953364Z * [new branch] gh/eellison/761/head -> origin/gh/eellison/761/head 2025-03-14T05:31:37.3955162Z * [new branch] gh/eellison/761/orig -> origin/gh/eellison/761/orig 2025-03-14T05:31:37.3957372Z * [new branch] gh/eellison/762/base -> origin/gh/eellison/762/base 2025-03-14T05:31:37.3959035Z * [new branch] gh/eellison/762/head -> origin/gh/eellison/762/head 2025-03-14T05:31:37.3960666Z * [new branch] gh/eellison/762/orig -> origin/gh/eellison/762/orig 2025-03-14T05:31:37.3963070Z * [new branch] gh/eellison/763/base -> origin/gh/eellison/763/base 2025-03-14T05:31:37.3964900Z * [new branch] gh/eellison/763/head -> origin/gh/eellison/763/head 2025-03-14T05:31:37.3967203Z * [new branch] gh/eellison/763/orig -> origin/gh/eellison/763/orig 2025-03-14T05:31:37.3969921Z * [new branch] gh/eellison/764/base -> origin/gh/eellison/764/base 2025-03-14T05:31:37.3971566Z * [new branch] gh/eellison/764/head -> origin/gh/eellison/764/head 2025-03-14T05:31:37.3973212Z * [new branch] gh/eellison/764/orig -> origin/gh/eellison/764/orig 2025-03-14T05:31:37.3975540Z * [new branch] gh/eellison/765/base -> origin/gh/eellison/765/base 2025-03-14T05:31:37.3977228Z * [new branch] gh/eellison/765/head -> origin/gh/eellison/765/head 2025-03-14T05:31:37.3978923Z * [new branch] gh/eellison/765/orig -> origin/gh/eellison/765/orig 2025-03-14T05:31:37.3981211Z * [new branch] gh/eellison/766/base -> origin/gh/eellison/766/base 2025-03-14T05:31:37.3982856Z * [new branch] gh/eellison/766/head -> origin/gh/eellison/766/head 2025-03-14T05:31:37.3984606Z * [new branch] gh/eellison/766/orig -> origin/gh/eellison/766/orig 2025-03-14T05:31:37.3986904Z * [new branch] gh/eellison/767/base -> origin/gh/eellison/767/base 2025-03-14T05:31:37.3988533Z * [new branch] gh/eellison/767/head -> origin/gh/eellison/767/head 2025-03-14T05:31:37.3990198Z * [new branch] gh/eellison/767/orig -> origin/gh/eellison/767/orig 2025-03-14T05:31:37.3992474Z * [new branch] gh/eellison/768/base -> origin/gh/eellison/768/base 2025-03-14T05:31:37.3994103Z * [new branch] gh/eellison/768/head -> origin/gh/eellison/768/head 2025-03-14T05:31:37.3995958Z * [new branch] gh/eellison/768/orig -> origin/gh/eellison/768/orig 2025-03-14T05:31:37.3999304Z * [new branch] gh/eellison/769/base -> origin/gh/eellison/769/base 2025-03-14T05:31:37.4000907Z * [new branch] gh/eellison/769/head -> origin/gh/eellison/769/head 2025-03-14T05:31:37.4002599Z * [new branch] gh/eellison/769/orig -> origin/gh/eellison/769/orig 2025-03-14T05:31:37.4004821Z * [new branch] gh/eellison/770/base -> origin/gh/eellison/770/base 2025-03-14T05:31:37.4006483Z * [new branch] gh/eellison/770/head -> origin/gh/eellison/770/head 2025-03-14T05:31:37.4008141Z * [new branch] gh/eellison/770/orig -> origin/gh/eellison/770/orig 
2025-03-14T05:31:37.4010695Z * [new branch] gh/eellison/771/base -> origin/gh/eellison/771/base 2025-03-14T05:31:37.4012361Z * [new branch] gh/eellison/771/head -> origin/gh/eellison/771/head 2025-03-14T05:31:37.4013840Z * [new branch] gh/eellison/771/orig -> origin/gh/eellison/771/orig 2025-03-14T05:31:37.4016702Z * [new branch] gh/etaf/100/base -> origin/gh/etaf/100/base 2025-03-14T05:31:37.4018365Z * [new branch] gh/etaf/100/head -> origin/gh/etaf/100/head 2025-03-14T05:31:37.4020017Z * [new branch] gh/etaf/100/orig -> origin/gh/etaf/100/orig 2025-03-14T05:31:37.4022509Z * [new branch] gh/etaf/101/base -> origin/gh/etaf/101/base 2025-03-14T05:31:37.4024206Z * [new branch] gh/etaf/101/head -> origin/gh/etaf/101/head 2025-03-14T05:31:37.4025959Z * [new branch] gh/etaf/101/orig -> origin/gh/etaf/101/orig 2025-03-14T05:31:37.4028455Z * [new branch] gh/etaf/102/base -> origin/gh/etaf/102/base 2025-03-14T05:31:37.4030097Z * [new branch] gh/etaf/102/head -> origin/gh/etaf/102/head 2025-03-14T05:31:37.4031759Z * [new branch] gh/etaf/102/orig -> origin/gh/etaf/102/orig 2025-03-14T05:31:37.4034030Z * [new branch] gh/etaf/103/base -> origin/gh/etaf/103/base 2025-03-14T05:31:37.4035837Z * [new branch] gh/etaf/103/head -> origin/gh/etaf/103/head 2025-03-14T05:31:37.4037483Z * [new branch] gh/etaf/103/orig -> origin/gh/etaf/103/orig 2025-03-14T05:31:37.4039722Z * [new branch] gh/etaf/104/base -> origin/gh/etaf/104/base 2025-03-14T05:31:37.4041307Z * [new branch] gh/etaf/104/head -> origin/gh/etaf/104/head 2025-03-14T05:31:37.4042988Z * [new branch] gh/etaf/104/orig -> origin/gh/etaf/104/orig 2025-03-14T05:31:37.4045381Z * [new branch] gh/etaf/105/base -> origin/gh/etaf/105/base 2025-03-14T05:31:37.4047576Z * [new branch] gh/etaf/105/head -> origin/gh/etaf/105/head 2025-03-14T05:31:37.4049232Z * [new branch] gh/etaf/105/orig -> origin/gh/etaf/105/orig 2025-03-14T05:31:37.4052039Z * [new branch] gh/etaf/106/base -> origin/gh/etaf/106/base 2025-03-14T05:31:37.4053703Z * [new branch] gh/etaf/106/head -> origin/gh/etaf/106/head 2025-03-14T05:31:37.4055441Z * [new branch] gh/etaf/106/orig -> origin/gh/etaf/106/orig 2025-03-14T05:31:37.4057812Z * [new branch] gh/etaf/107/base -> origin/gh/etaf/107/base 2025-03-14T05:31:37.4059460Z * [new branch] gh/etaf/107/head -> origin/gh/etaf/107/head 2025-03-14T05:31:37.4061116Z * [new branch] gh/etaf/107/orig -> origin/gh/etaf/107/orig 2025-03-14T05:31:37.4069674Z * [new branch] gh/etaf/108/base -> origin/gh/etaf/108/base 2025-03-14T05:31:37.4070254Z * [new branch] gh/etaf/108/head -> origin/gh/etaf/108/head 2025-03-14T05:31:37.4070488Z * [new branch] gh/etaf/108/orig -> origin/gh/etaf/108/orig 2025-03-14T05:31:37.4070689Z * [new branch] gh/etaf/109/base -> origin/gh/etaf/109/base 2025-03-14T05:31:37.4070919Z * [new branch] gh/etaf/109/head -> origin/gh/etaf/109/head 2025-03-14T05:31:37.4072897Z * [new branch] gh/etaf/109/orig -> origin/gh/etaf/109/orig 2025-03-14T05:31:37.4075376Z * [new branch] gh/etaf/110/base -> origin/gh/etaf/110/base 2025-03-14T05:31:37.4077079Z * [new branch] gh/etaf/110/head -> origin/gh/etaf/110/head 2025-03-14T05:31:37.4078709Z * [new branch] gh/etaf/110/orig -> origin/gh/etaf/110/orig 2025-03-14T05:31:37.4081080Z * [new branch] gh/etaf/64/base -> origin/gh/etaf/64/base 2025-03-14T05:31:37.4083246Z * [new branch] gh/etaf/64/head -> origin/gh/etaf/64/head 2025-03-14T05:31:37.4084773Z * [new branch] gh/etaf/64/orig -> origin/gh/etaf/64/orig 2025-03-14T05:31:37.4087067Z * [new branch] gh/etaf/68/base -> origin/gh/etaf/68/base 
2025-03-14T05:31:37.4088671Z * [new branch] gh/etaf/68/head -> origin/gh/etaf/68/head 2025-03-14T05:31:37.4090793Z * [new branch] gh/etaf/68/orig -> origin/gh/etaf/68/orig 2025-03-14T05:31:37.4092494Z * [new branch] gh/etaf/69/base -> origin/gh/etaf/69/base 2025-03-14T05:31:37.4094140Z * [new branch] gh/etaf/69/head -> origin/gh/etaf/69/head 2025-03-14T05:31:37.4095830Z * [new branch] gh/etaf/69/orig -> origin/gh/etaf/69/orig 2025-03-14T05:31:37.4098229Z * [new branch] gh/etaf/84/base -> origin/gh/etaf/84/base 2025-03-14T05:31:37.4099891Z * [new branch] gh/etaf/84/head -> origin/gh/etaf/84/head 2025-03-14T05:31:37.4101579Z * [new branch] gh/etaf/84/orig -> origin/gh/etaf/84/orig 2025-03-14T05:31:37.4104005Z * [new branch] gh/etaf/95/base -> origin/gh/etaf/95/base 2025-03-14T05:31:37.4105719Z * [new branch] gh/etaf/95/head -> origin/gh/etaf/95/head 2025-03-14T05:31:37.4107410Z * [new branch] gh/etaf/95/orig -> origin/gh/etaf/95/orig 2025-03-14T05:31:37.4109770Z * [new branch] gh/etaf/96/base -> origin/gh/etaf/96/base 2025-03-14T05:31:37.4111391Z * [new branch] gh/etaf/96/head -> origin/gh/etaf/96/head 2025-03-14T05:31:37.4113076Z * [new branch] gh/etaf/96/orig -> origin/gh/etaf/96/orig 2025-03-14T05:31:37.4116543Z * [new branch] gh/etaf/97/base -> origin/gh/etaf/97/base 2025-03-14T05:31:37.4117821Z * [new branch] gh/etaf/97/head -> origin/gh/etaf/97/head 2025-03-14T05:31:37.4119461Z * [new branch] gh/etaf/97/orig -> origin/gh/etaf/97/orig 2025-03-14T05:31:37.4121790Z * [new branch] gh/etaf/98/base -> origin/gh/etaf/98/base 2025-03-14T05:31:37.4123500Z * [new branch] gh/etaf/98/head -> origin/gh/etaf/98/head 2025-03-14T05:31:37.4125184Z * [new branch] gh/etaf/98/orig -> origin/gh/etaf/98/orig 2025-03-14T05:31:37.4127453Z * [new branch] gh/etaf/99/base -> origin/gh/etaf/99/base 2025-03-14T05:31:37.4129119Z * [new branch] gh/etaf/99/head -> origin/gh/etaf/99/head 2025-03-14T05:31:37.4130789Z * [new branch] gh/etaf/99/orig -> origin/gh/etaf/99/orig 2025-03-14T05:31:37.4133665Z * [new branch] gh/ezyang/2374/base -> origin/gh/ezyang/2374/base 2025-03-14T05:31:37.4135408Z * [new branch] gh/ezyang/2374/head -> origin/gh/ezyang/2374/head 2025-03-14T05:31:37.4137087Z * [new branch] gh/ezyang/2374/orig -> origin/gh/ezyang/2374/orig 2025-03-14T05:31:37.4139314Z * [new branch] gh/ezyang/2449/orig -> origin/gh/ezyang/2449/orig 2025-03-14T05:31:37.4141596Z * [new branch] gh/ezyang/2479/next -> origin/gh/ezyang/2479/next 2025-03-14T05:31:37.4143806Z * [new branch] gh/ezyang/2480/next -> origin/gh/ezyang/2480/next 2025-03-14T05:31:37.4146191Z * [new branch] gh/ezyang/2973/base -> origin/gh/ezyang/2973/base 2025-03-14T05:31:37.4147880Z * [new branch] gh/ezyang/2973/head -> origin/gh/ezyang/2973/head 2025-03-14T05:31:37.4149565Z * [new branch] gh/ezyang/2973/orig -> origin/gh/ezyang/2973/orig 2025-03-14T05:31:37.4151758Z * [new branch] gh/ezyang/2974/base -> origin/gh/ezyang/2974/base 2025-03-14T05:31:37.4153585Z * [new branch] gh/ezyang/2974/head -> origin/gh/ezyang/2974/head 2025-03-14T05:31:37.4155309Z * [new branch] gh/ezyang/2974/orig -> origin/gh/ezyang/2974/orig 2025-03-14T05:31:37.4157542Z * [new branch] gh/ezyang/2997/base -> origin/gh/ezyang/2997/base 2025-03-14T05:31:37.4159131Z * [new branch] gh/ezyang/2997/head -> origin/gh/ezyang/2997/head 2025-03-14T05:31:37.4160950Z * [new branch] gh/ezyang/2997/orig -> origin/gh/ezyang/2997/orig 2025-03-14T05:31:37.4163062Z * [new branch] gh/ezyang/3031/base -> origin/gh/ezyang/3031/base 2025-03-14T05:31:37.4164753Z * [new branch] gh/ezyang/3031/head -> 
origin/gh/ezyang/3031/head 2025-03-14T05:31:37.4166418Z * [new branch] gh/ezyang/3031/orig -> origin/gh/ezyang/3031/orig 2025-03-14T05:31:37.4168994Z * [new branch] gh/ezyang/3068/base -> origin/gh/ezyang/3068/base 2025-03-14T05:31:37.4170893Z * [new branch] gh/ezyang/3068/head -> origin/gh/ezyang/3068/head 2025-03-14T05:31:37.4172521Z * [new branch] gh/ezyang/3068/orig -> origin/gh/ezyang/3068/orig 2025-03-14T05:31:37.4175651Z * [new branch] gh/fadara01/1/base -> origin/gh/fadara01/1/base 2025-03-14T05:31:37.4177071Z * [new branch] gh/fadara01/1/head -> origin/gh/fadara01/1/head 2025-03-14T05:31:37.4178791Z * [new branch] gh/fadara01/1/orig -> origin/gh/fadara01/1/orig 2025-03-14T05:31:37.4180971Z * [new branch] gh/fadara01/2/base -> origin/gh/fadara01/2/base 2025-03-14T05:31:37.4183140Z * [new branch] gh/fadara01/2/head -> origin/gh/fadara01/2/head 2025-03-14T05:31:37.4184861Z * [new branch] gh/fadara01/2/orig -> origin/gh/fadara01/2/orig 2025-03-14T05:31:37.4187110Z * [new branch] gh/fadara01/3/base -> origin/gh/fadara01/3/base 2025-03-14T05:31:37.4188818Z * [new branch] gh/fadara01/3/head -> origin/gh/fadara01/3/head 2025-03-14T05:31:37.4190624Z * [new branch] gh/fadara01/3/orig -> origin/gh/fadara01/3/orig 2025-03-14T05:31:37.4192902Z * [new branch] gh/fadara01/4/base -> origin/gh/fadara01/4/base 2025-03-14T05:31:37.4194625Z * [new branch] gh/fadara01/4/head -> origin/gh/fadara01/4/head 2025-03-14T05:31:37.4196430Z * [new branch] gh/fadara01/4/orig -> origin/gh/fadara01/4/orig 2025-03-14T05:31:37.4198704Z * [new branch] gh/fadara01/5/base -> origin/gh/fadara01/5/base 2025-03-14T05:31:37.4200331Z * [new branch] gh/fadara01/5/head -> origin/gh/fadara01/5/head 2025-03-14T05:31:37.4202117Z * [new branch] gh/fadara01/5/orig -> origin/gh/fadara01/5/orig 2025-03-14T05:31:37.4204414Z * [new branch] gh/fadara01/6/base -> origin/gh/fadara01/6/base 2025-03-14T05:31:37.4206015Z * [new branch] gh/fadara01/6/head -> origin/gh/fadara01/6/head 2025-03-14T05:31:37.4207767Z * [new branch] gh/fadara01/6/orig -> origin/gh/fadara01/6/orig 2025-03-14T05:31:37.4210098Z * [new branch] gh/fadara01/7/base -> origin/gh/fadara01/7/base 2025-03-14T05:31:37.4211729Z * [new branch] gh/fadara01/7/head -> origin/gh/fadara01/7/head 2025-03-14T05:31:37.4213442Z * [new branch] gh/fadara01/7/orig -> origin/gh/fadara01/7/orig 2025-03-14T05:31:37.4217130Z * [new branch] gh/fduwjj/111/base -> origin/gh/fduwjj/111/base 2025-03-14T05:31:37.4218495Z * [new branch] gh/fduwjj/111/head -> origin/gh/fduwjj/111/head 2025-03-14T05:31:37.4220121Z * [new branch] gh/fduwjj/111/orig -> origin/gh/fduwjj/111/orig 2025-03-14T05:31:37.4222427Z * [new branch] gh/fduwjj/112/base -> origin/gh/fduwjj/112/base 2025-03-14T05:31:37.4224212Z * [new branch] gh/fduwjj/112/head -> origin/gh/fduwjj/112/head 2025-03-14T05:31:37.4225742Z * [new branch] gh/fduwjj/112/orig -> origin/gh/fduwjj/112/orig 2025-03-14T05:31:37.4227967Z * [new branch] gh/fduwjj/113/base -> origin/gh/fduwjj/113/base 2025-03-14T05:31:37.4229669Z * [new branch] gh/fduwjj/113/head -> origin/gh/fduwjj/113/head 2025-03-14T05:31:37.4231385Z * [new branch] gh/fduwjj/113/orig -> origin/gh/fduwjj/113/orig 2025-03-14T05:31:37.4234292Z * [new branch] gh/fegin/148/base -> origin/gh/fegin/148/base 2025-03-14T05:31:37.4236106Z * [new branch] gh/fegin/148/head -> origin/gh/fegin/148/head 2025-03-14T05:31:37.4237828Z * [new branch] gh/fegin/148/orig -> origin/gh/fegin/148/orig 2025-03-14T05:31:37.4240140Z * [new branch] gh/fegin/159/base -> origin/gh/fegin/159/base 2025-03-14T05:31:37.4242366Z * 
[new branch] gh/fegin/159/head -> origin/gh/fegin/159/head 2025-03-14T05:31:37.4243707Z * [new branch] gh/fegin/159/orig -> origin/gh/fegin/159/orig 2025-03-14T05:31:37.4246117Z * [new branch] gh/fegin/160/base -> origin/gh/fegin/160/base 2025-03-14T05:31:37.4247712Z * [new branch] gh/fegin/160/head -> origin/gh/fegin/160/head 2025-03-14T05:31:37.4249304Z * [new branch] gh/fegin/160/orig -> origin/gh/fegin/160/orig 2025-03-14T05:31:37.4251594Z * [new branch] gh/fegin/169/base -> origin/gh/fegin/169/base 2025-03-14T05:31:37.4253339Z * [new branch] gh/fegin/169/head -> origin/gh/fegin/169/head 2025-03-14T05:31:37.4255045Z * [new branch] gh/fegin/169/orig -> origin/gh/fegin/169/orig 2025-03-14T05:31:37.4257402Z * [new branch] gh/fegin/171/base -> origin/gh/fegin/171/base 2025-03-14T05:31:37.4259218Z * [new branch] gh/fegin/171/head -> origin/gh/fegin/171/head 2025-03-14T05:31:37.4260939Z * [new branch] gh/fegin/171/orig -> origin/gh/fegin/171/orig 2025-03-14T05:31:37.4263704Z * [new branch] gh/fegin/172/base -> origin/gh/fegin/172/base 2025-03-14T05:31:37.4265113Z * [new branch] gh/fegin/172/head -> origin/gh/fegin/172/head 2025-03-14T05:31:37.4266795Z * [new branch] gh/fegin/172/orig -> origin/gh/fegin/172/orig 2025-03-14T05:31:37.4272074Z * [new branch] gh/fegin/294/base -> origin/gh/fegin/294/base 2025-03-14T05:31:37.4273777Z * [new branch] gh/fegin/294/head -> origin/gh/fegin/294/head 2025-03-14T05:31:37.4275579Z * [new branch] gh/fegin/294/orig -> origin/gh/fegin/294/orig 2025-03-14T05:31:37.4277815Z * [new branch] gh/fegin/295/base -> origin/gh/fegin/295/base 2025-03-14T05:31:37.4279472Z * [new branch] gh/fegin/295/head -> origin/gh/fegin/295/head 2025-03-14T05:31:37.4281126Z * [new branch] gh/fegin/295/orig -> origin/gh/fegin/295/orig 2025-03-14T05:31:37.4283427Z * [new branch] gh/fegin/296/base -> origin/gh/fegin/296/base 2025-03-14T05:31:37.4285134Z * [new branch] gh/fegin/296/head -> origin/gh/fegin/296/head 2025-03-14T05:31:37.4286792Z * [new branch] gh/fegin/296/orig -> origin/gh/fegin/296/orig 2025-03-14T05:31:37.4288956Z * [new branch] gh/fegin/297/base -> origin/gh/fegin/297/base 2025-03-14T05:31:37.4290731Z * [new branch] gh/fegin/297/head -> origin/gh/fegin/297/head 2025-03-14T05:31:37.4292814Z * [new branch] gh/fegin/297/orig -> origin/gh/fegin/297/orig 2025-03-14T05:31:37.4294620Z * [new branch] gh/fegin/298/base -> origin/gh/fegin/298/base 2025-03-14T05:31:37.4296459Z * [new branch] gh/fegin/298/head -> origin/gh/fegin/298/head 2025-03-14T05:31:37.4297985Z * [new branch] gh/fegin/298/orig -> origin/gh/fegin/298/orig 2025-03-14T05:31:37.4300063Z * [new branch] gh/fegin/299/base -> origin/gh/fegin/299/base 2025-03-14T05:31:37.4302319Z * [new branch] gh/fegin/299/head -> origin/gh/fegin/299/head 2025-03-14T05:31:37.4303395Z * [new branch] gh/fegin/299/orig -> origin/gh/fegin/299/orig 2025-03-14T05:31:37.4306481Z * [new branch] gh/fffrog/28/base -> origin/gh/fffrog/28/base 2025-03-14T05:31:37.4308149Z * [new branch] gh/fffrog/28/head -> origin/gh/fffrog/28/head 2025-03-14T05:31:37.4309793Z * [new branch] gh/fffrog/28/orig -> origin/gh/fffrog/28/orig 2025-03-14T05:31:37.4312067Z * [new branch] gh/fffrog/37/base -> origin/gh/fffrog/37/base 2025-03-14T05:31:37.4313568Z * [new branch] gh/fffrog/37/head -> origin/gh/fffrog/37/head 2025-03-14T05:31:37.4315530Z * [new branch] gh/fffrog/37/orig -> origin/gh/fffrog/37/orig 2025-03-14T05:31:37.4317776Z * [new branch] gh/fffrog/38/base -> origin/gh/fffrog/38/base 2025-03-14T05:31:37.4319376Z * [new branch] gh/fffrog/38/head -> 
origin/gh/fffrog/38/head 2025-03-14T05:31:37.4321083Z * [new branch] gh/fffrog/38/orig -> origin/gh/fffrog/38/orig 2025-03-14T05:31:37.4323314Z * [new branch] gh/fffrog/39/base -> origin/gh/fffrog/39/base 2025-03-14T05:31:37.4324931Z * [new branch] gh/fffrog/39/head -> origin/gh/fffrog/39/head 2025-03-14T05:31:37.4326589Z * [new branch] gh/fffrog/39/orig -> origin/gh/fffrog/39/orig 2025-03-14T05:31:37.4328942Z * [new branch] gh/fffrog/40/base -> origin/gh/fffrog/40/base 2025-03-14T05:31:37.4330636Z * [new branch] gh/fffrog/40/head -> origin/gh/fffrog/40/head 2025-03-14T05:31:37.4332364Z * [new branch] gh/fffrog/40/orig -> origin/gh/fffrog/40/orig 2025-03-14T05:31:37.4334686Z * [new branch] gh/fffrog/41/base -> origin/gh/fffrog/41/base 2025-03-14T05:31:37.4336300Z * [new branch] gh/fffrog/41/head -> origin/gh/fffrog/41/head 2025-03-14T05:31:37.4337954Z * [new branch] gh/fffrog/41/orig -> origin/gh/fffrog/41/orig 2025-03-14T05:31:37.4340154Z * [new branch] gh/fffrog/42/base -> origin/gh/fffrog/42/base 2025-03-14T05:31:37.4341859Z * [new branch] gh/fffrog/42/head -> origin/gh/fffrog/42/head 2025-03-14T05:31:37.4343481Z * [new branch] gh/fffrog/42/orig -> origin/gh/fffrog/42/orig 2025-03-14T05:31:37.4345750Z * [new branch] gh/fffrog/43/base -> origin/gh/fffrog/43/base 2025-03-14T05:31:37.4347449Z * [new branch] gh/fffrog/43/head -> origin/gh/fffrog/43/head 2025-03-14T05:31:37.4349035Z * [new branch] gh/fffrog/43/orig -> origin/gh/fffrog/43/orig 2025-03-14T05:31:37.4351410Z * [new branch] gh/fffrog/44/base -> origin/gh/fffrog/44/base 2025-03-14T05:31:37.4353019Z * [new branch] gh/fffrog/44/head -> origin/gh/fffrog/44/head 2025-03-14T05:31:37.4354788Z * [new branch] gh/fffrog/44/orig -> origin/gh/fffrog/44/orig 2025-03-14T05:31:37.4357049Z * [new branch] gh/fffrog/45/base -> origin/gh/fffrog/45/base 2025-03-14T05:31:37.4358624Z * [new branch] gh/fffrog/45/head -> origin/gh/fffrog/45/head 2025-03-14T05:31:37.4360316Z * [new branch] gh/fffrog/45/orig -> origin/gh/fffrog/45/orig 2025-03-14T05:31:37.4362569Z * [new branch] gh/fffrog/46/base -> origin/gh/fffrog/46/base 2025-03-14T05:31:37.4364368Z * [new branch] gh/fffrog/46/head -> origin/gh/fffrog/46/head 2025-03-14T05:31:37.4365751Z * [new branch] gh/fffrog/46/orig -> origin/gh/fffrog/46/orig 2025-03-14T05:31:37.4368291Z * [new branch] gh/fffrog/47/base -> origin/gh/fffrog/47/base 2025-03-14T05:31:37.4370481Z * [new branch] gh/fffrog/47/head -> origin/gh/fffrog/47/head 2025-03-14T05:31:37.4371705Z * [new branch] gh/fffrog/47/orig -> origin/gh/fffrog/47/orig 2025-03-14T05:31:37.4374303Z * [new branch] gh/fffrog/48/base -> origin/gh/fffrog/48/base 2025-03-14T05:31:37.4375964Z * [new branch] gh/fffrog/48/head -> origin/gh/fffrog/48/head 2025-03-14T05:31:37.4377855Z * [new branch] gh/fffrog/48/orig -> origin/gh/fffrog/48/orig 2025-03-14T05:31:37.4379951Z * [new branch] gh/fffrog/49/base -> origin/gh/fffrog/49/base 2025-03-14T05:31:37.4381809Z * [new branch] gh/fffrog/49/head -> origin/gh/fffrog/49/head 2025-03-14T05:31:37.4383505Z * [new branch] gh/fffrog/49/orig -> origin/gh/fffrog/49/orig 2025-03-14T05:31:37.4385793Z * [new branch] gh/fffrog/50/base -> origin/gh/fffrog/50/base 2025-03-14T05:31:37.4387456Z * [new branch] gh/fffrog/50/head -> origin/gh/fffrog/50/head 2025-03-14T05:31:37.4389167Z * [new branch] gh/fffrog/50/orig -> origin/gh/fffrog/50/orig 2025-03-14T05:31:37.4391970Z * [new branch] gh/guangyey/118/base -> origin/gh/guangyey/118/base 2025-03-14T05:31:37.4393615Z * [new branch] gh/guangyey/118/head -> origin/gh/guangyey/118/head 
2025-03-14T05:31:37.4395390Z * [new branch] gh/guangyey/118/orig -> origin/gh/guangyey/118/orig 2025-03-14T05:31:37.4397746Z * [new branch] gh/guangyey/123/base -> origin/gh/guangyey/123/base 2025-03-14T05:31:37.4399417Z * [new branch] gh/guangyey/123/head -> origin/gh/guangyey/123/head 2025-03-14T05:31:37.4401035Z * [new branch] gh/guangyey/123/orig -> origin/gh/guangyey/123/orig 2025-03-14T05:31:37.4403299Z * [new branch] gh/guangyey/124/base -> origin/gh/guangyey/124/base 2025-03-14T05:31:37.4404953Z * [new branch] gh/guangyey/124/head -> origin/gh/guangyey/124/head 2025-03-14T05:31:37.4406570Z * [new branch] gh/guangyey/124/orig -> origin/gh/guangyey/124/orig 2025-03-14T05:31:37.4408827Z * [new branch] gh/guangyey/125/base -> origin/gh/guangyey/125/base 2025-03-14T05:31:37.4410386Z * [new branch] gh/guangyey/125/head -> origin/gh/guangyey/125/head 2025-03-14T05:31:37.4412737Z * [new branch] gh/guangyey/125/orig -> origin/gh/guangyey/125/orig 2025-03-14T05:31:37.4415539Z * [new branch] gh/guangyey/126/base -> origin/gh/guangyey/126/base 2025-03-14T05:31:37.4416637Z * [new branch] gh/guangyey/126/head -> origin/gh/guangyey/126/head 2025-03-14T05:31:37.4418291Z * [new branch] gh/guangyey/126/orig -> origin/gh/guangyey/126/orig 2025-03-14T05:31:37.4420898Z * [new branch] gh/guangyey/127/base -> origin/gh/guangyey/127/base 2025-03-14T05:31:37.4422255Z * [new branch] gh/guangyey/127/head -> origin/gh/guangyey/127/head 2025-03-14T05:31:37.4424163Z * [new branch] gh/guangyey/127/orig -> origin/gh/guangyey/127/orig 2025-03-14T05:31:37.4426535Z * [new branch] gh/guangyey/71/base -> origin/gh/guangyey/71/base 2025-03-14T05:31:37.4427911Z * [new branch] gh/guangyey/71/head -> origin/gh/guangyey/71/head 2025-03-14T05:31:37.4429840Z * [new branch] gh/guangyey/71/orig -> origin/gh/guangyey/71/orig 2025-03-14T05:31:37.4432274Z * [new branch] gh/guangyey/79/base -> origin/gh/guangyey/79/base 2025-03-14T05:31:37.4433545Z * [new branch] gh/guangyey/79/head -> origin/gh/guangyey/79/head 2025-03-14T05:31:37.4435156Z * [new branch] gh/guangyey/79/orig -> origin/gh/guangyey/79/orig 2025-03-14T05:31:37.4437683Z * [new branch] gh/guangyey/87/base -> origin/gh/guangyey/87/base 2025-03-14T05:31:37.4439045Z * [new branch] gh/guangyey/87/head -> origin/gh/guangyey/87/head 2025-03-14T05:31:37.4440998Z * [new branch] gh/guangyey/87/orig -> origin/gh/guangyey/87/orig 2025-03-14T05:31:37.4443357Z * [new branch] gh/guangyey/89/base -> origin/gh/guangyey/89/base 2025-03-14T05:31:37.4444700Z * [new branch] gh/guangyey/89/head -> origin/gh/guangyey/89/head 2025-03-14T05:31:37.4446634Z * [new branch] gh/guangyey/89/orig -> origin/gh/guangyey/89/orig 2025-03-14T05:31:37.4449532Z * [new branch] gh/guilhermeleobas/100/base -> origin/gh/guilhermeleobas/100/base 2025-03-14T05:31:37.4450933Z * [new branch] gh/guilhermeleobas/100/head -> origin/gh/guilhermeleobas/100/head 2025-03-14T05:31:37.4452711Z * [new branch] gh/guilhermeleobas/100/orig -> origin/gh/guilhermeleobas/100/orig 2025-03-14T05:31:37.4455174Z * [new branch] gh/guilhermeleobas/101/base -> origin/gh/guilhermeleobas/101/base 2025-03-14T05:31:37.4456574Z * [new branch] gh/guilhermeleobas/101/head -> origin/gh/guilhermeleobas/101/head 2025-03-14T05:31:37.4458198Z * [new branch] gh/guilhermeleobas/101/orig -> origin/gh/guilhermeleobas/101/orig 2025-03-14T05:31:37.4460862Z * [new branch] gh/guilhermeleobas/102/base -> origin/gh/guilhermeleobas/102/base 2025-03-14T05:31:37.4462234Z * [new branch] gh/guilhermeleobas/102/head -> origin/gh/guilhermeleobas/102/head 
2025-03-14T05:31:37.4464282Z * [new branch] gh/guilhermeleobas/102/orig -> origin/gh/guilhermeleobas/102/orig 2025-03-14T05:31:37.4466486Z * [new branch] gh/guilhermeleobas/103/base -> origin/gh/guilhermeleobas/103/base 2025-03-14T05:31:37.4468081Z * [new branch] gh/guilhermeleobas/103/head -> origin/gh/guilhermeleobas/103/head 2025-03-14T05:31:37.4471615Z * [new branch] gh/guilhermeleobas/103/orig -> origin/gh/guilhermeleobas/103/orig 2025-03-14T05:31:37.4473980Z * [new branch] gh/guilhermeleobas/104/base -> origin/gh/guilhermeleobas/104/base 2025-03-14T05:31:37.4475463Z * [new branch] gh/guilhermeleobas/104/head -> origin/gh/guilhermeleobas/104/head 2025-03-14T05:31:37.4477068Z * [new branch] gh/guilhermeleobas/104/orig -> origin/gh/guilhermeleobas/104/orig 2025-03-14T05:31:37.4479645Z * [new branch] gh/guilhermeleobas/105/base -> origin/gh/guilhermeleobas/105/base 2025-03-14T05:31:37.4481076Z * [new branch] gh/guilhermeleobas/105/head -> origin/gh/guilhermeleobas/105/head 2025-03-14T05:31:37.4482638Z * [new branch] gh/guilhermeleobas/105/orig -> origin/gh/guilhermeleobas/105/orig 2025-03-14T05:31:37.4485240Z * [new branch] gh/guilhermeleobas/106/base -> origin/gh/guilhermeleobas/106/base 2025-03-14T05:31:37.4486674Z * [new branch] gh/guilhermeleobas/106/head -> origin/gh/guilhermeleobas/106/head 2025-03-14T05:31:37.4488700Z * [new branch] gh/guilhermeleobas/106/orig -> origin/gh/guilhermeleobas/106/orig 2025-03-14T05:31:37.4490930Z * [new branch] gh/guilhermeleobas/107/base -> origin/gh/guilhermeleobas/107/base 2025-03-14T05:31:37.4492305Z * [new branch] gh/guilhermeleobas/107/head -> origin/gh/guilhermeleobas/107/head 2025-03-14T05:31:37.4494078Z * [new branch] gh/guilhermeleobas/107/orig -> origin/gh/guilhermeleobas/107/orig 2025-03-14T05:31:37.4496528Z * [new branch] gh/guilhermeleobas/108/base -> origin/gh/guilhermeleobas/108/base 2025-03-14T05:31:37.4498102Z * [new branch] gh/guilhermeleobas/108/head -> origin/gh/guilhermeleobas/108/head 2025-03-14T05:31:37.4499559Z * [new branch] gh/guilhermeleobas/108/orig -> origin/gh/guilhermeleobas/108/orig 2025-03-14T05:31:37.4502111Z * [new branch] gh/guilhermeleobas/109/base -> origin/gh/guilhermeleobas/109/base 2025-03-14T05:31:37.4503486Z * [new branch] gh/guilhermeleobas/109/head -> origin/gh/guilhermeleobas/109/head 2025-03-14T05:31:37.4505164Z * [new branch] gh/guilhermeleobas/109/orig -> origin/gh/guilhermeleobas/109/orig 2025-03-14T05:31:37.4508283Z * [new branch] gh/guilhermeleobas/11/base -> origin/gh/guilhermeleobas/11/base 2025-03-14T05:31:37.4509739Z * [new branch] gh/guilhermeleobas/11/head -> origin/gh/guilhermeleobas/11/head 2025-03-14T05:31:37.4511743Z * [new branch] gh/guilhermeleobas/11/orig -> origin/gh/guilhermeleobas/11/orig 2025-03-14T05:31:37.4513970Z * [new branch] gh/guilhermeleobas/110/base -> origin/gh/guilhermeleobas/110/base 2025-03-14T05:31:37.4515505Z * [new branch] gh/guilhermeleobas/110/head -> origin/gh/guilhermeleobas/110/head 2025-03-14T05:31:37.4517090Z * [new branch] gh/guilhermeleobas/110/orig -> origin/gh/guilhermeleobas/110/orig 2025-03-14T05:31:37.4520118Z * [new branch] gh/guilhermeleobas/111/base -> origin/gh/guilhermeleobas/111/base 2025-03-14T05:31:37.4521440Z * [new branch] gh/guilhermeleobas/111/head -> origin/gh/guilhermeleobas/111/head 2025-03-14T05:31:37.4523045Z * [new branch] gh/guilhermeleobas/111/orig -> origin/gh/guilhermeleobas/111/orig 2025-03-14T05:31:37.4525603Z * [new branch] gh/guilhermeleobas/73/base -> origin/gh/guilhermeleobas/73/base 2025-03-14T05:31:37.4527040Z * [new 
branch] gh/guilhermeleobas/73/head -> origin/gh/guilhermeleobas/73/head 2025-03-14T05:31:37.4528677Z * [new branch] gh/guilhermeleobas/73/orig -> origin/gh/guilhermeleobas/73/orig 2025-03-14T05:31:37.4531254Z * [new branch] gh/guilhermeleobas/92/base -> origin/gh/guilhermeleobas/92/base 2025-03-14T05:31:37.4532616Z * [new branch] gh/guilhermeleobas/92/head -> origin/gh/guilhermeleobas/92/head 2025-03-14T05:31:37.4534730Z * [new branch] gh/guilhermeleobas/92/orig -> origin/gh/guilhermeleobas/92/orig 2025-03-14T05:31:37.4536973Z * [new branch] gh/guilhermeleobas/93/base -> origin/gh/guilhermeleobas/93/base 2025-03-14T05:31:37.4538369Z * [new branch] gh/guilhermeleobas/93/head -> origin/gh/guilhermeleobas/93/head 2025-03-14T05:31:37.4540318Z * [new branch] gh/guilhermeleobas/93/orig -> origin/gh/guilhermeleobas/93/orig 2025-03-14T05:31:37.4542559Z * [new branch] gh/guilhermeleobas/94/base -> origin/gh/guilhermeleobas/94/base 2025-03-14T05:31:37.4543935Z * [new branch] gh/guilhermeleobas/94/head -> origin/gh/guilhermeleobas/94/head 2025-03-14T05:31:37.4545618Z * [new branch] gh/guilhermeleobas/94/orig -> origin/gh/guilhermeleobas/94/orig 2025-03-14T05:31:37.4548707Z * [new branch] gh/guilhermeleobas/95/base -> origin/gh/guilhermeleobas/95/base 2025-03-14T05:31:37.4550077Z * [new branch] gh/guilhermeleobas/95/head -> origin/gh/guilhermeleobas/95/head 2025-03-14T05:31:37.4552035Z * [new branch] gh/guilhermeleobas/95/orig -> origin/gh/guilhermeleobas/95/orig 2025-03-14T05:31:37.4554532Z * [new branch] gh/guilhermeleobas/97/base -> origin/gh/guilhermeleobas/97/base 2025-03-14T05:31:37.4556032Z * [new branch] gh/guilhermeleobas/97/head -> origin/gh/guilhermeleobas/97/head 2025-03-14T05:31:37.4557999Z * [new branch] gh/guilhermeleobas/97/orig -> origin/gh/guilhermeleobas/97/orig 2025-03-14T05:31:37.4560277Z * [new branch] gh/guilhermeleobas/98/base -> origin/gh/guilhermeleobas/98/base 2025-03-14T05:31:37.4561795Z * [new branch] gh/guilhermeleobas/98/head -> origin/gh/guilhermeleobas/98/head 2025-03-14T05:31:37.4563266Z * [new branch] gh/guilhermeleobas/98/orig -> origin/gh/guilhermeleobas/98/orig 2025-03-14T05:31:37.4565850Z * [new branch] gh/guilhermeleobas/99/base -> origin/gh/guilhermeleobas/99/base 2025-03-14T05:31:37.4567249Z * [new branch] gh/guilhermeleobas/99/head -> origin/gh/guilhermeleobas/99/head 2025-03-14T05:31:37.4569231Z * [new branch] gh/guilhermeleobas/99/orig -> origin/gh/guilhermeleobas/99/orig 2025-03-14T05:31:37.4572633Z * [new branch] gh/henrylhtsang/10/base -> origin/gh/henrylhtsang/10/base 2025-03-14T05:31:37.4574128Z * [new branch] gh/henrylhtsang/10/head -> origin/gh/henrylhtsang/10/head 2025-03-14T05:31:37.4576088Z * [new branch] gh/henrylhtsang/10/orig -> origin/gh/henrylhtsang/10/orig 2025-03-14T05:31:37.4578589Z * [new branch] gh/henrylhtsang/11/base -> origin/gh/henrylhtsang/11/base 2025-03-14T05:31:37.4580147Z * [new branch] gh/henrylhtsang/11/head -> origin/gh/henrylhtsang/11/head 2025-03-14T05:31:37.4582124Z * [new branch] gh/henrylhtsang/11/orig -> origin/gh/henrylhtsang/11/orig 2025-03-14T05:31:37.4584474Z * [new branch] gh/henrylhtsang/12/base -> origin/gh/henrylhtsang/12/base 2025-03-14T05:31:37.4586043Z * [new branch] gh/henrylhtsang/12/head -> origin/gh/henrylhtsang/12/head 2025-03-14T05:31:37.4588006Z * [new branch] gh/henrylhtsang/12/orig -> origin/gh/henrylhtsang/12/orig 2025-03-14T05:31:37.4590464Z * [new branch] gh/henrylhtsang/13/base -> origin/gh/henrylhtsang/13/base 2025-03-14T05:31:37.4591932Z * [new branch] gh/henrylhtsang/13/head -> 
origin/gh/henrylhtsang/13/head 2025-03-14T05:31:37.4593854Z * [new branch] gh/henrylhtsang/13/orig -> origin/gh/henrylhtsang/13/orig 2025-03-14T05:31:37.4596477Z * [new branch] gh/henrylhtsang/14/base -> origin/gh/henrylhtsang/14/base 2025-03-14T05:31:37.4597875Z * [new branch] gh/henrylhtsang/14/head -> origin/gh/henrylhtsang/14/head 2025-03-14T05:31:37.4599782Z * [new branch] gh/henrylhtsang/14/orig -> origin/gh/henrylhtsang/14/orig 2025-03-14T05:31:37.4602271Z * [new branch] gh/henrylhtsang/15/base -> origin/gh/henrylhtsang/15/base 2025-03-14T05:31:37.4603748Z * [new branch] gh/henrylhtsang/15/head -> origin/gh/henrylhtsang/15/head 2025-03-14T05:31:37.4605721Z * [new branch] gh/henrylhtsang/15/orig -> origin/gh/henrylhtsang/15/orig 2025-03-14T05:31:37.4608045Z * [new branch] gh/henrylhtsang/16/base -> origin/gh/henrylhtsang/16/base 2025-03-14T05:31:37.4609421Z * [new branch] gh/henrylhtsang/16/head -> origin/gh/henrylhtsang/16/head 2025-03-14T05:31:37.4611372Z * [new branch] gh/henrylhtsang/16/orig -> origin/gh/henrylhtsang/16/orig 2025-03-14T05:31:37.4614635Z * [new branch] gh/henrylhtsang/17/base -> origin/gh/henrylhtsang/17/base 2025-03-14T05:31:37.4616124Z * [new branch] gh/henrylhtsang/17/head -> origin/gh/henrylhtsang/17/head 2025-03-14T05:31:37.4618001Z * [new branch] gh/henrylhtsang/17/orig -> origin/gh/henrylhtsang/17/orig 2025-03-14T05:31:37.4620380Z * [new branch] gh/henrylhtsang/18/base -> origin/gh/henrylhtsang/18/base 2025-03-14T05:31:37.4621782Z * [new branch] gh/henrylhtsang/18/head -> origin/gh/henrylhtsang/18/head 2025-03-14T05:31:37.4623817Z * [new branch] gh/henrylhtsang/18/orig -> origin/gh/henrylhtsang/18/orig 2025-03-14T05:31:37.4626220Z * [new branch] gh/henrylhtsang/19/base -> origin/gh/henrylhtsang/19/base 2025-03-14T05:31:37.4627773Z * [new branch] gh/henrylhtsang/19/head -> origin/gh/henrylhtsang/19/head 2025-03-14T05:31:37.4629925Z * [new branch] gh/henrylhtsang/19/orig -> origin/gh/henrylhtsang/19/orig 2025-03-14T05:31:37.4632147Z * [new branch] gh/henrylhtsang/20/base -> origin/gh/henrylhtsang/20/base 2025-03-14T05:31:37.4633557Z * [new branch] gh/henrylhtsang/20/head -> origin/gh/henrylhtsang/20/head 2025-03-14T05:31:37.4635677Z * [new branch] gh/henrylhtsang/20/orig -> origin/gh/henrylhtsang/20/orig 2025-03-14T05:31:37.4637958Z * [new branch] gh/henrylhtsang/21/base -> origin/gh/henrylhtsang/21/base 2025-03-14T05:31:37.4639417Z * [new branch] gh/henrylhtsang/21/head -> origin/gh/henrylhtsang/21/head 2025-03-14T05:31:37.4641335Z * [new branch] gh/henrylhtsang/21/orig -> origin/gh/henrylhtsang/21/orig 2025-03-14T05:31:37.4643607Z * [new branch] gh/henrylhtsang/22/base -> origin/gh/henrylhtsang/22/base 2025-03-14T05:31:37.4645030Z * [new branch] gh/henrylhtsang/22/head -> origin/gh/henrylhtsang/22/head 2025-03-14T05:31:37.4647047Z * [new branch] gh/henrylhtsang/22/orig -> origin/gh/henrylhtsang/22/orig 2025-03-14T05:31:37.4649580Z * [new branch] gh/henrylhtsang/23/base -> origin/gh/henrylhtsang/23/base 2025-03-14T05:31:37.4651031Z * [new branch] gh/henrylhtsang/23/head -> origin/gh/henrylhtsang/23/head 2025-03-14T05:31:37.4652972Z * [new branch] gh/henrylhtsang/23/orig -> origin/gh/henrylhtsang/23/orig 2025-03-14T05:31:37.4655249Z * [new branch] gh/henrylhtsang/24/base -> origin/gh/henrylhtsang/24/base 2025-03-14T05:31:37.4656685Z * [new branch] gh/henrylhtsang/24/head -> origin/gh/henrylhtsang/24/head 2025-03-14T05:31:37.4658612Z * [new branch] gh/henrylhtsang/24/orig -> origin/gh/henrylhtsang/24/orig 2025-03-14T05:31:37.4660940Z * [new branch] 
gh/henrylhtsang/25/base -> origin/gh/henrylhtsang/25/base 2025-03-14T05:31:37.4662497Z * [new branch] gh/henrylhtsang/25/head -> origin/gh/henrylhtsang/25/head 2025-03-14T05:31:37.4664220Z * [new branch] gh/henrylhtsang/25/orig -> origin/gh/henrylhtsang/25/orig 2025-03-14T05:31:37.4666868Z * [new branch] gh/henrylhtsang/26/base -> origin/gh/henrylhtsang/26/base 2025-03-14T05:31:37.4668552Z * [new branch] gh/henrylhtsang/26/head -> origin/gh/henrylhtsang/26/head 2025-03-14T05:31:37.4676874Z * [new branch] gh/henrylhtsang/26/orig -> origin/gh/henrylhtsang/26/orig 2025-03-14T05:31:37.4679303Z * [new branch] gh/henrylhtsang/27/base -> origin/gh/henrylhtsang/27/base 2025-03-14T05:31:37.4680796Z * [new branch] gh/henrylhtsang/27/head -> origin/gh/henrylhtsang/27/head 2025-03-14T05:31:37.4683308Z * [new branch] gh/henrylhtsang/27/orig -> origin/gh/henrylhtsang/27/orig 2025-03-14T05:31:37.4686394Z * [new branch] gh/henrylhtsang/28/base -> origin/gh/henrylhtsang/28/base 2025-03-14T05:31:37.4687985Z * [new branch] gh/henrylhtsang/28/head -> origin/gh/henrylhtsang/28/head 2025-03-14T05:31:37.4689929Z * [new branch] gh/henrylhtsang/28/orig -> origin/gh/henrylhtsang/28/orig 2025-03-14T05:31:37.4692625Z * [new branch] gh/henrylhtsang/29/base -> origin/gh/henrylhtsang/29/base 2025-03-14T05:31:37.4693892Z * [new branch] gh/henrylhtsang/29/head -> origin/gh/henrylhtsang/29/head 2025-03-14T05:31:37.4695862Z * [new branch] gh/henrylhtsang/29/orig -> origin/gh/henrylhtsang/29/orig 2025-03-14T05:31:37.4698280Z * [new branch] gh/henrylhtsang/3/base -> origin/gh/henrylhtsang/3/base 2025-03-14T05:31:37.4699708Z * [new branch] gh/henrylhtsang/3/head -> origin/gh/henrylhtsang/3/head 2025-03-14T05:31:37.4701798Z * [new branch] gh/henrylhtsang/3/orig -> origin/gh/henrylhtsang/3/orig 2025-03-14T05:31:37.4704506Z * [new branch] gh/henrylhtsang/30/base -> origin/gh/henrylhtsang/30/base 2025-03-14T05:31:37.4705853Z * [new branch] gh/henrylhtsang/30/head -> origin/gh/henrylhtsang/30/head 2025-03-14T05:31:37.4707759Z * [new branch] gh/henrylhtsang/30/orig -> origin/gh/henrylhtsang/30/orig 2025-03-14T05:31:37.4711312Z * [new branch] gh/henrylhtsang/31/base -> origin/gh/henrylhtsang/31/base 2025-03-14T05:31:37.4712141Z * [new branch] gh/henrylhtsang/31/head -> origin/gh/henrylhtsang/31/head 2025-03-14T05:31:37.4713212Z * [new branch] gh/henrylhtsang/31/orig -> origin/gh/henrylhtsang/31/orig 2025-03-14T05:31:37.4716360Z * [new branch] gh/henrylhtsang/32/base -> origin/gh/henrylhtsang/32/base 2025-03-14T05:31:37.4717516Z * [new branch] gh/henrylhtsang/32/head -> origin/gh/henrylhtsang/32/head 2025-03-14T05:31:37.4719426Z * [new branch] gh/henrylhtsang/32/orig -> origin/gh/henrylhtsang/32/orig 2025-03-14T05:31:37.4721920Z * [new branch] gh/henrylhtsang/33/base -> origin/gh/henrylhtsang/33/base 2025-03-14T05:31:37.4723401Z * [new branch] gh/henrylhtsang/33/head -> origin/gh/henrylhtsang/33/head 2025-03-14T05:31:37.4725293Z * [new branch] gh/henrylhtsang/33/orig -> origin/gh/henrylhtsang/33/orig 2025-03-14T05:31:37.4727835Z * [new branch] gh/henrylhtsang/34/base -> origin/gh/henrylhtsang/34/base 2025-03-14T05:31:37.4729392Z * [new branch] gh/henrylhtsang/34/head -> origin/gh/henrylhtsang/34/head 2025-03-14T05:31:37.4731314Z * [new branch] gh/henrylhtsang/34/orig -> origin/gh/henrylhtsang/34/orig 2025-03-14T05:31:37.4733768Z * [new branch] gh/henrylhtsang/35/base -> origin/gh/henrylhtsang/35/base 2025-03-14T05:31:37.4735150Z * [new branch] gh/henrylhtsang/35/head -> origin/gh/henrylhtsang/35/head 2025-03-14T05:31:37.4737088Z * 
[new branch] gh/henrylhtsang/35/orig -> origin/gh/henrylhtsang/35/orig 2025-03-14T05:31:37.4739573Z * [new branch] gh/henrylhtsang/36/base -> origin/gh/henrylhtsang/36/base 2025-03-14T05:31:37.4740953Z * [new branch] gh/henrylhtsang/36/head -> origin/gh/henrylhtsang/36/head 2025-03-14T05:31:37.4742840Z * [new branch] gh/henrylhtsang/36/orig -> origin/gh/henrylhtsang/36/orig 2025-03-14T05:31:37.4745312Z * [new branch] gh/henrylhtsang/37/base -> origin/gh/henrylhtsang/37/base 2025-03-14T05:31:37.4746711Z * [new branch] gh/henrylhtsang/37/head -> origin/gh/henrylhtsang/37/head 2025-03-14T05:31:37.4748603Z * [new branch] gh/henrylhtsang/37/orig -> origin/gh/henrylhtsang/37/orig 2025-03-14T05:31:37.4751834Z * [new branch] gh/henrylhtsang/38/base -> origin/gh/henrylhtsang/38/base 2025-03-14T05:31:37.4752912Z * [new branch] gh/henrylhtsang/38/head -> origin/gh/henrylhtsang/38/head 2025-03-14T05:31:37.4754131Z * [new branch] gh/henrylhtsang/38/orig -> origin/gh/henrylhtsang/38/orig 2025-03-14T05:31:37.4756979Z * [new branch] gh/henrylhtsang/39/base -> origin/gh/henrylhtsang/39/base 2025-03-14T05:31:37.4758316Z * [new branch] gh/henrylhtsang/39/head -> origin/gh/henrylhtsang/39/head 2025-03-14T05:31:37.4760028Z * [new branch] gh/henrylhtsang/39/orig -> origin/gh/henrylhtsang/39/orig 2025-03-14T05:31:37.4762629Z * [new branch] gh/henrylhtsang/4/base -> origin/gh/henrylhtsang/4/base 2025-03-14T05:31:37.4764022Z * [new branch] gh/henrylhtsang/4/head -> origin/gh/henrylhtsang/4/head 2025-03-14T05:31:37.4765999Z * [new branch] gh/henrylhtsang/4/orig -> origin/gh/henrylhtsang/4/orig 2025-03-14T05:31:37.4769313Z * [new branch] gh/henrylhtsang/40/base -> origin/gh/henrylhtsang/40/base 2025-03-14T05:31:37.4770953Z * [new branch] gh/henrylhtsang/40/head -> origin/gh/henrylhtsang/40/head 2025-03-14T05:31:37.4772435Z * [new branch] gh/henrylhtsang/40/orig -> origin/gh/henrylhtsang/40/orig 2025-03-14T05:31:37.4774934Z * [new branch] gh/henrylhtsang/41/base -> origin/gh/henrylhtsang/41/base 2025-03-14T05:31:37.4776430Z * [new branch] gh/henrylhtsang/41/head -> origin/gh/henrylhtsang/41/head 2025-03-14T05:31:37.4778157Z * [new branch] gh/henrylhtsang/41/orig -> origin/gh/henrylhtsang/41/orig 2025-03-14T05:31:37.4780559Z * [new branch] gh/henrylhtsang/42/base -> origin/gh/henrylhtsang/42/base 2025-03-14T05:31:37.4782082Z * [new branch] gh/henrylhtsang/42/head -> origin/gh/henrylhtsang/42/head 2025-03-14T05:31:37.4784009Z * [new branch] gh/henrylhtsang/42/orig -> origin/gh/henrylhtsang/42/orig 2025-03-14T05:31:37.4786307Z * [new branch] gh/henrylhtsang/5/base -> origin/gh/henrylhtsang/5/base 2025-03-14T05:31:37.4788260Z * [new branch] gh/henrylhtsang/5/head -> origin/gh/henrylhtsang/5/head 2025-03-14T05:31:37.4789442Z * [new branch] gh/henrylhtsang/5/orig -> origin/gh/henrylhtsang/5/orig 2025-03-14T05:31:37.4792094Z * [new branch] gh/henrylhtsang/6/base -> origin/gh/henrylhtsang/6/base 2025-03-14T05:31:37.4802857Z * [new branch] gh/henrylhtsang/6/head -> origin/gh/henrylhtsang/6/head 2025-03-14T05:31:37.4803764Z * [new branch] gh/henrylhtsang/6/orig -> origin/gh/henrylhtsang/6/orig 2025-03-14T05:31:37.4804476Z * [new branch] gh/henrylhtsang/7/base -> origin/gh/henrylhtsang/7/base 2025-03-14T05:31:37.4805191Z * [new branch] gh/henrylhtsang/7/head -> origin/gh/henrylhtsang/7/head 2025-03-14T05:31:37.4805973Z * [new branch] gh/henrylhtsang/7/orig -> origin/gh/henrylhtsang/7/orig 2025-03-14T05:31:37.4806597Z * [new branch] gh/henrylhtsang/8/base -> origin/gh/henrylhtsang/8/base 2025-03-14T05:31:37.4807401Z * [new 
branch] gh/henrylhtsang/8/head -> origin/gh/henrylhtsang/8/head 2025-03-14T05:31:37.4808029Z * [new branch] gh/henrylhtsang/8/orig -> origin/gh/henrylhtsang/8/orig 2025-03-14T05:31:37.4809289Z * [new branch] gh/henrylhtsang/9/base -> origin/gh/henrylhtsang/9/base 2025-03-14T05:31:37.4811262Z * [new branch] gh/henrylhtsang/9/head -> origin/gh/henrylhtsang/9/head 2025-03-14T05:31:37.4812673Z * [new branch] gh/henrylhtsang/9/orig -> origin/gh/henrylhtsang/9/orig 2025-03-14T05:31:37.4815748Z * [new branch] gh/int3/21/base -> origin/gh/int3/21/base 2025-03-14T05:31:37.4817270Z * [new branch] gh/int3/21/head -> origin/gh/int3/21/head 2025-03-14T05:31:37.4819184Z * [new branch] gh/int3/21/orig -> origin/gh/int3/21/orig 2025-03-14T05:31:37.4821707Z * [new branch] gh/int3/34/base -> origin/gh/int3/34/base 2025-03-14T05:31:37.4823140Z * [new branch] gh/int3/34/head -> origin/gh/int3/34/head 2025-03-14T05:31:37.4825080Z * [new branch] gh/int3/34/orig -> origin/gh/int3/34/orig 2025-03-14T05:31:37.4827384Z * [new branch] gh/int3/36/base -> origin/gh/int3/36/base 2025-03-14T05:31:37.4828781Z * [new branch] gh/int3/36/head -> origin/gh/int3/36/head 2025-03-14T05:31:37.4830664Z * [new branch] gh/int3/36/orig -> origin/gh/int3/36/orig 2025-03-14T05:31:37.4833122Z * [new branch] gh/int3/41/base -> origin/gh/int3/41/base 2025-03-14T05:31:37.4835334Z * [new branch] gh/int3/41/head -> origin/gh/int3/41/head 2025-03-14T05:31:37.4837073Z * [new branch] gh/int3/41/orig -> origin/gh/int3/41/orig 2025-03-14T05:31:37.4839659Z * [new branch] gh/int3/45/base -> origin/gh/int3/45/base 2025-03-14T05:31:37.4840975Z * [new branch] gh/int3/45/head -> origin/gh/int3/45/head 2025-03-14T05:31:37.4842941Z * [new branch] gh/int3/45/orig -> origin/gh/int3/45/orig 2025-03-14T05:31:37.4845489Z * [new branch] gh/int3/46/base -> origin/gh/int3/46/base 2025-03-14T05:31:37.4846913Z * [new branch] gh/int3/46/head -> origin/gh/int3/46/head 2025-03-14T05:31:37.4848807Z * [new branch] gh/int3/46/orig -> origin/gh/int3/46/orig 2025-03-14T05:31:37.4851261Z * [new branch] gh/int3/47/base -> origin/gh/int3/47/base 2025-03-14T05:31:37.4852647Z * [new branch] gh/int3/47/head -> origin/gh/int3/47/head 2025-03-14T05:31:37.4854579Z * [new branch] gh/int3/47/orig -> origin/gh/int3/47/orig 2025-03-14T05:31:37.4857036Z * [new branch] gh/int3/55/base -> origin/gh/int3/55/base 2025-03-14T05:31:37.4858501Z * [new branch] gh/int3/55/head -> origin/gh/int3/55/head 2025-03-14T05:31:37.4860457Z * [new branch] gh/int3/55/orig -> origin/gh/int3/55/orig 2025-03-14T05:31:37.4862820Z * [new branch] gh/int3/79/base -> origin/gh/int3/79/base 2025-03-14T05:31:37.4864298Z * [new branch] gh/int3/79/head -> origin/gh/int3/79/head 2025-03-14T05:31:37.4866213Z * [new branch] gh/int3/79/orig -> origin/gh/int3/79/orig 2025-03-14T05:31:37.4868993Z * [new branch] gh/int3/94/base -> origin/gh/int3/94/base 2025-03-14T05:31:37.4870834Z * [new branch] gh/int3/94/head -> origin/gh/int3/94/head 2025-03-14T05:31:37.4872209Z * [new branch] gh/int3/94/orig -> origin/gh/int3/94/orig 2025-03-14T05:31:37.4874897Z * [new branch] gh/int3/95/base -> origin/gh/int3/95/base 2025-03-14T05:31:37.4876319Z * [new branch] gh/int3/95/head -> origin/gh/int3/95/head 2025-03-14T05:31:37.4878205Z * [new branch] gh/int3/95/orig -> origin/gh/int3/95/orig 2025-03-14T05:31:37.4880479Z * [new branch] gh/int3/97/base -> origin/gh/int3/97/base 2025-03-14T05:31:37.4881933Z * [new branch] gh/int3/97/head -> origin/gh/int3/97/head 2025-03-14T05:31:37.4885073Z * [new branch] gh/isuruf/101/base -> 
origin/gh/isuruf/101/base 2025-03-14T05:31:37.4886466Z * [new branch] gh/isuruf/101/head -> origin/gh/isuruf/101/head 2025-03-14T05:31:37.4888923Z * [new branch] gh/isuruf/105/base -> origin/gh/isuruf/105/base 2025-03-14T05:31:37.4890756Z * [new branch] gh/isuruf/105/head -> origin/gh/isuruf/105/head 2025-03-14T05:31:37.4892059Z * [new branch] gh/isuruf/105/orig -> origin/gh/isuruf/105/orig 2025-03-14T05:31:37.4894637Z * [new branch] gh/isuruf/110/base -> origin/gh/isuruf/110/base 2025-03-14T05:31:37.4896038Z * [new branch] gh/isuruf/110/head -> origin/gh/isuruf/110/head 2025-03-14T05:31:37.4897999Z * [new branch] gh/isuruf/110/orig -> origin/gh/isuruf/110/orig 2025-03-14T05:31:37.4900239Z * [new branch] gh/isuruf/112/base -> origin/gh/isuruf/112/base 2025-03-14T05:31:37.4901636Z * [new branch] gh/isuruf/112/head -> origin/gh/isuruf/112/head 2025-03-14T05:31:37.4903545Z * [new branch] gh/isuruf/112/orig -> origin/gh/isuruf/112/orig 2025-03-14T05:31:37.4905850Z * [new branch] gh/isuruf/115/base -> origin/gh/isuruf/115/base 2025-03-14T05:31:37.4907223Z * [new branch] gh/isuruf/115/head -> origin/gh/isuruf/115/head 2025-03-14T05:31:37.4909120Z * [new branch] gh/isuruf/115/orig -> origin/gh/isuruf/115/orig 2025-03-14T05:31:37.4911592Z * [new branch] gh/isuruf/116/base -> origin/gh/isuruf/116/base 2025-03-14T05:31:37.4912965Z * [new branch] gh/isuruf/116/head -> origin/gh/isuruf/116/head 2025-03-14T05:31:37.4914973Z * [new branch] gh/isuruf/116/orig -> origin/gh/isuruf/116/orig 2025-03-14T05:31:37.4917317Z * [new branch] gh/isuruf/117/base -> origin/gh/isuruf/117/base 2025-03-14T05:31:37.4918726Z * [new branch] gh/isuruf/117/head -> origin/gh/isuruf/117/head 2025-03-14T05:31:37.4920630Z * [new branch] gh/isuruf/117/orig -> origin/gh/isuruf/117/orig 2025-03-14T05:31:37.4922914Z * [new branch] gh/isuruf/119/base -> origin/gh/isuruf/119/base 2025-03-14T05:31:37.4924298Z * [new branch] gh/isuruf/119/head -> origin/gh/isuruf/119/head 2025-03-14T05:31:37.4926239Z * [new branch] gh/isuruf/119/orig -> origin/gh/isuruf/119/orig 2025-03-14T05:31:37.4928555Z * [new branch] gh/isuruf/120/base -> origin/gh/isuruf/120/base 2025-03-14T05:31:37.4929974Z * [new branch] gh/isuruf/120/head -> origin/gh/isuruf/120/head 2025-03-14T05:31:37.4931891Z * [new branch] gh/isuruf/120/orig -> origin/gh/isuruf/120/orig 2025-03-14T05:31:37.4934163Z * [new branch] gh/isuruf/121/base -> origin/gh/isuruf/121/base 2025-03-14T05:31:37.4935620Z * [new branch] gh/isuruf/121/head -> origin/gh/isuruf/121/head 2025-03-14T05:31:37.4937502Z * [new branch] gh/isuruf/121/orig -> origin/gh/isuruf/121/orig 2025-03-14T05:31:37.4939827Z * [new branch] gh/isuruf/122/base -> origin/gh/isuruf/122/base 2025-03-14T05:31:37.4941182Z * [new branch] gh/isuruf/122/head -> origin/gh/isuruf/122/head 2025-03-14T05:31:37.4943131Z * [new branch] gh/isuruf/122/orig -> origin/gh/isuruf/122/orig 2025-03-14T05:31:37.4945362Z * [new branch] gh/isuruf/123/base -> origin/gh/isuruf/123/base 2025-03-14T05:31:37.4946767Z * [new branch] gh/isuruf/123/head -> origin/gh/isuruf/123/head 2025-03-14T05:31:37.4948617Z * [new branch] gh/isuruf/123/orig -> origin/gh/isuruf/123/orig 2025-03-14T05:31:37.4950872Z * [new branch] gh/isuruf/124/base -> origin/gh/isuruf/124/base 2025-03-14T05:31:37.4952245Z * [new branch] gh/isuruf/124/head -> origin/gh/isuruf/124/head 2025-03-14T05:31:37.4954188Z * [new branch] gh/isuruf/124/orig -> origin/gh/isuruf/124/orig 2025-03-14T05:31:37.4956523Z * [new branch] gh/isuruf/125/base -> origin/gh/isuruf/125/base 2025-03-14T05:31:37.4958016Z * 
[new branch] gh/isuruf/125/head -> origin/gh/isuruf/125/head 2025-03-14T05:31:37.4959900Z * [new branch] gh/isuruf/125/orig -> origin/gh/isuruf/125/orig 2025-03-14T05:31:37.4962205Z * [new branch] gh/isuruf/126/base -> origin/gh/isuruf/126/base 2025-03-14T05:31:37.4963577Z * [new branch] gh/isuruf/126/head -> origin/gh/isuruf/126/head 2025-03-14T05:31:37.4965518Z * [new branch] gh/isuruf/126/orig -> origin/gh/isuruf/126/orig 2025-03-14T05:31:37.4967730Z * [new branch] gh/isuruf/127/base -> origin/gh/isuruf/127/base 2025-03-14T05:31:37.4969812Z * [new branch] gh/isuruf/127/head -> origin/gh/isuruf/127/head 2025-03-14T05:31:37.4971215Z * [new branch] gh/isuruf/127/orig -> origin/gh/isuruf/127/orig 2025-03-14T05:31:37.4974210Z * [new branch] gh/isuruf/128/base -> origin/gh/isuruf/128/base 2025-03-14T05:31:37.4975695Z * [new branch] gh/isuruf/128/head -> origin/gh/isuruf/128/head 2025-03-14T05:31:37.4977684Z * [new branch] gh/isuruf/128/orig -> origin/gh/isuruf/128/orig 2025-03-14T05:31:37.4979893Z * [new branch] gh/isuruf/129/base -> origin/gh/isuruf/129/base 2025-03-14T05:31:37.4981343Z * [new branch] gh/isuruf/129/head -> origin/gh/isuruf/129/head 2025-03-14T05:31:37.4983265Z * [new branch] gh/isuruf/129/orig -> origin/gh/isuruf/129/orig 2025-03-14T05:31:37.4986012Z * [new branch] gh/isuruf/130/base -> origin/gh/isuruf/130/base 2025-03-14T05:31:37.4987396Z * [new branch] gh/isuruf/130/head -> origin/gh/isuruf/130/head 2025-03-14T05:31:37.4989257Z * [new branch] gh/isuruf/130/orig -> origin/gh/isuruf/130/orig 2025-03-14T05:31:37.4991576Z * [new branch] gh/isuruf/131/base -> origin/gh/isuruf/131/base 2025-03-14T05:31:37.4993060Z * [new branch] gh/isuruf/131/head -> origin/gh/isuruf/131/head 2025-03-14T05:31:37.4995257Z * [new branch] gh/isuruf/131/orig -> origin/gh/isuruf/131/orig 2025-03-14T05:31:37.4997537Z * [new branch] gh/isuruf/39/base -> origin/gh/isuruf/39/base 2025-03-14T05:31:37.4998862Z * [new branch] gh/isuruf/39/head -> origin/gh/isuruf/39/head 2025-03-14T05:31:37.5000775Z * [new branch] gh/isuruf/39/orig -> origin/gh/isuruf/39/orig 2025-03-14T05:31:37.5003227Z * [new branch] gh/isuruf/81/base -> origin/gh/isuruf/81/base 2025-03-14T05:31:37.5004738Z * [new branch] gh/isuruf/81/head -> origin/gh/isuruf/81/head 2025-03-14T05:31:37.5006578Z * [new branch] gh/isuruf/81/orig -> origin/gh/isuruf/81/orig 2025-03-14T05:31:37.5009338Z * [new branch] gh/jamesjwu/100/base -> origin/gh/jamesjwu/100/base 2025-03-14T05:31:37.5010739Z * [new branch] gh/jamesjwu/100/head -> origin/gh/jamesjwu/100/head 2025-03-14T05:31:37.5012671Z * [new branch] gh/jamesjwu/100/orig -> origin/gh/jamesjwu/100/orig 2025-03-14T05:31:37.5015090Z * [new branch] gh/jamesjwu/102/base -> origin/gh/jamesjwu/102/base 2025-03-14T05:31:37.5016914Z * [new branch] gh/jamesjwu/102/head -> origin/gh/jamesjwu/102/head 2025-03-14T05:31:37.5019173Z * [new branch] gh/jamesjwu/105/base -> origin/gh/jamesjwu/105/base 2025-03-14T05:31:37.5021039Z * [new branch] gh/jamesjwu/105/head -> origin/gh/jamesjwu/105/head 2025-03-14T05:31:37.5022915Z * [new branch] gh/jamesjwu/105/orig -> origin/gh/jamesjwu/105/orig 2025-03-14T05:31:37.5025236Z * [new branch] gh/jamesjwu/108/base -> origin/gh/jamesjwu/108/base 2025-03-14T05:31:37.5026767Z * [new branch] gh/jamesjwu/108/head -> origin/gh/jamesjwu/108/head 2025-03-14T05:31:37.5028635Z * [new branch] gh/jamesjwu/108/orig -> origin/gh/jamesjwu/108/orig 2025-03-14T05:31:37.5031075Z * [new branch] gh/jamesjwu/109/base -> origin/gh/jamesjwu/109/base 2025-03-14T05:31:37.5032421Z * [new branch] 
gh/jamesjwu/109/head -> origin/gh/jamesjwu/109/head 2025-03-14T05:31:37.5034511Z * [new branch] gh/jamesjwu/109/orig -> origin/gh/jamesjwu/109/orig 2025-03-14T05:31:37.5036874Z * [new branch] gh/jamesjwu/110/base -> origin/gh/jamesjwu/110/base 2025-03-14T05:31:37.5038367Z * [new branch] gh/jamesjwu/110/head -> origin/gh/jamesjwu/110/head 2025-03-14T05:31:37.5040290Z * [new branch] gh/jamesjwu/110/orig -> origin/gh/jamesjwu/110/orig 2025-03-14T05:31:37.5042628Z * [new branch] gh/jamesjwu/111/base -> origin/gh/jamesjwu/111/base 2025-03-14T05:31:37.5044093Z * [new branch] gh/jamesjwu/111/head -> origin/gh/jamesjwu/111/head 2025-03-14T05:31:37.5045982Z * [new branch] gh/jamesjwu/111/orig -> origin/gh/jamesjwu/111/orig 2025-03-14T05:31:37.5048310Z * [new branch] gh/jamesjwu/112/base -> origin/gh/jamesjwu/112/base 2025-03-14T05:31:37.5049565Z * [new branch] gh/jamesjwu/112/head -> origin/gh/jamesjwu/112/head 2025-03-14T05:31:37.5051083Z * [new branch] gh/jamesjwu/112/orig -> origin/gh/jamesjwu/112/orig 2025-03-14T05:31:37.5053528Z * [new branch] gh/jamesjwu/113/base -> origin/gh/jamesjwu/113/base 2025-03-14T05:31:37.5054970Z * [new branch] gh/jamesjwu/113/head -> origin/gh/jamesjwu/113/head 2025-03-14T05:31:37.5056480Z * [new branch] gh/jamesjwu/113/orig -> origin/gh/jamesjwu/113/orig 2025-03-14T05:31:37.5059030Z * [new branch] gh/jamesjwu/114/base -> origin/gh/jamesjwu/114/base 2025-03-14T05:31:37.5060415Z * [new branch] gh/jamesjwu/114/head -> origin/gh/jamesjwu/114/head 2025-03-14T05:31:37.5062516Z * [new branch] gh/jamesjwu/114/orig -> origin/gh/jamesjwu/114/orig 2025-03-14T05:31:37.5064912Z * [new branch] gh/jamesjwu/115/base -> origin/gh/jamesjwu/115/base 2025-03-14T05:31:37.5067151Z * [new branch] gh/jamesjwu/115/head -> origin/gh/jamesjwu/115/head 2025-03-14T05:31:37.5070686Z * [new branch] gh/jamesjwu/115/orig -> origin/gh/jamesjwu/115/orig 2025-03-14T05:31:37.5073015Z * [new branch] gh/jamesjwu/116/base -> origin/gh/jamesjwu/116/base 2025-03-14T05:31:37.5074543Z * [new branch] gh/jamesjwu/116/head -> origin/gh/jamesjwu/116/head 2025-03-14T05:31:37.5076664Z * [new branch] gh/jamesjwu/116/orig -> origin/gh/jamesjwu/116/orig 2025-03-14T05:31:37.5078905Z * [new branch] gh/jamesjwu/117/base -> origin/gh/jamesjwu/117/base 2025-03-14T05:31:37.5080328Z * [new branch] gh/jamesjwu/117/head -> origin/gh/jamesjwu/117/head 2025-03-14T05:31:37.5082304Z * [new branch] gh/jamesjwu/117/orig -> origin/gh/jamesjwu/117/orig 2025-03-14T05:31:37.5084521Z * [new branch] gh/jamesjwu/118/base -> origin/gh/jamesjwu/118/base 2025-03-14T05:31:37.5086022Z * [new branch] gh/jamesjwu/118/head -> origin/gh/jamesjwu/118/head 2025-03-14T05:31:37.5087921Z * [new branch] gh/jamesjwu/118/orig -> origin/gh/jamesjwu/118/orig 2025-03-14T05:31:37.5090147Z * [new branch] gh/jamesjwu/119/base -> origin/gh/jamesjwu/119/base 2025-03-14T05:31:37.5091520Z * [new branch] gh/jamesjwu/119/head -> origin/gh/jamesjwu/119/head 2025-03-14T05:31:37.5093405Z * [new branch] gh/jamesjwu/119/orig -> origin/gh/jamesjwu/119/orig 2025-03-14T05:31:37.5096229Z * [new branch] gh/jamesjwu/120/base -> origin/gh/jamesjwu/120/base 2025-03-14T05:31:37.5097696Z * [new branch] gh/jamesjwu/120/head -> origin/gh/jamesjwu/120/head 2025-03-14T05:31:37.5099625Z * [new branch] gh/jamesjwu/120/orig -> origin/gh/jamesjwu/120/orig 2025-03-14T05:31:37.5101954Z * [new branch] gh/jamesjwu/52/base -> origin/gh/jamesjwu/52/base 2025-03-14T05:31:37.5103884Z * [new branch] gh/jamesjwu/52/head -> origin/gh/jamesjwu/52/head 2025-03-14T05:31:37.5106343Z * [new branch] 
gh/jamesjwu/53/base -> origin/gh/jamesjwu/53/base 2025-03-14T05:31:37.5107712Z * [new branch] gh/jamesjwu/53/head -> origin/gh/jamesjwu/53/head 2025-03-14T05:31:37.5110154Z * [new branch] gh/jamesjwu/54/base -> origin/gh/jamesjwu/54/base 2025-03-14T05:31:37.5112253Z * [new branch] gh/jamesjwu/54/head -> origin/gh/jamesjwu/54/head 2025-03-14T05:31:37.5115158Z * [new branch] gh/jamesjwu/55/base -> origin/gh/jamesjwu/55/base 2025-03-14T05:31:37.5116504Z * [new branch] gh/jamesjwu/55/head -> origin/gh/jamesjwu/55/head 2025-03-14T05:31:37.5119093Z * [new branch] gh/jamesjwu/56/base -> origin/gh/jamesjwu/56/base 2025-03-14T05:31:37.5120590Z * [new branch] gh/jamesjwu/56/head -> origin/gh/jamesjwu/56/head 2025-03-14T05:31:37.5122859Z * [new branch] gh/jamesjwu/57/base -> origin/gh/jamesjwu/57/base 2025-03-14T05:31:37.5124182Z * [new branch] gh/jamesjwu/57/head -> origin/gh/jamesjwu/57/head 2025-03-14T05:31:37.5126732Z * [new branch] gh/jamesjwu/58/base -> origin/gh/jamesjwu/58/base 2025-03-14T05:31:37.5128102Z * [new branch] gh/jamesjwu/58/head -> origin/gh/jamesjwu/58/head 2025-03-14T05:31:37.5131097Z * [new branch] gh/jamesjwu/59/base -> origin/gh/jamesjwu/59/base 2025-03-14T05:31:37.5132391Z * [new branch] gh/jamesjwu/59/head -> origin/gh/jamesjwu/59/head 2025-03-14T05:31:37.5134832Z * [new branch] gh/jamesjwu/60/base -> origin/gh/jamesjwu/60/base 2025-03-14T05:31:37.5136208Z * [new branch] gh/jamesjwu/60/head -> origin/gh/jamesjwu/60/head 2025-03-14T05:31:37.5138691Z * [new branch] gh/jamesjwu/61/base -> origin/gh/jamesjwu/61/base 2025-03-14T05:31:37.5140027Z * [new branch] gh/jamesjwu/61/head -> origin/gh/jamesjwu/61/head 2025-03-14T05:31:37.5142604Z * [new branch] gh/jamesjwu/62/base -> origin/gh/jamesjwu/62/base 2025-03-14T05:31:37.5143953Z * [new branch] gh/jamesjwu/62/head -> origin/gh/jamesjwu/62/head 2025-03-14T05:31:37.5146407Z * [new branch] gh/jamesjwu/63/base -> origin/gh/jamesjwu/63/base 2025-03-14T05:31:37.5148580Z * [new branch] gh/jamesjwu/63/head -> origin/gh/jamesjwu/63/head 2025-03-14T05:31:37.5150963Z * [new branch] gh/jamesjwu/64/base -> origin/gh/jamesjwu/64/base 2025-03-14T05:31:37.5152354Z * [new branch] gh/jamesjwu/64/head -> origin/gh/jamesjwu/64/head 2025-03-14T05:31:37.5154835Z * [new branch] gh/jamesjwu/65/base -> origin/gh/jamesjwu/65/base 2025-03-14T05:31:37.5156185Z * [new branch] gh/jamesjwu/65/head -> origin/gh/jamesjwu/65/head 2025-03-14T05:31:37.5159194Z * [new branch] gh/jamesjwu/97/base -> origin/gh/jamesjwu/97/base 2025-03-14T05:31:37.5160571Z * [new branch] gh/jamesjwu/97/head -> origin/gh/jamesjwu/97/head 2025-03-14T05:31:37.5162486Z * [new branch] gh/jamesjwu/97/orig -> origin/gh/jamesjwu/97/orig 2025-03-14T05:31:37.5165427Z * [new branch] gh/janeyx99/165/base -> origin/gh/janeyx99/165/base 2025-03-14T05:31:37.5166911Z * [new branch] gh/janeyx99/165/head -> origin/gh/janeyx99/165/head 2025-03-14T05:31:37.5169097Z * [new branch] gh/janeyx99/165/orig -> origin/gh/janeyx99/165/orig 2025-03-14T05:31:37.5171266Z * [new branch] gh/janeyx99/201/base -> origin/gh/janeyx99/201/base 2025-03-14T05:31:37.5172645Z * [new branch] gh/janeyx99/201/head -> origin/gh/janeyx99/201/head 2025-03-14T05:31:37.5174547Z * [new branch] gh/janeyx99/201/orig -> origin/gh/janeyx99/201/orig 2025-03-14T05:31:37.5176720Z * [new branch] gh/janeyx99/221/base -> origin/gh/janeyx99/221/base 2025-03-14T05:31:37.5178145Z * [new branch] gh/janeyx99/221/head -> origin/gh/janeyx99/221/head 2025-03-14T05:31:37.5180279Z * [new branch] gh/janeyx99/221/orig -> origin/gh/janeyx99/221/orig 
2025-03-14T05:31:37.5182494Z * [new branch] gh/janeyx99/222/base -> origin/gh/janeyx99/222/base 2025-03-14T05:31:37.5183994Z * [new branch] gh/janeyx99/222/head -> origin/gh/janeyx99/222/head 2025-03-14T05:31:37.5185993Z * [new branch] gh/janeyx99/222/orig -> origin/gh/janeyx99/222/orig 2025-03-14T05:31:37.5188622Z * [new branch] gh/janeyx99/223/base -> origin/gh/janeyx99/223/base 2025-03-14T05:31:37.5189862Z * [new branch] gh/janeyx99/223/head -> origin/gh/janeyx99/223/head 2025-03-14T05:31:37.5191766Z * [new branch] gh/janeyx99/223/orig -> origin/gh/janeyx99/223/orig 2025-03-14T05:31:37.5194038Z * [new branch] gh/janeyx99/224/base -> origin/gh/janeyx99/224/base 2025-03-14T05:31:37.5195653Z * [new branch] gh/janeyx99/224/head -> origin/gh/janeyx99/224/head 2025-03-14T05:31:37.5197532Z * [new branch] gh/janeyx99/224/orig -> origin/gh/janeyx99/224/orig 2025-03-14T05:31:37.5199847Z * [new branch] gh/janeyx99/225/base -> origin/gh/janeyx99/225/base 2025-03-14T05:31:37.5201000Z * [new branch] gh/janeyx99/225/head -> origin/gh/janeyx99/225/head 2025-03-14T05:31:37.5202915Z * [new branch] gh/janeyx99/225/orig -> origin/gh/janeyx99/225/orig 2025-03-14T05:31:37.5205727Z * [new branch] gh/janeyx99/226/base -> origin/gh/janeyx99/226/base 2025-03-14T05:31:37.5206954Z * [new branch] gh/janeyx99/226/head -> origin/gh/janeyx99/226/head 2025-03-14T05:31:37.5208975Z * [new branch] gh/janeyx99/226/orig -> origin/gh/janeyx99/226/orig 2025-03-14T05:31:37.5211280Z * [new branch] gh/janeyx99/227/base -> origin/gh/janeyx99/227/base 2025-03-14T05:31:37.5212710Z * [new branch] gh/janeyx99/227/head -> origin/gh/janeyx99/227/head 2025-03-14T05:31:37.5214759Z * [new branch] gh/janeyx99/227/orig -> origin/gh/janeyx99/227/orig 2025-03-14T05:31:37.5217526Z * [new branch] gh/janeyx99/228/base -> origin/gh/janeyx99/228/base 2025-03-14T05:31:37.5219044Z * [new branch] gh/janeyx99/228/head -> origin/gh/janeyx99/228/head 2025-03-14T05:31:37.5221004Z * [new branch] gh/janeyx99/228/orig -> origin/gh/janeyx99/228/orig 2025-03-14T05:31:37.5223082Z * [new branch] gh/janeyx99/229/base -> origin/gh/janeyx99/229/base 2025-03-14T05:31:37.5224457Z * [new branch] gh/janeyx99/229/head -> origin/gh/janeyx99/229/head 2025-03-14T05:31:37.5226339Z * [new branch] gh/janeyx99/229/orig -> origin/gh/janeyx99/229/orig 2025-03-14T05:31:37.5228679Z * [new branch] gh/janeyx99/230/base -> origin/gh/janeyx99/230/base 2025-03-14T05:31:37.5230049Z * [new branch] gh/janeyx99/230/head -> origin/gh/janeyx99/230/head 2025-03-14T05:31:37.5232046Z * [new branch] gh/janeyx99/230/orig -> origin/gh/janeyx99/230/orig 2025-03-14T05:31:37.5234542Z * [new branch] gh/janeyx99/231/base -> origin/gh/janeyx99/231/base 2025-03-14T05:31:37.5235953Z * [new branch] gh/janeyx99/231/head -> origin/gh/janeyx99/231/head 2025-03-14T05:31:37.5237870Z * [new branch] gh/janeyx99/231/orig -> origin/gh/janeyx99/231/orig 2025-03-14T05:31:37.5240191Z * [new branch] gh/janeyx99/88/base -> origin/gh/janeyx99/88/base 2025-03-14T05:31:37.5241573Z * [new branch] gh/janeyx99/88/head -> origin/gh/janeyx99/88/head 2025-03-14T05:31:37.5243482Z * [new branch] gh/janeyx99/88/orig -> origin/gh/janeyx99/88/orig 2025-03-14T05:31:37.5246404Z * [new branch] gh/jansel/227/base -> origin/gh/jansel/227/base 2025-03-14T05:31:37.5247867Z * [new branch] gh/jansel/227/head -> origin/gh/jansel/227/head 2025-03-14T05:31:37.5249794Z * [new branch] gh/jansel/227/orig -> origin/gh/jansel/227/orig 2025-03-14T05:31:37.5252128Z * [new branch] gh/jansel/360/base -> origin/gh/jansel/360/base 
2025-03-14T05:31:37.5253596Z * [new branch] gh/jansel/360/head -> origin/gh/jansel/360/head 2025-03-14T05:31:37.5256222Z * [new branch] gh/jansel/451/base -> origin/gh/jansel/451/base 2025-03-14T05:31:37.5257478Z * [new branch] gh/jansel/451/head -> origin/gh/jansel/451/head 2025-03-14T05:31:37.5259384Z * [new branch] gh/jansel/451/orig -> origin/gh/jansel/451/orig 2025-03-14T05:31:37.5261616Z * [new branch] gh/jansel/462/base -> origin/gh/jansel/462/base 2025-03-14T05:31:37.5262974Z * [new branch] gh/jansel/462/head -> origin/gh/jansel/462/head 2025-03-14T05:31:37.5264922Z * [new branch] gh/jansel/462/orig -> origin/gh/jansel/462/orig 2025-03-14T05:31:37.5267142Z * [new branch] gh/jansel/473/base -> origin/gh/jansel/473/base 2025-03-14T05:31:37.5268818Z * [new branch] gh/jansel/473/head -> origin/gh/jansel/473/head 2025-03-14T05:31:37.5270842Z * [new branch] gh/jansel/473/orig -> origin/gh/jansel/473/orig 2025-03-14T05:31:37.5273115Z * [new branch] gh/jansel/486/base -> origin/gh/jansel/486/base 2025-03-14T05:31:37.5274880Z * [new branch] gh/jansel/486/head -> origin/gh/jansel/486/head 2025-03-14T05:31:37.5276656Z * [new branch] gh/jansel/486/orig -> origin/gh/jansel/486/orig 2025-03-14T05:31:37.5278912Z * [new branch] gh/jansel/505/base -> origin/gh/jansel/505/base 2025-03-14T05:31:37.5280566Z * [new branch] gh/jansel/505/head -> origin/gh/jansel/505/head 2025-03-14T05:31:37.5282291Z * [new branch] gh/jansel/505/orig -> origin/gh/jansel/505/orig 2025-03-14T05:31:37.5284613Z * [new branch] gh/jansel/506/base -> origin/gh/jansel/506/base 2025-03-14T05:31:37.5286483Z * [new branch] gh/jansel/506/head -> origin/gh/jansel/506/head 2025-03-14T05:31:37.5288754Z * [new branch] gh/jansel/506/orig -> origin/gh/jansel/506/orig 2025-03-14T05:31:37.5291076Z * [new branch] gh/jansel/507/base -> origin/gh/jansel/507/base 2025-03-14T05:31:37.5292711Z * [new branch] gh/jansel/507/head -> origin/gh/jansel/507/head 2025-03-14T05:31:37.5294477Z * [new branch] gh/jansel/507/orig -> origin/gh/jansel/507/orig 2025-03-14T05:31:37.5296656Z * [new branch] gh/jansel/508/base -> origin/gh/jansel/508/base 2025-03-14T05:31:37.5298338Z * [new branch] gh/jansel/508/head -> origin/gh/jansel/508/head 2025-03-14T05:31:37.5300061Z * [new branch] gh/jansel/508/orig -> origin/gh/jansel/508/orig 2025-03-14T05:31:37.5302446Z * [new branch] gh/jansel/509/base -> origin/gh/jansel/509/base 2025-03-14T05:31:37.5304064Z * [new branch] gh/jansel/509/head -> origin/gh/jansel/509/head 2025-03-14T05:31:37.5305804Z * [new branch] gh/jansel/509/orig -> origin/gh/jansel/509/orig 2025-03-14T05:31:37.5308087Z * [new branch] gh/jansel/510/base -> origin/gh/jansel/510/base 2025-03-14T05:31:37.5309751Z * [new branch] gh/jansel/510/head -> origin/gh/jansel/510/head 2025-03-14T05:31:37.5311392Z * [new branch] gh/jansel/510/orig -> origin/gh/jansel/510/orig 2025-03-14T05:31:37.5313546Z * [new branch] gh/jansel/511/base -> origin/gh/jansel/511/base 2025-03-14T05:31:37.5315334Z * [new branch] gh/jansel/511/head -> origin/gh/jansel/511/head 2025-03-14T05:31:37.5316969Z * [new branch] gh/jansel/511/orig -> origin/gh/jansel/511/orig 2025-03-14T05:31:37.5319277Z * [new branch] gh/jansel/512/base -> origin/gh/jansel/512/base 2025-03-14T05:31:37.5320946Z * [new branch] gh/jansel/512/head -> origin/gh/jansel/512/head 2025-03-14T05:31:37.5322681Z * [new branch] gh/jansel/512/orig -> origin/gh/jansel/512/orig 2025-03-14T05:31:37.5325169Z * [new branch] gh/jansel/513/base -> origin/gh/jansel/513/base 2025-03-14T05:31:37.5326653Z * [new branch] 
gh/jansel/513/head -> origin/gh/jansel/513/head 2025-03-14T05:31:37.5328269Z * [new branch] gh/jansel/513/orig -> origin/gh/jansel/513/orig 2025-03-14T05:31:37.5330501Z * [new branch] gh/jansel/514/base -> origin/gh/jansel/514/base 2025-03-14T05:31:37.5332123Z * [new branch] gh/jansel/514/head -> origin/gh/jansel/514/head 2025-03-14T05:31:37.5333773Z * [new branch] gh/jansel/514/orig -> origin/gh/jansel/514/orig 2025-03-14T05:31:37.5336533Z * [new branch] gh/jansel/515/base -> origin/gh/jansel/515/base 2025-03-14T05:31:37.5338185Z * [new branch] gh/jansel/515/head -> origin/gh/jansel/515/head 2025-03-14T05:31:37.5340319Z * [new branch] gh/jansel/515/orig -> origin/gh/jansel/515/orig 2025-03-14T05:31:37.5342772Z * [new branch] gh/jansel/516/base -> origin/gh/jansel/516/base 2025-03-14T05:31:37.5344364Z * [new branch] gh/jansel/516/head -> origin/gh/jansel/516/head 2025-03-14T05:31:37.5346105Z * [new branch] gh/jansel/516/orig -> origin/gh/jansel/516/orig 2025-03-14T05:31:37.5348390Z * [new branch] gh/jansel/517/base -> origin/gh/jansel/517/base 2025-03-14T05:31:37.5350077Z * [new branch] gh/jansel/517/head -> origin/gh/jansel/517/head 2025-03-14T05:31:37.5351664Z * [new branch] gh/jansel/517/orig -> origin/gh/jansel/517/orig 2025-03-14T05:31:37.5353977Z * [new branch] gh/jansel/518/base -> origin/gh/jansel/518/base 2025-03-14T05:31:37.5355797Z * [new branch] gh/jansel/518/head -> origin/gh/jansel/518/head 2025-03-14T05:31:37.5357466Z * [new branch] gh/jansel/518/orig -> origin/gh/jansel/518/orig 2025-03-14T05:31:37.5359682Z * [new branch] gh/jansel/519/base -> origin/gh/jansel/519/base 2025-03-14T05:31:37.5361379Z * [new branch] gh/jansel/519/head -> origin/gh/jansel/519/head 2025-03-14T05:31:37.5363145Z * [new branch] gh/jansel/519/orig -> origin/gh/jansel/519/orig 2025-03-14T05:31:37.5365547Z * [new branch] gh/jansel/520/base -> origin/gh/jansel/520/base 2025-03-14T05:31:37.5367160Z * [new branch] gh/jansel/520/head -> origin/gh/jansel/520/head 2025-03-14T05:31:37.5372810Z * [new branch] gh/jansel/520/orig -> origin/gh/jansel/520/orig 2025-03-14T05:31:37.5375263Z * [new branch] gh/jansel/521/base -> origin/gh/jansel/521/base 2025-03-14T05:31:37.5376875Z * [new branch] gh/jansel/521/head -> origin/gh/jansel/521/head 2025-03-14T05:31:37.5378530Z * [new branch] gh/jansel/521/orig -> origin/gh/jansel/521/orig 2025-03-14T05:31:37.5381497Z * [new branch] gh/jbschlosser/195/base -> origin/gh/jbschlosser/195/base 2025-03-14T05:31:37.5383624Z * [new branch] gh/jbschlosser/195/head -> origin/gh/jbschlosser/195/head 2025-03-14T05:31:37.5385312Z * [new branch] gh/jbschlosser/195/orig -> origin/gh/jbschlosser/195/orig 2025-03-14T05:31:37.5387637Z * [new branch] gh/jbschlosser/208/base -> origin/gh/jbschlosser/208/base 2025-03-14T05:31:37.5389243Z * [new branch] gh/jbschlosser/208/head -> origin/gh/jbschlosser/208/head 2025-03-14T05:31:37.5390887Z * [new branch] gh/jbschlosser/208/orig -> origin/gh/jbschlosser/208/orig 2025-03-14T05:31:37.5393210Z * [new branch] gh/jbschlosser/214/base -> origin/gh/jbschlosser/214/base 2025-03-14T05:31:37.5395148Z * [new branch] gh/jbschlosser/214/head -> origin/gh/jbschlosser/214/head 2025-03-14T05:31:37.5397031Z * [new branch] gh/jbschlosser/214/orig -> origin/gh/jbschlosser/214/orig 2025-03-14T05:31:37.5399142Z * [new branch] gh/jbschlosser/226/base -> origin/gh/jbschlosser/226/base 2025-03-14T05:31:37.5400750Z * [new branch] gh/jbschlosser/226/head -> origin/gh/jbschlosser/226/head 2025-03-14T05:31:37.5402413Z * [new branch] gh/jbschlosser/226/orig -> 
origin/gh/jbschlosser/226/orig 2025-03-14T05:31:37.5404684Z * [new branch] gh/jbschlosser/227/base -> origin/gh/jbschlosser/227/base 2025-03-14T05:31:37.5406338Z * [new branch] gh/jbschlosser/227/head -> origin/gh/jbschlosser/227/head 2025-03-14T05:31:37.5407973Z * [new branch] gh/jbschlosser/227/orig -> origin/gh/jbschlosser/227/orig 2025-03-14T05:31:37.5410272Z * [new branch] gh/jbschlosser/228/base -> origin/gh/jbschlosser/228/base 2025-03-14T05:31:37.5412060Z * [new branch] gh/jbschlosser/228/head -> origin/gh/jbschlosser/228/head 2025-03-14T05:31:37.5413715Z * [new branch] gh/jbschlosser/228/orig -> origin/gh/jbschlosser/228/orig 2025-03-14T05:31:37.5416113Z * [new branch] gh/jbschlosser/229/base -> origin/gh/jbschlosser/229/base 2025-03-14T05:31:37.5417794Z * [new branch] gh/jbschlosser/229/head -> origin/gh/jbschlosser/229/head 2025-03-14T05:31:37.5419425Z * [new branch] gh/jbschlosser/229/orig -> origin/gh/jbschlosser/229/orig 2025-03-14T05:31:37.5421775Z * [new branch] gh/jbschlosser/230/base -> origin/gh/jbschlosser/230/base 2025-03-14T05:31:37.5423848Z * [new branch] gh/jbschlosser/230/head -> origin/gh/jbschlosser/230/head 2025-03-14T05:31:37.5425577Z * [new branch] gh/jbschlosser/230/orig -> origin/gh/jbschlosser/230/orig 2025-03-14T05:31:37.5427927Z * [new branch] gh/jbschlosser/231/base -> origin/gh/jbschlosser/231/base 2025-03-14T05:31:37.5429693Z * [new branch] gh/jbschlosser/231/head -> origin/gh/jbschlosser/231/head 2025-03-14T05:31:37.5431318Z * [new branch] gh/jbschlosser/231/orig -> origin/gh/jbschlosser/231/orig 2025-03-14T05:31:37.5433537Z * [new branch] gh/jbschlosser/89/base -> origin/gh/jbschlosser/89/base 2025-03-14T05:31:37.5435384Z * [new branch] gh/jbschlosser/89/head -> origin/gh/jbschlosser/89/head 2025-03-14T05:31:37.5437042Z * [new branch] gh/jbschlosser/89/orig -> origin/gh/jbschlosser/89/orig 2025-03-14T05:31:37.5439827Z * [new branch] gh/jcaip/70/base -> origin/gh/jcaip/70/base 2025-03-14T05:31:37.5441473Z * [new branch] gh/jcaip/70/head -> origin/gh/jcaip/70/head 2025-03-14T05:31:37.5443100Z * [new branch] gh/jcaip/70/orig -> origin/gh/jcaip/70/orig 2025-03-14T05:31:37.5445970Z * [new branch] gh/jerryzh168/855/base -> origin/gh/jerryzh168/855/base 2025-03-14T05:31:37.5447731Z * [new branch] gh/jerryzh168/855/head -> origin/gh/jerryzh168/855/head 2025-03-14T05:31:37.5449376Z * [new branch] gh/jerryzh168/855/orig -> origin/gh/jerryzh168/855/orig 2025-03-14T05:31:37.5451755Z * [new branch] gh/jerryzh168/859/base -> origin/gh/jerryzh168/859/base 2025-03-14T05:31:37.5453451Z * [new branch] gh/jerryzh168/859/head -> origin/gh/jerryzh168/859/head 2025-03-14T05:31:37.5455152Z * [new branch] gh/jerryzh168/859/orig -> origin/gh/jerryzh168/859/orig 2025-03-14T05:31:37.5457296Z * [new branch] gh/jerryzh168/860/base -> origin/gh/jerryzh168/860/base 2025-03-14T05:31:37.5459504Z * [new branch] gh/jerryzh168/860/head -> origin/gh/jerryzh168/860/head 2025-03-14T05:31:37.5461179Z * [new branch] gh/jerryzh168/860/orig -> origin/gh/jerryzh168/860/orig 2025-03-14T05:31:37.5464685Z * [new branch] gh/jgong5/23/base -> origin/gh/jgong5/23/base 2025-03-14T05:31:37.5466176Z * [new branch] gh/jgong5/23/head -> origin/gh/jgong5/23/head 2025-03-14T05:31:37.5469191Z * [new branch] gh/jiayisunx/34/base -> origin/gh/jiayisunx/34/base 2025-03-14T05:31:37.5471653Z * [new branch] gh/jiayisunx/34/head -> origin/gh/jiayisunx/34/head 2025-03-14T05:31:37.5473323Z * [new branch] gh/jiayisunx/34/orig -> origin/gh/jiayisunx/34/orig 2025-03-14T05:31:37.5475785Z * [new branch] 
gh/jiayisunx/37/base -> origin/gh/jiayisunx/37/base 2025-03-14T05:31:37.5477453Z * [new branch] gh/jiayisunx/37/head -> origin/gh/jiayisunx/37/head 2025-03-14T05:31:37.5479125Z * [new branch] gh/jiayisunx/37/orig -> origin/gh/jiayisunx/37/orig 2025-03-14T05:31:37.5481480Z * [new branch] gh/jiayisunx/50/base -> origin/gh/jiayisunx/50/base 2025-03-14T05:31:37.5483113Z * [new branch] gh/jiayisunx/50/head -> origin/gh/jiayisunx/50/head 2025-03-14T05:31:37.5484818Z * [new branch] gh/jiayisunx/50/orig -> origin/gh/jiayisunx/50/orig 2025-03-14T05:31:37.5487126Z * [new branch] gh/jiayisunx/51/base -> origin/gh/jiayisunx/51/base 2025-03-14T05:31:37.5488839Z * [new branch] gh/jiayisunx/51/head -> origin/gh/jiayisunx/51/head 2025-03-14T05:31:37.5490442Z * [new branch] gh/jiayisunx/51/orig -> origin/gh/jiayisunx/51/orig 2025-03-14T05:31:37.5492742Z * [new branch] gh/jiayisunx/53/base -> origin/gh/jiayisunx/53/base 2025-03-14T05:31:37.5494518Z * [new branch] gh/jiayisunx/53/head -> origin/gh/jiayisunx/53/head 2025-03-14T05:31:37.5496127Z * [new branch] gh/jiayisunx/53/orig -> origin/gh/jiayisunx/53/orig 2025-03-14T05:31:37.5498586Z * [new branch] gh/jiayisunx/54/base -> origin/gh/jiayisunx/54/base 2025-03-14T05:31:37.5500070Z * [new branch] gh/jiayisunx/54/head -> origin/gh/jiayisunx/54/head 2025-03-14T05:31:37.5501679Z * [new branch] gh/jiayisunx/54/orig -> origin/gh/jiayisunx/54/orig 2025-03-14T05:31:37.5503967Z * [new branch] gh/jiayisunx/55/base -> origin/gh/jiayisunx/55/base 2025-03-14T05:31:37.5505662Z * [new branch] gh/jiayisunx/55/head -> origin/gh/jiayisunx/55/head 2025-03-14T05:31:37.5507966Z * [new branch] gh/jiayisunx/55/orig -> origin/gh/jiayisunx/55/orig 2025-03-14T05:31:37.5510290Z * [new branch] gh/jiayisunx/56/base -> origin/gh/jiayisunx/56/base 2025-03-14T05:31:37.5512051Z * [new branch] gh/jiayisunx/56/head -> origin/gh/jiayisunx/56/head 2025-03-14T05:31:37.5513718Z * [new branch] gh/jiayisunx/56/orig -> origin/gh/jiayisunx/56/orig 2025-03-14T05:31:37.5516084Z * [new branch] gh/jiayisunx/57/base -> origin/gh/jiayisunx/57/base 2025-03-14T05:31:37.5517714Z * [new branch] gh/jiayisunx/57/head -> origin/gh/jiayisunx/57/head 2025-03-14T05:31:37.5519321Z * [new branch] gh/jiayisunx/57/orig -> origin/gh/jiayisunx/57/orig 2025-03-14T05:31:37.5521608Z * [new branch] gh/jiayisunx/58/base -> origin/gh/jiayisunx/58/base 2025-03-14T05:31:37.5523239Z * [new branch] gh/jiayisunx/58/head -> origin/gh/jiayisunx/58/head 2025-03-14T05:31:37.5524948Z * [new branch] gh/jiayisunx/58/orig -> origin/gh/jiayisunx/58/orig 2025-03-14T05:31:37.5527108Z * [new branch] gh/jiayisunx/59/base -> origin/gh/jiayisunx/59/base 2025-03-14T05:31:37.5528748Z * [new branch] gh/jiayisunx/59/head -> origin/gh/jiayisunx/59/head 2025-03-14T05:31:37.5530471Z * [new branch] gh/jiayisunx/59/orig -> origin/gh/jiayisunx/59/orig 2025-03-14T05:31:37.5532915Z * [new branch] gh/jiayisunx/60/base -> origin/gh/jiayisunx/60/base 2025-03-14T05:31:37.5534439Z * [new branch] gh/jiayisunx/60/head -> origin/gh/jiayisunx/60/head 2025-03-14T05:31:37.5536165Z * [new branch] gh/jiayisunx/60/orig -> origin/gh/jiayisunx/60/orig 2025-03-14T05:31:37.5538471Z * [new branch] gh/jiayisunx/61/base -> origin/gh/jiayisunx/61/base 2025-03-14T05:31:37.5540130Z * [new branch] gh/jiayisunx/61/head -> origin/gh/jiayisunx/61/head 2025-03-14T05:31:37.5541773Z * [new branch] gh/jiayisunx/61/orig -> origin/gh/jiayisunx/61/orig 2025-03-14T05:31:37.5544389Z * [new branch] gh/jjwu@meta.com/1/base -> origin/gh/jjwu@meta.com/1/base 2025-03-14T05:31:37.5546060Z * [new 
branch] gh/jjwu@meta.com/1/head -> origin/gh/jjwu@meta.com/1/head 2025-03-14T05:31:37.5548837Z * [new branch] gh/jon-chuang/1/base -> origin/gh/jon-chuang/1/base 2025-03-14T05:31:37.5550598Z * [new branch] gh/jon-chuang/1/head -> origin/gh/jon-chuang/1/head 2025-03-14T05:31:37.5552862Z * [new branch] gh/jon-chuang/12/base -> origin/gh/jon-chuang/12/base 2025-03-14T05:31:37.5555325Z * [new branch] gh/jon-chuang/13/base -> origin/gh/jon-chuang/13/base 2025-03-14T05:31:37.5557494Z * [new branch] gh/jon-chuang/14/base -> origin/gh/jon-chuang/14/base 2025-03-14T05:31:37.5559755Z * [new branch] gh/jon-chuang/16/base -> origin/gh/jon-chuang/16/base 2025-03-14T05:31:37.5561494Z * [new branch] gh/jon-chuang/16/head -> origin/gh/jon-chuang/16/head 2025-03-14T05:31:37.5563167Z * [new branch] gh/jon-chuang/16/orig -> origin/gh/jon-chuang/16/orig 2025-03-14T05:31:37.5565433Z * [new branch] gh/jon-chuang/19/base -> origin/gh/jon-chuang/19/base 2025-03-14T05:31:37.5567088Z * [new branch] gh/jon-chuang/19/head -> origin/gh/jon-chuang/19/head 2025-03-14T05:31:37.5569084Z * [new branch] gh/jon-chuang/19/orig -> origin/gh/jon-chuang/19/orig 2025-03-14T05:31:37.5571394Z * [new branch] gh/jon-chuang/2/base -> origin/gh/jon-chuang/2/base 2025-03-14T05:31:37.5572972Z * [new branch] gh/jon-chuang/2/head -> origin/gh/jon-chuang/2/head 2025-03-14T05:31:37.5575108Z * [new branch] gh/jon-chuang/3/base -> origin/gh/jon-chuang/3/base 2025-03-14T05:31:37.5576725Z * [new branch] gh/jon-chuang/3/head -> origin/gh/jon-chuang/3/head 2025-03-14T05:31:37.5578917Z * [new branch] gh/jon-chuang/4/base -> origin/gh/jon-chuang/4/base 2025-03-14T05:31:37.5580219Z * [new branch] gh/jon-chuang/4/head -> origin/gh/jon-chuang/4/head 2025-03-14T05:31:37.5583134Z * [new branch] gh/jon-chuang/5/base -> origin/gh/jon-chuang/5/base 2025-03-14T05:31:37.5584798Z * [new branch] gh/jon-chuang/5/head -> origin/gh/jon-chuang/5/head 2025-03-14T05:31:37.5586951Z * [new branch] gh/jon-chuang/6/base -> origin/gh/jon-chuang/6/base 2025-03-14T05:31:37.5588544Z * [new branch] gh/jon-chuang/6/head -> origin/gh/jon-chuang/6/head 2025-03-14T05:31:37.5591013Z * [new branch] gh/jon-chuang/7/base -> origin/gh/jon-chuang/7/base 2025-03-14T05:31:37.5592685Z * [new branch] gh/jon-chuang/7/head -> origin/gh/jon-chuang/7/head 2025-03-14T05:31:37.5595489Z * [new branch] gh/jon-chuang/8/base -> origin/gh/jon-chuang/8/base 2025-03-14T05:31:37.5597348Z * [new branch] gh/jon-chuang/8/head -> origin/gh/jon-chuang/8/head 2025-03-14T05:31:37.5599961Z * [new branch] gh/justinchuby/102/base -> origin/gh/justinchuby/102/base 2025-03-14T05:31:37.5601511Z * [new branch] gh/justinchuby/102/head -> origin/gh/justinchuby/102/head 2025-03-14T05:31:37.5603535Z * [new branch] gh/justinchuby/102/orig -> origin/gh/justinchuby/102/orig 2025-03-14T05:31:37.5606026Z * [new branch] gh/justinchuby/103/base -> origin/gh/justinchuby/103/base 2025-03-14T05:31:37.5607758Z * [new branch] gh/justinchuby/103/head -> origin/gh/justinchuby/103/head 2025-03-14T05:31:37.5615378Z * [new branch] gh/justinchuby/103/orig -> origin/gh/justinchuby/103/orig 2025-03-14T05:31:37.5615928Z * [new branch] gh/justinchuby/104/base -> origin/gh/justinchuby/104/base 2025-03-14T05:31:37.5616274Z * [new branch] gh/justinchuby/104/head -> origin/gh/justinchuby/104/head 2025-03-14T05:31:37.5616610Z * [new branch] gh/justinchuby/104/orig -> origin/gh/justinchuby/104/orig 2025-03-14T05:31:37.5616944Z * [new branch] gh/justinchuby/105/base -> origin/gh/justinchuby/105/base 2025-03-14T05:31:37.5618594Z * [new branch] 
gh/justinchuby/105/head -> origin/gh/justinchuby/105/head 2025-03-14T05:31:37.5620806Z * [new branch] gh/justinchuby/105/orig -> origin/gh/justinchuby/105/orig 2025-03-14T05:31:37.5622959Z * [new branch] gh/justinchuby/106/base -> origin/gh/justinchuby/106/base 2025-03-14T05:31:37.5624678Z * [new branch] gh/justinchuby/106/head -> origin/gh/justinchuby/106/head 2025-03-14T05:31:37.5626251Z * [new branch] gh/justinchuby/106/orig -> origin/gh/justinchuby/106/orig 2025-03-14T05:31:37.5628512Z * [new branch] gh/justinchuby/107/base -> origin/gh/justinchuby/107/base 2025-03-14T05:31:37.5630141Z * [new branch] gh/justinchuby/107/head -> origin/gh/justinchuby/107/head 2025-03-14T05:31:37.5631893Z * [new branch] gh/justinchuby/107/orig -> origin/gh/justinchuby/107/orig 2025-03-14T05:31:37.5634184Z * [new branch] gh/justinchuby/108/base -> origin/gh/justinchuby/108/base 2025-03-14T05:31:37.5635961Z * [new branch] gh/justinchuby/108/head -> origin/gh/justinchuby/108/head 2025-03-14T05:31:37.5637758Z * [new branch] gh/justinchuby/108/orig -> origin/gh/justinchuby/108/orig 2025-03-14T05:31:37.5640018Z * [new branch] gh/justinchuby/109/base -> origin/gh/justinchuby/109/base 2025-03-14T05:31:37.5641555Z * [new branch] gh/justinchuby/109/head -> origin/gh/justinchuby/109/head 2025-03-14T05:31:37.5643743Z * [new branch] gh/justinchuby/109/orig -> origin/gh/justinchuby/109/orig 2025-03-14T05:31:37.5645954Z * [new branch] gh/justinchuby/110/base -> origin/gh/justinchuby/110/base 2025-03-14T05:31:37.5647594Z * [new branch] gh/justinchuby/110/head -> origin/gh/justinchuby/110/head 2025-03-14T05:31:37.5649261Z * [new branch] gh/justinchuby/110/orig -> origin/gh/justinchuby/110/orig 2025-03-14T05:31:37.5651561Z * [new branch] gh/justinchuby/111/base -> origin/gh/justinchuby/111/base 2025-03-14T05:31:37.5653228Z * [new branch] gh/justinchuby/111/head -> origin/gh/justinchuby/111/head 2025-03-14T05:31:37.5654998Z * [new branch] gh/justinchuby/111/orig -> origin/gh/justinchuby/111/orig 2025-03-14T05:31:37.5657100Z * [new branch] gh/justinchuby/112/base -> origin/gh/justinchuby/112/base 2025-03-14T05:31:37.5658823Z * [new branch] gh/justinchuby/112/head -> origin/gh/justinchuby/112/head 2025-03-14T05:31:37.5660551Z * [new branch] gh/justinchuby/112/orig -> origin/gh/justinchuby/112/orig 2025-03-14T05:31:37.5663830Z * [new branch] gh/justinchuby/113/base -> origin/gh/justinchuby/113/base 2025-03-14T05:31:37.5665529Z * [new branch] gh/justinchuby/113/head -> origin/gh/justinchuby/113/head 2025-03-14T05:31:37.5667226Z * [new branch] gh/justinchuby/113/orig -> origin/gh/justinchuby/113/orig 2025-03-14T05:31:37.5672465Z * [new branch] gh/justinchuby/114/base -> origin/gh/justinchuby/114/base 2025-03-14T05:31:37.5673981Z * [new branch] gh/justinchuby/114/head -> origin/gh/justinchuby/114/head 2025-03-14T05:31:37.5675771Z * [new branch] gh/justinchuby/114/orig -> origin/gh/justinchuby/114/orig 2025-03-14T05:31:37.5678582Z * [new branch] gh/justinchuby/115/base -> origin/gh/justinchuby/115/base 2025-03-14T05:31:37.5680243Z * [new branch] gh/justinchuby/115/head -> origin/gh/justinchuby/115/head 2025-03-14T05:31:37.5681886Z * [new branch] gh/justinchuby/115/orig -> origin/gh/justinchuby/115/orig 2025-03-14T05:31:37.5685179Z * [new branch] gh/kadeng/1/base -> origin/gh/kadeng/1/base 2025-03-14T05:31:37.5686626Z * [new branch] gh/kadeng/1/head -> origin/gh/kadeng/1/head 2025-03-14T05:31:37.5688779Z * [new branch] gh/kadeng/1/orig -> origin/gh/kadeng/1/orig 2025-03-14T05:31:37.5691021Z * [new branch] gh/kadeng/12/base -> 
origin/gh/kadeng/12/base 2025-03-14T05:31:37.5692740Z * [new branch] gh/kadeng/12/head -> origin/gh/kadeng/12/head 2025-03-14T05:31:37.5695157Z * [new branch] gh/kadeng/13/base -> origin/gh/kadeng/13/base 2025-03-14T05:31:37.5696817Z * [new branch] gh/kadeng/13/head -> origin/gh/kadeng/13/head 2025-03-14T05:31:37.5698897Z * [new branch] gh/kadeng/14/base -> origin/gh/kadeng/14/base 2025-03-14T05:31:37.5700506Z * [new branch] gh/kadeng/14/head -> origin/gh/kadeng/14/head 2025-03-14T05:31:37.5702791Z * [new branch] gh/kadeng/16/base -> origin/gh/kadeng/16/base 2025-03-14T05:31:37.5704517Z * [new branch] gh/kadeng/16/head -> origin/gh/kadeng/16/head 2025-03-14T05:31:37.5707256Z * [new branch] gh/kadeng/6/base -> origin/gh/kadeng/6/base 2025-03-14T05:31:37.5709085Z * [new branch] gh/kadeng/6/head -> origin/gh/kadeng/6/head 2025-03-14T05:31:37.5711232Z * [new branch] gh/kadeng/7/base -> origin/gh/kadeng/7/base 2025-03-14T05:31:37.5713352Z * [new branch] gh/kadeng/9/base -> origin/gh/kadeng/9/base 2025-03-14T05:31:37.5715004Z * [new branch] gh/kadeng/9/head -> origin/gh/kadeng/9/head 2025-03-14T05:31:37.5717783Z * [new branch] gh/kimishpatel/186/base -> origin/gh/kimishpatel/186/base 2025-03-14T05:31:37.5719458Z * [new branch] gh/kimishpatel/186/head -> origin/gh/kimishpatel/186/head 2025-03-14T05:31:37.5721124Z * [new branch] gh/kimishpatel/186/orig -> origin/gh/kimishpatel/186/orig 2025-03-14T05:31:37.5723914Z * [new branch] gh/kurtamohler/31/base -> origin/gh/kurtamohler/31/base 2025-03-14T05:31:37.5725554Z * [new branch] gh/kurtamohler/31/head -> origin/gh/kurtamohler/31/head 2025-03-14T05:31:37.5727196Z * [new branch] gh/kurtamohler/31/orig -> origin/gh/kurtamohler/31/orig 2025-03-14T05:31:37.5729509Z * [new branch] gh/kurtamohler/32/base -> origin/gh/kurtamohler/32/base 2025-03-14T05:31:37.5731132Z * [new branch] gh/kurtamohler/32/head -> origin/gh/kurtamohler/32/head 2025-03-14T05:31:37.5732775Z * [new branch] gh/kurtamohler/32/orig -> origin/gh/kurtamohler/32/orig 2025-03-14T05:31:37.5735746Z * [new branch] gh/kwen2501/1/base -> origin/gh/kwen2501/1/base 2025-03-14T05:31:37.5737380Z * [new branch] gh/kwen2501/1/head -> origin/gh/kwen2501/1/head 2025-03-14T05:31:37.5739953Z * [new branch] gh/kwen2501/108/base -> origin/gh/kwen2501/108/base 2025-03-14T05:31:37.5741598Z * [new branch] gh/kwen2501/108/head -> origin/gh/kwen2501/108/head 2025-03-14T05:31:37.5743294Z * [new branch] gh/kwen2501/108/orig -> origin/gh/kwen2501/108/orig 2025-03-14T05:31:37.5745707Z * [new branch] gh/kwen2501/109/base -> origin/gh/kwen2501/109/base 2025-03-14T05:31:37.5747323Z * [new branch] gh/kwen2501/109/head -> origin/gh/kwen2501/109/head 2025-03-14T05:31:37.5749057Z * [new branch] gh/kwen2501/109/orig -> origin/gh/kwen2501/109/orig 2025-03-14T05:31:37.5751438Z * [new branch] gh/kwen2501/118/base -> origin/gh/kwen2501/118/base 2025-03-14T05:31:37.5753180Z * [new branch] gh/kwen2501/118/head -> origin/gh/kwen2501/118/head 2025-03-14T05:31:37.5754912Z * [new branch] gh/kwen2501/118/orig -> origin/gh/kwen2501/118/orig 2025-03-14T05:31:37.5757198Z * [new branch] gh/kwen2501/123/base -> origin/gh/kwen2501/123/base 2025-03-14T05:31:37.5758886Z * [new branch] gh/kwen2501/123/head -> origin/gh/kwen2501/123/head 2025-03-14T05:31:37.5760558Z * [new branch] gh/kwen2501/123/orig -> origin/gh/kwen2501/123/orig 2025-03-14T05:31:37.5762822Z * [new branch] gh/kwen2501/125/base -> origin/gh/kwen2501/125/base 2025-03-14T05:31:37.5764579Z * [new branch] gh/kwen2501/125/head -> origin/gh/kwen2501/125/head 
2025-03-14T05:31:37.5766245Z * [new branch] gh/kwen2501/125/orig -> origin/gh/kwen2501/125/orig 2025-03-14T05:31:37.5768716Z * [new branch] gh/kwen2501/126/base -> origin/gh/kwen2501/126/base 2025-03-14T05:31:37.5770520Z * [new branch] gh/kwen2501/126/head -> origin/gh/kwen2501/126/head 2025-03-14T05:31:37.5772173Z * [new branch] gh/kwen2501/126/orig -> origin/gh/kwen2501/126/orig 2025-03-14T05:31:37.5774627Z * [new branch] gh/kwen2501/127/base -> origin/gh/kwen2501/127/base 2025-03-14T05:31:37.5776273Z * [new branch] gh/kwen2501/127/head -> origin/gh/kwen2501/127/head 2025-03-14T05:31:37.5777988Z * [new branch] gh/kwen2501/127/orig -> origin/gh/kwen2501/127/orig 2025-03-14T05:31:37.5780519Z * [new branch] gh/kwen2501/128/base -> origin/gh/kwen2501/128/base 2025-03-14T05:31:37.5782217Z * [new branch] gh/kwen2501/128/head -> origin/gh/kwen2501/128/head 2025-03-14T05:31:37.5783980Z * [new branch] gh/kwen2501/128/orig -> origin/gh/kwen2501/128/orig 2025-03-14T05:31:37.5786278Z * [new branch] gh/kwen2501/129/base -> origin/gh/kwen2501/129/base 2025-03-14T05:31:37.5787884Z * [new branch] gh/kwen2501/129/head -> origin/gh/kwen2501/129/head 2025-03-14T05:31:37.5789594Z * [new branch] gh/kwen2501/129/orig -> origin/gh/kwen2501/129/orig 2025-03-14T05:31:37.5791930Z * [new branch] gh/kwen2501/130/base -> origin/gh/kwen2501/130/base 2025-03-14T05:31:37.5793672Z * [new branch] gh/kwen2501/130/head -> origin/gh/kwen2501/130/head 2025-03-14T05:31:37.5795545Z * [new branch] gh/kwen2501/130/orig -> origin/gh/kwen2501/130/orig 2025-03-14T05:31:37.5797768Z * [new branch] gh/kwen2501/131/base -> origin/gh/kwen2501/131/base 2025-03-14T05:31:37.5799368Z * [new branch] gh/kwen2501/131/head -> origin/gh/kwen2501/131/head 2025-03-14T05:31:37.5800972Z * [new branch] gh/kwen2501/131/orig -> origin/gh/kwen2501/131/orig 2025-03-14T05:31:37.5803337Z * [new branch] gh/kwen2501/132/base -> origin/gh/kwen2501/132/base 2025-03-14T05:31:37.5805114Z * [new branch] gh/kwen2501/132/head -> origin/gh/kwen2501/132/head 2025-03-14T05:31:37.5807225Z * [new branch] gh/kwen2501/132/orig -> origin/gh/kwen2501/132/orig 2025-03-14T05:31:37.5809337Z * [new branch] gh/kwen2501/133/base -> origin/gh/kwen2501/133/base 2025-03-14T05:31:37.5810985Z * [new branch] gh/kwen2501/133/head -> origin/gh/kwen2501/133/head 2025-03-14T05:31:37.5812854Z * [new branch] gh/kwen2501/133/orig -> origin/gh/kwen2501/133/orig 2025-03-14T05:31:37.5814851Z * [new branch] gh/kwen2501/134/base -> origin/gh/kwen2501/134/base 2025-03-14T05:31:37.5816710Z * [new branch] gh/kwen2501/134/head -> origin/gh/kwen2501/134/head 2025-03-14T05:31:37.5818356Z * [new branch] gh/kwen2501/134/orig -> origin/gh/kwen2501/134/orig 2025-03-14T05:31:37.5820787Z * [new branch] gh/kwen2501/15/base -> origin/gh/kwen2501/15/base 2025-03-14T05:31:37.5822411Z * [new branch] gh/kwen2501/15/head -> origin/gh/kwen2501/15/head 2025-03-14T05:31:37.5824908Z * [new branch] gh/kwen2501/87/base -> origin/gh/kwen2501/87/base 2025-03-14T05:31:37.5826538Z * [new branch] gh/kwen2501/87/head -> origin/gh/kwen2501/87/head 2025-03-14T05:31:37.5828205Z * [new branch] gh/kwen2501/87/orig -> origin/gh/kwen2501/87/orig 2025-03-14T05:31:37.5830977Z * [new branch] gh/kwen2501/97/base -> origin/gh/kwen2501/97/base 2025-03-14T05:31:37.5832705Z * [new branch] gh/kwen2501/97/head -> origin/gh/kwen2501/97/head 2025-03-14T05:31:37.5834403Z * [new branch] gh/kwen2501/97/orig -> origin/gh/kwen2501/97/orig 2025-03-14T05:31:37.5837349Z * [new branch] gh/laithsakka/107/base -> origin/gh/laithsakka/107/base 
2025-03-14T05:31:37.5838962Z * [new branch] gh/laithsakka/107/head -> origin/gh/laithsakka/107/head 2025-03-14T05:31:37.5840646Z * [new branch] gh/laithsakka/107/orig -> origin/gh/laithsakka/107/orig 2025-03-14T05:31:37.5843176Z * [new branch] gh/laithsakka/108/base -> origin/gh/laithsakka/108/base 2025-03-14T05:31:37.5844873Z * [new branch] gh/laithsakka/108/head -> origin/gh/laithsakka/108/head 2025-03-14T05:31:37.5846581Z * [new branch] gh/laithsakka/108/orig -> origin/gh/laithsakka/108/orig 2025-03-14T05:31:37.5848796Z * [new branch] gh/laithsakka/109/base -> origin/gh/laithsakka/109/base 2025-03-14T05:31:37.5850435Z * [new branch] gh/laithsakka/109/head -> origin/gh/laithsakka/109/head 2025-03-14T05:31:37.5852099Z * [new branch] gh/laithsakka/109/orig -> origin/gh/laithsakka/109/orig 2025-03-14T05:31:37.5854187Z * [new branch] gh/laithsakka/110/base -> origin/gh/laithsakka/110/base 2025-03-14T05:31:37.5856099Z * [new branch] gh/laithsakka/110/head -> origin/gh/laithsakka/110/head 2025-03-14T05:31:37.5857453Z * [new branch] gh/laithsakka/110/orig -> origin/gh/laithsakka/110/orig 2025-03-14T05:31:37.5860051Z * [new branch] gh/laithsakka/111/base -> origin/gh/laithsakka/111/base 2025-03-14T05:31:37.5861780Z * [new branch] gh/laithsakka/111/head -> origin/gh/laithsakka/111/head 2025-03-14T05:31:37.5863428Z * [new branch] gh/laithsakka/111/orig -> origin/gh/laithsakka/111/orig 2025-03-14T05:31:37.5866694Z * [new branch] gh/laithsakka/112/base -> origin/gh/laithsakka/112/base 2025-03-14T05:31:37.5868830Z * [new branch] gh/laithsakka/112/head -> origin/gh/laithsakka/112/head 2025-03-14T05:31:37.5870773Z * [new branch] gh/laithsakka/112/orig -> origin/gh/laithsakka/112/orig 2025-03-14T05:31:37.5872948Z * [new branch] gh/laithsakka/113/base -> origin/gh/laithsakka/113/base 2025-03-14T05:31:37.5874687Z * [new branch] gh/laithsakka/113/head -> origin/gh/laithsakka/113/head 2025-03-14T05:31:37.5876391Z * [new branch] gh/laithsakka/113/orig -> origin/gh/laithsakka/113/orig 2025-03-14T05:31:37.5878781Z * [new branch] gh/laithsakka/114/base -> origin/gh/laithsakka/114/base 2025-03-14T05:31:37.5880638Z * [new branch] gh/laithsakka/114/head -> origin/gh/laithsakka/114/head 2025-03-14T05:31:37.5882211Z * [new branch] gh/laithsakka/114/orig -> origin/gh/laithsakka/114/orig 2025-03-14T05:31:37.5884261Z * [new branch] gh/laithsakka/115/base -> origin/gh/laithsakka/115/base 2025-03-14T05:31:37.5885892Z * [new branch] gh/laithsakka/115/head -> origin/gh/laithsakka/115/head 2025-03-14T05:31:37.5887558Z * [new branch] gh/laithsakka/115/orig -> origin/gh/laithsakka/115/orig 2025-03-14T05:31:37.5889898Z * [new branch] gh/laithsakka/116/base -> origin/gh/laithsakka/116/base 2025-03-14T05:31:37.5891481Z * [new branch] gh/laithsakka/116/head -> origin/gh/laithsakka/116/head 2025-03-14T05:31:37.5893135Z * [new branch] gh/laithsakka/116/orig -> origin/gh/laithsakka/116/orig 2025-03-14T05:31:37.5895598Z * [new branch] gh/laithsakka/117/base -> origin/gh/laithsakka/117/base 2025-03-14T05:31:37.5897305Z * [new branch] gh/laithsakka/117/head -> origin/gh/laithsakka/117/head 2025-03-14T05:31:37.5898854Z * [new branch] gh/laithsakka/117/orig -> origin/gh/laithsakka/117/orig 2025-03-14T05:31:37.5900955Z * [new branch] gh/laithsakka/118/base -> origin/gh/laithsakka/118/base 2025-03-14T05:31:37.5902640Z * [new branch] gh/laithsakka/118/head -> origin/gh/laithsakka/118/head 2025-03-14T05:31:37.5904331Z * [new branch] gh/laithsakka/118/orig -> origin/gh/laithsakka/118/orig 2025-03-14T05:31:37.5906499Z * [new branch] 
gh/laithsakka/119/base -> origin/gh/laithsakka/119/base 2025-03-14T05:31:37.5908123Z * [new branch] gh/laithsakka/119/head -> origin/gh/laithsakka/119/head 2025-03-14T05:31:37.5909767Z * [new branch] gh/laithsakka/119/orig -> origin/gh/laithsakka/119/orig 2025-03-14T05:31:37.5911999Z * [new branch] gh/laithsakka/120/base -> origin/gh/laithsakka/120/base 2025-03-14T05:31:37.5913606Z * [new branch] gh/laithsakka/120/head -> origin/gh/laithsakka/120/head 2025-03-14T05:31:37.5915304Z * [new branch] gh/laithsakka/120/orig -> origin/gh/laithsakka/120/orig 2025-03-14T05:31:37.5917701Z * [new branch] gh/laithsakka/28/base -> origin/gh/laithsakka/28/base 2025-03-14T05:31:37.5919788Z * [new branch] gh/laithsakka/29/base -> origin/gh/laithsakka/29/base 2025-03-14T05:31:37.5921890Z * [new branch] gh/laithsakka/30/base -> origin/gh/laithsakka/30/base 2025-03-14T05:31:37.5923540Z * [new branch] gh/laithsakka/30/head -> origin/gh/laithsakka/30/head 2025-03-14T05:31:37.5926311Z * [new branch] gh/laithsakka/31/base -> origin/gh/laithsakka/31/base 2025-03-14T05:31:37.5927868Z * [new branch] gh/laithsakka/31/head -> origin/gh/laithsakka/31/head 2025-03-14T05:31:37.5930170Z * [new branch] gh/laithsakka/32/base -> origin/gh/laithsakka/32/base 2025-03-14T05:31:37.5931818Z * [new branch] gh/laithsakka/32/head -> origin/gh/laithsakka/32/head 2025-03-14T05:31:37.5934751Z * [new branch] gh/larryliu0820/46/base -> origin/gh/larryliu0820/46/base 2025-03-14T05:31:37.5936534Z * [new branch] gh/larryliu0820/46/head -> origin/gh/larryliu0820/46/head 2025-03-14T05:31:37.5938245Z * [new branch] gh/larryliu0820/46/orig -> origin/gh/larryliu0820/46/orig 2025-03-14T05:31:37.5941023Z * [new branch] gh/leslie-fang-intel/180/base -> origin/gh/leslie-fang-intel/180/base 2025-03-14T05:31:37.5942683Z * [new branch] gh/leslie-fang-intel/180/head -> origin/gh/leslie-fang-intel/180/head 2025-03-14T05:31:37.5944338Z * [new branch] gh/leslie-fang-intel/180/orig -> origin/gh/leslie-fang-intel/180/orig 2025-03-14T05:31:37.5946732Z * [new branch] gh/leslie-fang-intel/181/base -> origin/gh/leslie-fang-intel/181/base 2025-03-14T05:31:37.5948300Z * [new branch] gh/leslie-fang-intel/181/head -> origin/gh/leslie-fang-intel/181/head 2025-03-14T05:31:37.5949976Z * [new branch] gh/leslie-fang-intel/181/orig -> origin/gh/leslie-fang-intel/181/orig 2025-03-14T05:31:37.5952872Z * [new branch] gh/leslie-fang-intel/182/base -> origin/gh/leslie-fang-intel/182/base 2025-03-14T05:31:37.5954587Z * [new branch] gh/leslie-fang-intel/182/head -> origin/gh/leslie-fang-intel/182/head 2025-03-14T05:31:37.5956246Z * [new branch] gh/leslie-fang-intel/182/orig -> origin/gh/leslie-fang-intel/182/orig 2025-03-14T05:31:37.5958630Z * [new branch] gh/leslie-fang-intel/183/base -> origin/gh/leslie-fang-intel/183/base 2025-03-14T05:31:37.5960241Z * [new branch] gh/leslie-fang-intel/183/head -> origin/gh/leslie-fang-intel/183/head 2025-03-14T05:31:37.5961923Z * [new branch] gh/leslie-fang-intel/183/orig -> origin/gh/leslie-fang-intel/183/orig 2025-03-14T05:31:37.5964131Z * [new branch] gh/leslie-fang-intel/184/base -> origin/gh/leslie-fang-intel/184/base 2025-03-14T05:31:37.5965838Z * [new branch] gh/leslie-fang-intel/184/head -> origin/gh/leslie-fang-intel/184/head 2025-03-14T05:31:37.5967443Z * [new branch] gh/leslie-fang-intel/184/orig -> origin/gh/leslie-fang-intel/184/orig 2025-03-14T05:31:37.5971893Z * [new branch] gh/leslie-fang-intel/185/base -> origin/gh/leslie-fang-intel/185/base 2025-03-14T05:31:37.5973548Z * [new branch] gh/leslie-fang-intel/185/head -> 
origin/gh/leslie-fang-intel/185/head 2025-03-14T05:31:37.5975433Z * [new branch] gh/leslie-fang-intel/185/orig -> origin/gh/leslie-fang-intel/185/orig 2025-03-14T05:31:37.5977614Z * [new branch] gh/leslie-fang-intel/186/base -> origin/gh/leslie-fang-intel/186/base 2025-03-14T05:31:37.5979240Z * [new branch] gh/leslie-fang-intel/186/head -> origin/gh/leslie-fang-intel/186/head 2025-03-14T05:31:37.5980912Z * [new branch] gh/leslie-fang-intel/186/orig -> origin/gh/leslie-fang-intel/186/orig 2025-03-14T05:31:37.5983153Z * [new branch] gh/leslie-fang-intel/187/base -> origin/gh/leslie-fang-intel/187/base 2025-03-14T05:31:37.5984766Z * [new branch] gh/leslie-fang-intel/187/head -> origin/gh/leslie-fang-intel/187/head 2025-03-14T05:31:37.5986389Z * [new branch] gh/leslie-fang-intel/187/orig -> origin/gh/leslie-fang-intel/187/orig 2025-03-14T05:31:37.5988715Z * [new branch] gh/leslie-fang-intel/188/base -> origin/gh/leslie-fang-intel/188/base 2025-03-14T05:31:37.5990141Z * [new branch] gh/leslie-fang-intel/188/head -> origin/gh/leslie-fang-intel/188/head 2025-03-14T05:31:37.5991994Z * [new branch] gh/leslie-fang-intel/188/orig -> origin/gh/leslie-fang-intel/188/orig 2025-03-14T05:31:37.5995158Z * [new branch] gh/lw/5/head -> origin/gh/lw/5/head 2025-03-14T05:31:37.5997525Z * [new branch] gh/lw/6/base -> origin/gh/lw/6/base 2025-03-14T05:31:37.5999248Z * [new branch] gh/lw/6/head -> origin/gh/lw/6/head 2025-03-14T05:31:37.6001029Z * [new branch] gh/lw/6/orig -> origin/gh/lw/6/orig 2025-03-14T05:31:37.6003882Z * [new branch] gh/lw/7/base -> origin/gh/lw/7/base 2025-03-14T05:31:37.6005631Z * [new branch] gh/lw/7/head -> origin/gh/lw/7/head 2025-03-14T05:31:37.6007315Z * [new branch] gh/lw/7/orig -> origin/gh/lw/7/orig 2025-03-14T05:31:37.6009549Z * [new branch] gh/lw/8/base -> origin/gh/lw/8/base 2025-03-14T05:31:37.6011150Z * [new branch] gh/lw/8/head -> origin/gh/lw/8/head 2025-03-14T05:31:37.6012820Z * [new branch] gh/lw/8/orig -> origin/gh/lw/8/orig 2025-03-14T05:31:37.6016310Z * [new branch] gh/malfet/14/base -> origin/gh/malfet/14/base 2025-03-14T05:31:37.6018343Z * [new branch] gh/malfet/155/base -> origin/gh/malfet/155/base 2025-03-14T05:31:37.6019970Z * [new branch] gh/malfet/155/head -> origin/gh/malfet/155/head 2025-03-14T05:31:37.6021714Z * [new branch] gh/malfet/155/orig -> origin/gh/malfet/155/orig 2025-03-14T05:31:37.6024138Z * [new branch] gh/malfet/159/base -> origin/gh/malfet/159/base 2025-03-14T05:31:37.6026116Z * [new branch] gh/malfet/159/head -> origin/gh/malfet/159/head 2025-03-14T05:31:37.6027767Z * [new branch] gh/malfet/159/orig -> origin/gh/malfet/159/orig 2025-03-14T05:31:37.6030090Z * [new branch] gh/malfet/169/base -> origin/gh/malfet/169/base 2025-03-14T05:31:37.6031705Z * [new branch] gh/malfet/169/head -> origin/gh/malfet/169/head 2025-03-14T05:31:37.6033548Z * [new branch] gh/malfet/169/orig -> origin/gh/malfet/169/orig 2025-03-14T05:31:37.6036216Z * [new branch] gh/malfet/178/base -> origin/gh/malfet/178/base 2025-03-14T05:31:37.6037824Z * [new branch] gh/malfet/178/head -> origin/gh/malfet/178/head 2025-03-14T05:31:37.6039539Z * [new branch] gh/malfet/178/orig -> origin/gh/malfet/178/orig 2025-03-14T05:31:37.6041748Z * [new branch] gh/malfet/179/base -> origin/gh/malfet/179/base 2025-03-14T05:31:37.6043393Z * [new branch] gh/malfet/179/head -> origin/gh/malfet/179/head 2025-03-14T05:31:37.6045135Z * [new branch] gh/malfet/179/orig -> origin/gh/malfet/179/orig 2025-03-14T05:31:37.6047419Z * [new branch] gh/malfet/180/base -> origin/gh/malfet/180/base 
2025-03-14T05:31:37.6049052Z * [new branch] gh/malfet/180/head -> origin/gh/malfet/180/head 2025-03-14T05:31:37.6050729Z * [new branch] gh/malfet/180/orig -> origin/gh/malfet/180/orig 2025-03-14T05:31:37.6053011Z * [new branch] gh/malfet/181/base -> origin/gh/malfet/181/base 2025-03-14T05:31:37.6055085Z * [new branch] gh/malfet/181/head -> origin/gh/malfet/181/head 2025-03-14T05:31:37.6056841Z * [new branch] gh/malfet/181/orig -> origin/gh/malfet/181/orig 2025-03-14T05:31:37.6059092Z * [new branch] gh/malfet/182/base -> origin/gh/malfet/182/base 2025-03-14T05:31:37.6060784Z * [new branch] gh/malfet/182/head -> origin/gh/malfet/182/head 2025-03-14T05:31:37.6062498Z * [new branch] gh/malfet/182/orig -> origin/gh/malfet/182/orig 2025-03-14T05:31:37.6064819Z * [new branch] gh/malfet/183/base -> origin/gh/malfet/183/base 2025-03-14T05:31:37.6066451Z * [new branch] gh/malfet/183/head -> origin/gh/malfet/183/head 2025-03-14T05:31:37.6068533Z * [new branch] gh/malfet/183/orig -> origin/gh/malfet/183/orig 2025-03-14T05:31:37.6070851Z * [new branch] gh/malfet/184/base -> origin/gh/malfet/184/base 2025-03-14T05:31:37.6072891Z * [new branch] gh/malfet/184/head -> origin/gh/malfet/184/head 2025-03-14T05:31:37.6074721Z * [new branch] gh/malfet/184/orig -> origin/gh/malfet/184/orig 2025-03-14T05:31:37.6077420Z * [new branch] gh/malfet/185/base -> origin/gh/malfet/185/base 2025-03-14T05:31:37.6078984Z * [new branch] gh/malfet/185/head -> origin/gh/malfet/185/head 2025-03-14T05:31:37.6080645Z * [new branch] gh/malfet/185/orig -> origin/gh/malfet/185/orig 2025-03-14T05:31:37.6082901Z * [new branch] gh/malfet/186/base -> origin/gh/malfet/186/base 2025-03-14T05:31:37.6084571Z * [new branch] gh/malfet/186/head -> origin/gh/malfet/186/head 2025-03-14T05:31:37.6086409Z * [new branch] gh/malfet/186/orig -> origin/gh/malfet/186/orig 2025-03-14T05:31:37.6088515Z * [new branch] gh/malfet/187/base -> origin/gh/malfet/187/base 2025-03-14T05:31:37.6090105Z * [new branch] gh/malfet/187/head -> origin/gh/malfet/187/head 2025-03-14T05:31:37.6091922Z * [new branch] gh/malfet/187/orig -> origin/gh/malfet/187/orig 2025-03-14T05:31:37.6094152Z * [new branch] gh/malfet/188/base -> origin/gh/malfet/188/base 2025-03-14T05:31:37.6095843Z * [new branch] gh/malfet/188/head -> origin/gh/malfet/188/head 2025-03-14T05:31:37.6097579Z * [new branch] gh/malfet/188/orig -> origin/gh/malfet/188/orig 2025-03-14T05:31:37.6099875Z * [new branch] gh/malfet/189/base -> origin/gh/malfet/189/base 2025-03-14T05:31:37.6101494Z * [new branch] gh/malfet/189/head -> origin/gh/malfet/189/head 2025-03-14T05:31:37.6103818Z * [new branch] gh/malfet/190/base -> origin/gh/malfet/190/base 2025-03-14T05:31:37.6105453Z * [new branch] gh/malfet/190/head -> origin/gh/malfet/190/head 2025-03-14T05:31:37.6107138Z * [new branch] gh/malfet/190/orig -> origin/gh/malfet/190/orig 2025-03-14T05:31:37.6109892Z * [new branch] gh/malfet/191/base -> origin/gh/malfet/191/base 2025-03-14T05:31:37.6111555Z * [new branch] gh/malfet/191/head -> origin/gh/malfet/191/head 2025-03-14T05:31:37.6113700Z * [new branch] gh/malfet/191/orig -> origin/gh/malfet/191/orig 2025-03-14T05:31:37.6116130Z * [new branch] gh/malfet/192/base -> origin/gh/malfet/192/base 2025-03-14T05:31:37.6117732Z * [new branch] gh/malfet/192/head -> origin/gh/malfet/192/head 2025-03-14T05:31:37.6119374Z * [new branch] gh/malfet/192/orig -> origin/gh/malfet/192/orig 2025-03-14T05:31:37.6121586Z * [new branch] gh/malfet/193/base -> origin/gh/malfet/193/base 2025-03-14T05:31:37.6123232Z * [new branch] 
gh/malfet/193/head -> origin/gh/malfet/193/head 2025-03-14T05:31:37.6124986Z * [new branch] gh/malfet/193/orig -> origin/gh/malfet/193/orig 2025-03-14T05:31:37.6127476Z * [new branch] gh/malfet/194/base -> origin/gh/malfet/194/base 2025-03-14T05:31:37.6129129Z * [new branch] gh/malfet/194/head -> origin/gh/malfet/194/head 2025-03-14T05:31:37.6130834Z * [new branch] gh/malfet/194/orig -> origin/gh/malfet/194/orig 2025-03-14T05:31:37.6133231Z * [new branch] gh/malfet/195/base -> origin/gh/malfet/195/base 2025-03-14T05:31:37.6134889Z * [new branch] gh/malfet/195/head -> origin/gh/malfet/195/head 2025-03-14T05:31:37.6136674Z * [new branch] gh/malfet/195/orig -> origin/gh/malfet/195/orig 2025-03-14T05:31:37.6139055Z * [new branch] gh/malfet/196/base -> origin/gh/malfet/196/base 2025-03-14T05:31:37.6140715Z * [new branch] gh/malfet/196/head -> origin/gh/malfet/196/head 2025-03-14T05:31:37.6142380Z * [new branch] gh/malfet/196/orig -> origin/gh/malfet/196/orig 2025-03-14T05:31:37.6144707Z * [new branch] gh/malfet/197/base -> origin/gh/malfet/197/base 2025-03-14T05:31:37.6146292Z * [new branch] gh/malfet/197/head -> origin/gh/malfet/197/head 2025-03-14T05:31:37.6147939Z * [new branch] gh/malfet/197/orig -> origin/gh/malfet/197/orig 2025-03-14T05:31:37.6150254Z * [new branch] gh/malfet/198/base -> origin/gh/malfet/198/base 2025-03-14T05:31:37.6151973Z * [new branch] gh/malfet/198/head -> origin/gh/malfet/198/head 2025-03-14T05:31:37.6153766Z * [new branch] gh/malfet/198/orig -> origin/gh/malfet/198/orig 2025-03-14T05:31:37.6156045Z * [new branch] gh/malfet/199/base -> origin/gh/malfet/199/base 2025-03-14T05:31:37.6157725Z * [new branch] gh/malfet/199/head -> origin/gh/malfet/199/head 2025-03-14T05:31:37.6159489Z * [new branch] gh/malfet/199/orig -> origin/gh/malfet/199/orig 2025-03-14T05:31:37.6161941Z * [new branch] gh/malfet/200/base -> origin/gh/malfet/200/base 2025-03-14T05:31:37.6163933Z * [new branch] gh/malfet/200/head -> origin/gh/malfet/200/head 2025-03-14T05:31:37.6165818Z * [new branch] gh/malfet/200/orig -> origin/gh/malfet/200/orig 2025-03-14T05:31:37.6168457Z * [new branch] gh/malfet/201/base -> origin/gh/malfet/201/base 2025-03-14T05:31:37.6170176Z * [new branch] gh/malfet/201/head -> origin/gh/malfet/201/head 2025-03-14T05:31:37.6172002Z * [new branch] gh/malfet/201/orig -> origin/gh/malfet/201/orig 2025-03-14T05:31:37.6174846Z * [new branch] gh/malfet/202/base -> origin/gh/malfet/202/base 2025-03-14T05:31:37.6176602Z * [new branch] gh/malfet/202/head -> origin/gh/malfet/202/head 2025-03-14T05:31:37.6178188Z * [new branch] gh/malfet/202/orig -> origin/gh/malfet/202/orig 2025-03-14T05:31:37.6180356Z * [new branch] gh/malfet/203/base -> origin/gh/malfet/203/base 2025-03-14T05:31:37.6182042Z * [new branch] gh/malfet/203/head -> origin/gh/malfet/203/head 2025-03-14T05:31:37.6183686Z * [new branch] gh/malfet/203/orig -> origin/gh/malfet/203/orig 2025-03-14T05:31:37.6186128Z * [new branch] gh/malfet/204/base -> origin/gh/malfet/204/base 2025-03-14T05:31:37.6187756Z * [new branch] gh/malfet/204/head -> origin/gh/malfet/204/head 2025-03-14T05:31:37.6189502Z * [new branch] gh/malfet/204/orig -> origin/gh/malfet/204/orig 2025-03-14T05:31:37.6191805Z * [new branch] gh/malfet/205/base -> origin/gh/malfet/205/base 2025-03-14T05:31:37.6193437Z * [new branch] gh/malfet/205/head -> origin/gh/malfet/205/head 2025-03-14T05:31:37.6195704Z * [new branch] gh/malfet/205/orig -> origin/gh/malfet/205/orig 2025-03-14T05:31:37.6198286Z * [new branch] gh/malfet/206/base -> origin/gh/malfet/206/base 
2025-03-14T05:31:37.6199892Z * [new branch] gh/malfet/206/head -> origin/gh/malfet/206/head 2025-03-14T05:31:37.6201636Z * [new branch] gh/malfet/206/orig -> origin/gh/malfet/206/orig 2025-03-14T05:31:37.6203849Z * [new branch] gh/malfet/207/base -> origin/gh/malfet/207/base 2025-03-14T05:31:37.6205485Z * [new branch] gh/malfet/207/head -> origin/gh/malfet/207/head 2025-03-14T05:31:37.6207151Z * [new branch] gh/malfet/207/orig -> origin/gh/malfet/207/orig 2025-03-14T05:31:37.6209561Z * [new branch] gh/malfet/208/base -> origin/gh/malfet/208/base 2025-03-14T05:31:37.6211266Z * [new branch] gh/malfet/208/head -> origin/gh/malfet/208/head 2025-03-14T05:31:37.6212986Z * [new branch] gh/malfet/208/orig -> origin/gh/malfet/208/orig 2025-03-14T05:31:37.6215299Z * [new branch] gh/malfet/209/base -> origin/gh/malfet/209/base 2025-03-14T05:31:37.6216906Z * [new branch] gh/malfet/209/head -> origin/gh/malfet/209/head 2025-03-14T05:31:37.6218628Z * [new branch] gh/malfet/209/orig -> origin/gh/malfet/209/orig 2025-03-14T05:31:37.6220906Z * [new branch] gh/malfet/210/base -> origin/gh/malfet/210/base 2025-03-14T05:31:37.6222489Z * [new branch] gh/malfet/210/head -> origin/gh/malfet/210/head 2025-03-14T05:31:37.6224402Z * [new branch] gh/malfet/210/orig -> origin/gh/malfet/210/orig 2025-03-14T05:31:37.6226515Z * [new branch] gh/malfet/211/base -> origin/gh/malfet/211/base 2025-03-14T05:31:37.6228125Z * [new branch] gh/malfet/211/head -> origin/gh/malfet/211/head 2025-03-14T05:31:37.6229847Z * [new branch] gh/malfet/211/orig -> origin/gh/malfet/211/orig 2025-03-14T05:31:37.6232250Z * [new branch] gh/malfet/212/base -> origin/gh/malfet/212/base 2025-03-14T05:31:37.6233905Z * [new branch] gh/malfet/212/head -> origin/gh/malfet/212/head 2025-03-14T05:31:37.6235632Z * [new branch] gh/malfet/212/orig -> origin/gh/malfet/212/orig 2025-03-14T05:31:37.6237966Z * [new branch] gh/malfet/213/base -> origin/gh/malfet/213/base 2025-03-14T05:31:37.6239623Z * [new branch] gh/malfet/213/head -> origin/gh/malfet/213/head 2025-03-14T05:31:37.6241279Z * [new branch] gh/malfet/213/orig -> origin/gh/malfet/213/orig 2025-03-14T05:31:37.6243640Z * [new branch] gh/malfet/214/base -> origin/gh/malfet/214/base 2025-03-14T05:31:37.6245386Z * [new branch] gh/malfet/214/head -> origin/gh/malfet/214/head 2025-03-14T05:31:37.6247051Z * [new branch] gh/malfet/214/orig -> origin/gh/malfet/214/orig 2025-03-14T05:31:37.6249300Z * [new branch] gh/malfet/215/base -> origin/gh/malfet/215/base 2025-03-14T05:31:37.6250976Z * [new branch] gh/malfet/215/head -> origin/gh/malfet/215/head 2025-03-14T05:31:37.6252730Z * [new branch] gh/malfet/215/orig -> origin/gh/malfet/215/orig 2025-03-14T05:31:37.6255122Z * [new branch] gh/malfet/216/base -> origin/gh/malfet/216/base 2025-03-14T05:31:37.6256793Z * [new branch] gh/malfet/216/head -> origin/gh/malfet/216/head 2025-03-14T05:31:37.6258551Z * [new branch] gh/malfet/216/orig -> origin/gh/malfet/216/orig 2025-03-14T05:31:37.6260838Z * [new branch] gh/malfet/217/base -> origin/gh/malfet/217/base 2025-03-14T05:31:37.6262420Z * [new branch] gh/malfet/217/head -> origin/gh/malfet/217/head 2025-03-14T05:31:37.6264132Z * [new branch] gh/malfet/217/orig -> origin/gh/malfet/217/orig 2025-03-14T05:31:37.6266444Z * [new branch] gh/malfet/218/base -> origin/gh/malfet/218/base 2025-03-14T05:31:37.6268339Z * [new branch] gh/malfet/218/head -> origin/gh/malfet/218/head 2025-03-14T05:31:37.6270212Z * [new branch] gh/malfet/218/orig -> origin/gh/malfet/218/orig 2025-03-14T05:31:37.6272545Z * [new branch] 
gh/malfet/219/base -> origin/gh/malfet/219/base 2025-03-14T05:31:37.6274286Z * [new branch] gh/malfet/219/head -> origin/gh/malfet/219/head 2025-03-14T05:31:37.6276098Z * [new branch] gh/malfet/219/orig -> origin/gh/malfet/219/orig 2025-03-14T05:31:37.6278473Z * [new branch] gh/malfet/220/base -> origin/gh/malfet/220/base 2025-03-14T05:31:37.6280104Z * [new branch] gh/malfet/220/head -> origin/gh/malfet/220/head 2025-03-14T05:31:37.6281846Z * [new branch] gh/malfet/220/orig -> origin/gh/malfet/220/orig 2025-03-14T05:31:37.6284062Z * [new branch] gh/malfet/221/base -> origin/gh/malfet/221/base 2025-03-14T05:31:37.6285732Z * [new branch] gh/malfet/221/head -> origin/gh/malfet/221/head 2025-03-14T05:31:37.6287360Z * [new branch] gh/malfet/221/orig -> origin/gh/malfet/221/orig 2025-03-14T05:31:37.6289700Z * [new branch] gh/malfet/222/base -> origin/gh/malfet/222/base 2025-03-14T05:31:37.6291472Z * [new branch] gh/malfet/222/head -> origin/gh/malfet/222/head 2025-03-14T05:31:37.6293045Z * [new branch] gh/malfet/222/orig -> origin/gh/malfet/222/orig 2025-03-14T05:31:37.6295418Z * [new branch] gh/malfet/223/base -> origin/gh/malfet/223/base 2025-03-14T05:31:37.6297097Z * [new branch] gh/malfet/223/head -> origin/gh/malfet/223/head 2025-03-14T05:31:37.6298715Z * [new branch] gh/malfet/223/orig -> origin/gh/malfet/223/orig 2025-03-14T05:31:37.6301094Z * [new branch] gh/malfet/224/base -> origin/gh/malfet/224/base 2025-03-14T05:31:37.6302709Z * [new branch] gh/malfet/224/head -> origin/gh/malfet/224/head 2025-03-14T05:31:37.6304684Z * [new branch] gh/malfet/224/orig -> origin/gh/malfet/224/orig 2025-03-14T05:31:37.6306734Z * [new branch] gh/malfet/225/base -> origin/gh/malfet/225/base 2025-03-14T05:31:37.6308373Z * [new branch] gh/malfet/225/head -> origin/gh/malfet/225/head 2025-03-14T05:31:37.6310029Z * [new branch] gh/malfet/225/orig -> origin/gh/malfet/225/orig 2025-03-14T05:31:37.6312232Z * [new branch] gh/malfet/226/base -> origin/gh/malfet/226/base 2025-03-14T05:31:37.6313885Z * [new branch] gh/malfet/226/head -> origin/gh/malfet/226/head 2025-03-14T05:31:37.6315654Z * [new branch] gh/malfet/226/orig -> origin/gh/malfet/226/orig 2025-03-14T05:31:37.6318159Z * [new branch] gh/malfet/227/base -> origin/gh/malfet/227/base 2025-03-14T05:31:37.6319736Z * [new branch] gh/malfet/227/head -> origin/gh/malfet/227/head 2025-03-14T05:31:37.6321434Z * [new branch] gh/malfet/227/orig -> origin/gh/malfet/227/orig 2025-03-14T05:31:37.6323803Z * [new branch] gh/malfet/64/base -> origin/gh/malfet/64/base 2025-03-14T05:31:37.6325501Z * [new branch] gh/malfet/64/head -> origin/gh/malfet/64/head 2025-03-14T05:31:37.6327771Z * [new branch] gh/malfet/96/base -> origin/gh/malfet/96/base 2025-03-14T05:31:37.6329442Z * [new branch] gh/malfet/96/head -> origin/gh/malfet/96/head 2025-03-14T05:31:37.6331059Z * [new branch] gh/malfet/96/orig -> origin/gh/malfet/96/orig 2025-03-14T05:31:37.6333955Z * [new branch] gh/markkm/1/base -> origin/gh/markkm/1/base 2025-03-14T05:31:37.6337094Z * [new branch] gh/masnesral/155/base -> origin/gh/masnesral/155/base 2025-03-14T05:31:37.6338770Z * [new branch] gh/masnesral/155/head -> origin/gh/masnesral/155/head 2025-03-14T05:31:37.6340450Z * [new branch] gh/masnesral/155/orig -> origin/gh/masnesral/155/orig 2025-03-14T05:31:37.6342846Z * [new branch] gh/masnesral/161/base -> origin/gh/masnesral/161/base 2025-03-14T05:31:37.6344539Z * [new branch] gh/masnesral/161/head -> origin/gh/masnesral/161/head 2025-03-14T05:31:37.6346156Z * [new branch] gh/masnesral/161/orig -> 
origin/gh/masnesral/161/orig 2025-03-14T05:31:37.6348301Z * [new branch] gh/masnesral/162/base -> origin/gh/masnesral/162/base 2025-03-14T05:31:37.6349928Z * [new branch] gh/masnesral/162/head -> origin/gh/masnesral/162/head 2025-03-14T05:31:37.6351572Z * [new branch] gh/masnesral/162/orig -> origin/gh/masnesral/162/orig 2025-03-14T05:31:37.6353807Z * [new branch] gh/masnesral/173/base -> origin/gh/masnesral/173/base 2025-03-14T05:31:37.6355654Z * [new branch] gh/masnesral/173/head -> origin/gh/masnesral/173/head 2025-03-14T05:31:37.6357402Z * [new branch] gh/masnesral/173/orig -> origin/gh/masnesral/173/orig 2025-03-14T05:31:37.6359829Z * [new branch] gh/masnesral/176/base -> origin/gh/masnesral/176/base 2025-03-14T05:31:37.6361808Z * [new branch] gh/masnesral/176/head -> origin/gh/masnesral/176/head 2025-03-14T05:31:37.6363379Z * [new branch] gh/masnesral/176/orig -> origin/gh/masnesral/176/orig 2025-03-14T05:31:37.6365959Z * [new branch] gh/masnesral/177/base -> origin/gh/masnesral/177/base 2025-03-14T05:31:37.6367593Z * [new branch] gh/masnesral/177/head -> origin/gh/masnesral/177/head 2025-03-14T05:31:37.6369692Z * [new branch] gh/masnesral/177/orig -> origin/gh/masnesral/177/orig 2025-03-14T05:31:37.6371857Z * [new branch] gh/masnesral/178/base -> origin/gh/masnesral/178/base 2025-03-14T05:31:37.6373561Z * [new branch] gh/masnesral/178/head -> origin/gh/masnesral/178/head 2025-03-14T05:31:37.6375290Z * [new branch] gh/masnesral/178/orig -> origin/gh/masnesral/178/orig 2025-03-14T05:31:37.6377675Z * [new branch] gh/masnesral/179/base -> origin/gh/masnesral/179/base 2025-03-14T05:31:37.6379439Z * [new branch] gh/masnesral/179/head -> origin/gh/masnesral/179/head 2025-03-14T05:31:37.6381117Z * [new branch] gh/masnesral/179/orig -> origin/gh/masnesral/179/orig 2025-03-14T05:31:37.6383358Z * [new branch] gh/masnesral/180/base -> origin/gh/masnesral/180/base 2025-03-14T05:31:37.6385198Z * [new branch] gh/masnesral/180/head -> origin/gh/masnesral/180/head 2025-03-14T05:31:37.6386746Z * [new branch] gh/masnesral/180/orig -> origin/gh/masnesral/180/orig 2025-03-14T05:31:37.6389236Z * [new branch] gh/masnesral/34/base -> origin/gh/masnesral/34/base 2025-03-14T05:31:37.6392571Z * [new branch] gh/mhorowitz/0/base -> origin/gh/mhorowitz/0/base 2025-03-14T05:31:37.6394304Z * [new branch] gh/mhorowitz/0/head -> origin/gh/mhorowitz/0/head 2025-03-14T05:31:37.6396505Z * [new branch] gh/mhorowitz/1/base -> origin/gh/mhorowitz/1/base 2025-03-14T05:31:37.6398175Z * [new branch] gh/mhorowitz/1/head -> origin/gh/mhorowitz/1/head 2025-03-14T05:31:37.6400195Z * [new branch] gh/mhorowitz/2/base -> origin/gh/mhorowitz/2/base 2025-03-14T05:31:37.6401883Z * [new branch] gh/mhorowitz/2/head -> origin/gh/mhorowitz/2/head 2025-03-14T05:31:37.6404024Z * [new branch] gh/mhorowitz/3/base -> origin/gh/mhorowitz/3/base 2025-03-14T05:31:37.6405622Z * [new branch] gh/mhorowitz/3/head -> origin/gh/mhorowitz/3/head 2025-03-14T05:31:37.6407713Z * [new branch] gh/mhorowitz/4/base -> origin/gh/mhorowitz/4/base 2025-03-14T05:31:37.6409441Z * [new branch] gh/mhorowitz/4/head -> origin/gh/mhorowitz/4/head 2025-03-14T05:31:37.6411562Z * [new branch] gh/mhorowitz/5/base -> origin/gh/mhorowitz/5/base 2025-03-14T05:31:37.6413129Z * [new branch] gh/mhorowitz/5/head -> origin/gh/mhorowitz/5/head 2025-03-14T05:31:37.6415263Z * [new branch] gh/mhorowitz/6/base -> origin/gh/mhorowitz/6/base 2025-03-14T05:31:37.6416841Z * [new branch] gh/mhorowitz/6/head -> origin/gh/mhorowitz/6/head 2025-03-14T05:31:37.6419703Z * [new branch] 
gh/mikaylagawarecki/234/base -> origin/gh/mikaylagawarecki/234/base 2025-03-14T05:31:37.6421320Z * [new branch] gh/mikaylagawarecki/234/head -> origin/gh/mikaylagawarecki/234/head 2025-03-14T05:31:37.6423569Z * [new branch] gh/mikaylagawarecki/235/base -> origin/gh/mikaylagawarecki/235/base 2025-03-14T05:31:37.6425240Z * [new branch] gh/mikaylagawarecki/235/head -> origin/gh/mikaylagawarecki/235/head 2025-03-14T05:31:37.6427457Z * [new branch] gh/mikaylagawarecki/236/base -> origin/gh/mikaylagawarecki/236/base 2025-03-14T05:31:37.6429356Z * [new branch] gh/mikaylagawarecki/236/head -> origin/gh/mikaylagawarecki/236/head 2025-03-14T05:31:37.6431321Z * [new branch] gh/mikaylagawarecki/237/base -> origin/gh/mikaylagawarecki/237/base 2025-03-14T05:31:37.6432941Z * [new branch] gh/mikaylagawarecki/237/head -> origin/gh/mikaylagawarecki/237/head 2025-03-14T05:31:37.6435312Z * [new branch] gh/mikaylagawarecki/238/base -> origin/gh/mikaylagawarecki/238/base 2025-03-14T05:31:37.6437000Z * [new branch] gh/mikaylagawarecki/238/head -> origin/gh/mikaylagawarecki/238/head 2025-03-14T05:31:37.6439558Z * [new branch] gh/mikaylagawarecki/281/base -> origin/gh/mikaylagawarecki/281/base 2025-03-14T05:31:37.6441205Z * [new branch] gh/mikaylagawarecki/281/head -> origin/gh/mikaylagawarecki/281/head 2025-03-14T05:31:37.6442860Z * [new branch] gh/mikaylagawarecki/281/orig -> origin/gh/mikaylagawarecki/281/orig 2025-03-14T05:31:37.6445199Z * [new branch] gh/mikaylagawarecki/299/base -> origin/gh/mikaylagawarecki/299/base 2025-03-14T05:31:37.6446749Z * [new branch] gh/mikaylagawarecki/299/head -> origin/gh/mikaylagawarecki/299/head 2025-03-14T05:31:37.6448455Z * [new branch] gh/mikaylagawarecki/299/orig -> origin/gh/mikaylagawarecki/299/orig 2025-03-14T05:31:37.6451338Z * [new branch] gh/mikaylagawarecki/304/base -> origin/gh/mikaylagawarecki/304/base 2025-03-14T05:31:37.6452964Z * [new branch] gh/mikaylagawarecki/304/head -> origin/gh/mikaylagawarecki/304/head 2025-03-14T05:31:37.6454605Z * [new branch] gh/mikaylagawarecki/304/orig -> origin/gh/mikaylagawarecki/304/orig 2025-03-14T05:31:37.6456944Z * [new branch] gh/mikaylagawarecki/307/base -> origin/gh/mikaylagawarecki/307/base 2025-03-14T05:31:37.6458661Z * [new branch] gh/mikaylagawarecki/307/head -> origin/gh/mikaylagawarecki/307/head 2025-03-14T05:31:37.6460237Z * [new branch] gh/mikaylagawarecki/307/orig -> origin/gh/mikaylagawarecki/307/orig 2025-03-14T05:31:37.6462532Z * [new branch] gh/mikaylagawarecki/310/base -> origin/gh/mikaylagawarecki/310/base 2025-03-14T05:31:37.6464199Z * [new branch] gh/mikaylagawarecki/310/head -> origin/gh/mikaylagawarecki/310/head 2025-03-14T05:31:37.6465922Z * [new branch] gh/mikaylagawarecki/310/orig -> origin/gh/mikaylagawarecki/310/orig 2025-03-14T05:31:37.6468337Z * [new branch] gh/mikaylagawarecki/313/base -> origin/gh/mikaylagawarecki/313/base 2025-03-14T05:31:37.6471862Z * [new branch] gh/mikaylagawarecki/313/head -> origin/gh/mikaylagawarecki/313/head 2025-03-14T05:31:37.6473485Z * [new branch] gh/mikaylagawarecki/313/orig -> origin/gh/mikaylagawarecki/313/orig 2025-03-14T05:31:37.6476079Z * [new branch] gh/mikaylagawarecki/314/base -> origin/gh/mikaylagawarecki/314/base 2025-03-14T05:31:37.6477660Z * [new branch] gh/mikaylagawarecki/314/head -> origin/gh/mikaylagawarecki/314/head 2025-03-14T05:31:37.6479334Z * [new branch] gh/mikaylagawarecki/314/orig -> origin/gh/mikaylagawarecki/314/orig 2025-03-14T05:31:37.6481657Z * [new branch] gh/mikaylagawarecki/315/base -> origin/gh/mikaylagawarecki/315/base 
2025-03-14T05:31:37.6483627Z * [new branch] gh/mikaylagawarecki/315/head -> origin/gh/mikaylagawarecki/315/head 2025-03-14T05:31:37.6484719Z * [new branch] gh/mikaylagawarecki/315/orig -> origin/gh/mikaylagawarecki/315/orig 2025-03-14T05:31:37.6487340Z * [new branch] gh/mikaylagawarecki/316/base -> origin/gh/mikaylagawarecki/316/base 2025-03-14T05:31:37.6488979Z * [new branch] gh/mikaylagawarecki/316/head -> origin/gh/mikaylagawarecki/316/head 2025-03-14T05:31:37.6490593Z * [new branch] gh/mikaylagawarecki/316/orig -> origin/gh/mikaylagawarecki/316/orig 2025-03-14T05:31:37.6493392Z * [new branch] gh/mikaylagawarecki/317/base -> origin/gh/mikaylagawarecki/317/base 2025-03-14T05:31:37.6495322Z * [new branch] gh/mikaylagawarecki/317/head -> origin/gh/mikaylagawarecki/317/head 2025-03-14T05:31:37.6497052Z * [new branch] gh/mikaylagawarecki/317/orig -> origin/gh/mikaylagawarecki/317/orig 2025-03-14T05:31:37.6499387Z * [new branch] gh/mikaylagawarecki/318/base -> origin/gh/mikaylagawarecki/318/base 2025-03-14T05:31:37.6501012Z * [new branch] gh/mikaylagawarecki/318/head -> origin/gh/mikaylagawarecki/318/head 2025-03-14T05:31:37.6502727Z * [new branch] gh/mikaylagawarecki/318/orig -> origin/gh/mikaylagawarecki/318/orig 2025-03-14T05:31:37.6505699Z * [new branch] gh/mikaylagawarecki/319/base -> origin/gh/mikaylagawarecki/319/base 2025-03-14T05:31:37.6507350Z * [new branch] gh/mikaylagawarecki/319/head -> origin/gh/mikaylagawarecki/319/head 2025-03-14T05:31:37.6508966Z * [new branch] gh/mikaylagawarecki/319/orig -> origin/gh/mikaylagawarecki/319/orig 2025-03-14T05:31:37.6511358Z * [new branch] gh/mikaylagawarecki/320/base -> origin/gh/mikaylagawarecki/320/base 2025-03-14T05:31:37.6512998Z * [new branch] gh/mikaylagawarecki/320/head -> origin/gh/mikaylagawarecki/320/head 2025-03-14T05:31:37.6514733Z * [new branch] gh/mikaylagawarecki/320/orig -> origin/gh/mikaylagawarecki/320/orig 2025-03-14T05:31:37.6517091Z * [new branch] gh/mikaylagawarecki/321/base -> origin/gh/mikaylagawarecki/321/base 2025-03-14T05:31:37.6518734Z * [new branch] gh/mikaylagawarecki/321/head -> origin/gh/mikaylagawarecki/321/head 2025-03-14T05:31:37.6520423Z * [new branch] gh/mikaylagawarecki/321/orig -> origin/gh/mikaylagawarecki/321/orig 2025-03-14T05:31:37.6522785Z * [new branch] gh/mikaylagawarecki/322/base -> origin/gh/mikaylagawarecki/322/base 2025-03-14T05:31:37.6524502Z * [new branch] gh/mikaylagawarecki/322/head -> origin/gh/mikaylagawarecki/322/head 2025-03-14T05:31:37.6526169Z * [new branch] gh/mikaylagawarecki/322/orig -> origin/gh/mikaylagawarecki/322/orig 2025-03-14T05:31:37.6528621Z * [new branch] gh/mikaylagawarecki/323/base -> origin/gh/mikaylagawarecki/323/base 2025-03-14T05:31:37.6530326Z * [new branch] gh/mikaylagawarecki/323/head -> origin/gh/mikaylagawarecki/323/head 2025-03-14T05:31:37.6532059Z * [new branch] gh/mikaylagawarecki/323/orig -> origin/gh/mikaylagawarecki/323/orig 2025-03-14T05:31:37.6534333Z * [new branch] gh/mikaylagawarecki/324/base -> origin/gh/mikaylagawarecki/324/base 2025-03-14T05:31:37.6536013Z * [new branch] gh/mikaylagawarecki/324/head -> origin/gh/mikaylagawarecki/324/head 2025-03-14T05:31:37.6537340Z * [new branch] gh/mikaylagawarecki/324/orig -> origin/gh/mikaylagawarecki/324/orig 2025-03-14T05:31:37.6540256Z * [new branch] gh/mlazos/1/base -> origin/gh/mlazos/1/base 2025-03-14T05:31:37.6542021Z * [new branch] gh/mlazos/1/head -> origin/gh/mlazos/1/head 2025-03-14T05:31:37.6544207Z * [new branch] gh/mlazos/2/base -> origin/gh/mlazos/2/base 2025-03-14T05:31:37.6545800Z * [new 
branch] gh/mlazos/2/head -> origin/gh/mlazos/2/head 2025-03-14T05:31:37.6548006Z * [new branch] gh/mlazos/3/base -> origin/gh/mlazos/3/base 2025-03-14T05:31:37.6549673Z * [new branch] gh/mlazos/3/head -> origin/gh/mlazos/3/head 2025-03-14T05:31:37.6551321Z * [new branch] gh/mlazos/3/orig -> origin/gh/mlazos/3/orig 2025-03-14T05:31:37.6553683Z * [new branch] gh/mlazos/4/base -> origin/gh/mlazos/4/base 2025-03-14T05:31:37.6555781Z * [new branch] gh/mlazos/4/head -> origin/gh/mlazos/4/head 2025-03-14T05:31:37.6556855Z * [new branch] gh/mlazos/4/orig -> origin/gh/mlazos/4/orig 2025-03-14T05:31:37.6559581Z * [new branch] gh/mlazos/5/base -> origin/gh/mlazos/5/base 2025-03-14T05:31:37.6561340Z * [new branch] gh/mlazos/5/head -> origin/gh/mlazos/5/head 2025-03-14T05:31:37.6562766Z * [new branch] gh/mlazos/5/orig -> origin/gh/mlazos/5/orig 2025-03-14T05:31:37.6564935Z * [new branch] gh/mlazos/6/base -> origin/gh/mlazos/6/base 2025-03-14T05:31:37.6566706Z * [new branch] gh/mlazos/6/head -> origin/gh/mlazos/6/head 2025-03-14T05:31:37.6568557Z * [new branch] gh/mlazos/6/orig -> origin/gh/mlazos/6/orig 2025-03-14T05:31:37.6571004Z * [new branch] gh/mlazos/7/base -> origin/gh/mlazos/7/base 2025-03-14T05:31:37.6572724Z * [new branch] gh/mlazos/7/head -> origin/gh/mlazos/7/head 2025-03-14T05:31:37.6574358Z * [new branch] gh/mlazos/7/orig -> origin/gh/mlazos/7/orig 2025-03-14T05:31:37.6577167Z * [new branch] gh/mrmiywj/1/base -> origin/gh/mrmiywj/1/base 2025-03-14T05:31:37.6578775Z * [new branch] gh/mrmiywj/1/head -> origin/gh/mrmiywj/1/head 2025-03-14T05:31:37.6581566Z * [new branch] gh/muchulee8/1/base -> origin/gh/muchulee8/1/base 2025-03-14T05:31:37.6583295Z * [new branch] gh/muchulee8/1/orig -> origin/gh/muchulee8/1/orig 2025-03-14T05:31:37.6585520Z * [new branch] gh/muchulee8/2/base -> origin/gh/muchulee8/2/base 2025-03-14T05:31:37.6587293Z * [new branch] gh/muchulee8/2/orig -> origin/gh/muchulee8/2/orig 2025-03-14T05:31:37.6589651Z * [new branch] gh/muchulee8/40/base -> origin/gh/muchulee8/40/base 2025-03-14T05:31:37.6591327Z * [new branch] gh/muchulee8/40/head -> origin/gh/muchulee8/40/head 2025-03-14T05:31:37.6592950Z * [new branch] gh/muchulee8/40/orig -> origin/gh/muchulee8/40/orig 2025-03-14T05:31:37.6595354Z * [new branch] gh/muchulee8/41/base -> origin/gh/muchulee8/41/base 2025-03-14T05:31:37.6597027Z * [new branch] gh/muchulee8/41/head -> origin/gh/muchulee8/41/head 2025-03-14T05:31:37.6600102Z * [new branch] gh/muchulee8/41/orig -> origin/gh/muchulee8/41/orig 2025-03-14T05:31:37.6601588Z * [new branch] gh/muchulee8/42/base -> origin/gh/muchulee8/42/base 2025-03-14T05:31:37.6602726Z * [new branch] gh/muchulee8/42/head -> origin/gh/muchulee8/42/head 2025-03-14T05:31:37.6605161Z * [new branch] gh/muchulee8/42/orig -> origin/gh/muchulee8/42/orig 2025-03-14T05:31:37.6607514Z * [new branch] gh/muchulee8/43/base -> origin/gh/muchulee8/43/base 2025-03-14T05:31:37.6609216Z * [new branch] gh/muchulee8/43/head -> origin/gh/muchulee8/43/head 2025-03-14T05:31:37.6610998Z * [new branch] gh/muchulee8/43/orig -> origin/gh/muchulee8/43/orig 2025-03-14T05:31:37.6613344Z * [new branch] gh/muchulee8/44/base -> origin/gh/muchulee8/44/base 2025-03-14T05:31:37.6615035Z * [new branch] gh/muchulee8/44/head -> origin/gh/muchulee8/44/head 2025-03-14T05:31:37.6616723Z * [new branch] gh/muchulee8/44/orig -> origin/gh/muchulee8/44/orig 2025-03-14T05:31:37.6618870Z * [new branch] gh/muchulee8/45/base -> origin/gh/muchulee8/45/base 2025-03-14T05:31:37.6620495Z * [new branch] gh/muchulee8/45/head -> 
origin/gh/muchulee8/45/head 2025-03-14T05:31:37.6622175Z * [new branch] gh/muchulee8/45/orig -> origin/gh/muchulee8/45/orig 2025-03-14T05:31:37.6624629Z * [new branch] gh/muchulee8/5/base -> origin/gh/muchulee8/5/base 2025-03-14T05:31:37.6626288Z * [new branch] gh/muchulee8/5/orig -> origin/gh/muchulee8/5/orig 2025-03-14T05:31:37.6628979Z * [new branch] gh/mzzchy/2/base -> origin/gh/mzzchy/2/base 2025-03-14T05:31:37.6630867Z * [new branch] gh/mzzchy/2/head -> origin/gh/mzzchy/2/head 2025-03-14T05:31:37.6632471Z * [new branch] gh/mzzchy/2/orig -> origin/gh/mzzchy/2/orig 2025-03-14T05:31:37.6634894Z * [new branch] gh/mzzchy/3/base -> origin/gh/mzzchy/3/base 2025-03-14T05:31:37.6636515Z * [new branch] gh/mzzchy/3/head -> origin/gh/mzzchy/3/head 2025-03-14T05:31:37.6638653Z * [new branch] gh/mzzchy/3/orig -> origin/gh/mzzchy/3/orig 2025-03-14T05:31:37.6640750Z * [new branch] gh/mzzchy/4/base -> origin/gh/mzzchy/4/base 2025-03-14T05:31:37.6642505Z * [new branch] gh/mzzchy/4/head -> origin/gh/mzzchy/4/head 2025-03-14T05:31:37.6644773Z * [new branch] gh/mzzchy/5/base -> origin/gh/mzzchy/5/base 2025-03-14T05:31:37.6646534Z * [new branch] gh/mzzchy/5/head -> origin/gh/mzzchy/5/head 2025-03-14T05:31:37.6648276Z * [new branch] gh/mzzchy/5/orig -> origin/gh/mzzchy/5/orig 2025-03-14T05:31:37.6651134Z * [new branch] gh/nmacchioni/12/base -> origin/gh/nmacchioni/12/base 2025-03-14T05:31:37.6652800Z * [new branch] gh/nmacchioni/12/head -> origin/gh/nmacchioni/12/head 2025-03-14T05:31:37.6654660Z * [new branch] gh/nmacchioni/12/orig -> origin/gh/nmacchioni/12/orig 2025-03-14T05:31:37.6656787Z * [new branch] gh/nmacchioni/31/base -> origin/gh/nmacchioni/31/base 2025-03-14T05:31:37.6658444Z * [new branch] gh/nmacchioni/31/head -> origin/gh/nmacchioni/31/head 2025-03-14T05:31:37.6660056Z * [new branch] gh/nmacchioni/31/orig -> origin/gh/nmacchioni/31/orig 2025-03-14T05:31:37.6662296Z * [new branch] gh/nmacchioni/32/base -> origin/gh/nmacchioni/32/base 2025-03-14T05:31:37.6663994Z * [new branch] gh/nmacchioni/32/head -> origin/gh/nmacchioni/32/head 2025-03-14T05:31:37.6665655Z * [new branch] gh/nmacchioni/32/orig -> origin/gh/nmacchioni/32/orig 2025-03-14T05:31:37.6668118Z * [new branch] gh/nmacchioni/33/base -> origin/gh/nmacchioni/33/base 2025-03-14T05:31:37.6669883Z * [new branch] gh/nmacchioni/33/head -> origin/gh/nmacchioni/33/head 2025-03-14T05:31:37.6671652Z * [new branch] gh/nmacchioni/33/orig -> origin/gh/nmacchioni/33/orig 2025-03-14T05:31:37.6673951Z * [new branch] gh/nmacchioni/35/base -> origin/gh/nmacchioni/35/base 2025-03-14T05:31:37.6675779Z * [new branch] gh/nmacchioni/35/head -> origin/gh/nmacchioni/35/head 2025-03-14T05:31:37.6677535Z * [new branch] gh/nmacchioni/35/orig -> origin/gh/nmacchioni/35/orig 2025-03-14T05:31:37.6679808Z * [new branch] gh/nmacchioni/36/base -> origin/gh/nmacchioni/36/base 2025-03-14T05:31:37.6681462Z * [new branch] gh/nmacchioni/36/head -> origin/gh/nmacchioni/36/head 2025-03-14T05:31:37.6683052Z * [new branch] gh/nmacchioni/36/orig -> origin/gh/nmacchioni/36/orig 2025-03-14T05:31:37.6685200Z * [new branch] gh/nmacchioni/37/base -> origin/gh/nmacchioni/37/base 2025-03-14T05:31:37.6686841Z * [new branch] gh/nmacchioni/37/head -> origin/gh/nmacchioni/37/head 2025-03-14T05:31:37.6688494Z * [new branch] gh/nmacchioni/37/orig -> origin/gh/nmacchioni/37/orig 2025-03-14T05:31:37.6690752Z * [new branch] gh/nmacchioni/39/base -> origin/gh/nmacchioni/39/base 2025-03-14T05:31:37.6692403Z * [new branch] gh/nmacchioni/39/head -> origin/gh/nmacchioni/39/head 
2025-03-14T05:31:37.6694640Z * [new branch] gh/nmacchioni/39/orig -> origin/gh/nmacchioni/39/orig 2025-03-14T05:31:37.6696862Z * [new branch] gh/nmacchioni/8/base -> origin/gh/nmacchioni/8/base 2025-03-14T05:31:37.6698720Z * [new branch] gh/nmacchioni/8/head -> origin/gh/nmacchioni/8/head 2025-03-14T05:31:37.6700263Z * [new branch] gh/nmacchioni/8/orig -> origin/gh/nmacchioni/8/orig 2025-03-14T05:31:37.6703523Z * [new branch] gh/oulgen/150/base -> origin/gh/oulgen/150/base 2025-03-14T05:31:37.6705261Z * [new branch] gh/oulgen/150/head -> origin/gh/oulgen/150/head 2025-03-14T05:31:37.6706878Z * [new branch] gh/oulgen/150/orig -> origin/gh/oulgen/150/orig 2025-03-14T05:31:37.6709688Z * [new branch] gh/oulgen/151/base -> origin/gh/oulgen/151/base 2025-03-14T05:31:37.6711389Z * [new branch] gh/oulgen/151/head -> origin/gh/oulgen/151/head 2025-03-14T05:31:37.6713043Z * [new branch] gh/oulgen/151/orig -> origin/gh/oulgen/151/orig 2025-03-14T05:31:37.6715399Z * [new branch] gh/oulgen/152/base -> origin/gh/oulgen/152/base 2025-03-14T05:31:37.6717107Z * [new branch] gh/oulgen/152/head -> origin/gh/oulgen/152/head 2025-03-14T05:31:37.6718774Z * [new branch] gh/oulgen/152/orig -> origin/gh/oulgen/152/orig 2025-03-14T05:31:37.6721049Z * [new branch] gh/oulgen/153/base -> origin/gh/oulgen/153/base 2025-03-14T05:31:37.6722786Z * [new branch] gh/oulgen/153/head -> origin/gh/oulgen/153/head 2025-03-14T05:31:37.6724425Z * [new branch] gh/oulgen/153/orig -> origin/gh/oulgen/153/orig 2025-03-14T05:31:37.6726914Z * [new branch] gh/oulgen/154/base -> origin/gh/oulgen/154/base 2025-03-14T05:31:37.6728583Z * [new branch] gh/oulgen/154/head -> origin/gh/oulgen/154/head 2025-03-14T05:31:37.6730686Z * [new branch] gh/oulgen/154/orig -> origin/gh/oulgen/154/orig 2025-03-14T05:31:37.6732936Z * [new branch] gh/oulgen/155/base -> origin/gh/oulgen/155/base 2025-03-14T05:31:37.6734714Z * [new branch] gh/oulgen/155/head -> origin/gh/oulgen/155/head 2025-03-14T05:31:37.6736306Z * [new branch] gh/oulgen/155/orig -> origin/gh/oulgen/155/orig 2025-03-14T05:31:37.6738650Z * [new branch] gh/oulgen/156/base -> origin/gh/oulgen/156/base 2025-03-14T05:31:37.6740115Z * [new branch] gh/oulgen/156/head -> origin/gh/oulgen/156/head 2025-03-14T05:31:37.6741778Z * [new branch] gh/oulgen/156/orig -> origin/gh/oulgen/156/orig 2025-03-14T05:31:37.6743995Z * [new branch] gh/oulgen/157/base -> origin/gh/oulgen/157/base 2025-03-14T05:31:37.6745777Z * [new branch] gh/oulgen/157/head -> origin/gh/oulgen/157/head 2025-03-14T05:31:37.6747458Z * [new branch] gh/oulgen/157/orig -> origin/gh/oulgen/157/orig 2025-03-14T05:31:37.6749727Z * [new branch] gh/oulgen/158/base -> origin/gh/oulgen/158/base 2025-03-14T05:31:37.6751348Z * [new branch] gh/oulgen/158/head -> origin/gh/oulgen/158/head 2025-03-14T05:31:37.6753010Z * [new branch] gh/oulgen/158/orig -> origin/gh/oulgen/158/orig 2025-03-14T05:31:37.6755353Z * [new branch] gh/oulgen/159/base -> origin/gh/oulgen/159/base 2025-03-14T05:31:37.6756931Z * [new branch] gh/oulgen/159/head -> origin/gh/oulgen/159/head 2025-03-14T05:31:37.6759019Z * [new branch] gh/oulgen/159/orig -> origin/gh/oulgen/159/orig 2025-03-14T05:31:37.6761491Z * [new branch] gh/oulgen/160/base -> origin/gh/oulgen/160/base 2025-03-14T05:31:37.6763313Z * [new branch] gh/oulgen/160/head -> origin/gh/oulgen/160/head 2025-03-14T05:31:37.6765029Z * [new branch] gh/oulgen/160/orig -> origin/gh/oulgen/160/orig 2025-03-14T05:31:37.6768020Z * [new branch] gh/oulgen/161/base -> origin/gh/oulgen/161/base 2025-03-14T05:31:37.6770169Z * [new 
branch] gh/oulgen/161/head -> origin/gh/oulgen/161/head 2025-03-14T05:31:37.6771658Z * [new branch] gh/oulgen/161/orig -> origin/gh/oulgen/161/orig 2025-03-14T05:31:37.6773996Z * [new branch] gh/oulgen/2/base -> origin/gh/oulgen/2/base 2025-03-14T05:31:37.6775641Z * [new branch] gh/oulgen/2/head -> origin/gh/oulgen/2/head 2025-03-14T05:31:37.6777309Z * [new branch] gh/oulgen/2/orig -> origin/gh/oulgen/2/orig 2025-03-14T05:31:37.6779674Z * [new branch] gh/oulgen/21/base -> origin/gh/oulgen/21/base 2025-03-14T05:31:37.6781370Z * [new branch] gh/oulgen/21/head -> origin/gh/oulgen/21/head 2025-03-14T05:31:37.6783117Z * [new branch] gh/oulgen/21/orig -> origin/gh/oulgen/21/orig 2025-03-14T05:31:37.6786049Z * [new branch] gh/pearu/108/base -> origin/gh/pearu/108/base 2025-03-14T05:31:37.6787752Z * [new branch] gh/pearu/108/head -> origin/gh/pearu/108/head 2025-03-14T05:31:37.6789447Z * [new branch] gh/pearu/108/orig -> origin/gh/pearu/108/orig 2025-03-14T05:31:37.6792185Z * [new branch] gh/pearu/56/base -> origin/gh/pearu/56/base 2025-03-14T05:31:37.6794023Z * [new branch] gh/pearu/56/head -> origin/gh/pearu/56/head 2025-03-14T05:31:37.6795898Z * [new branch] gh/pearu/56/orig -> origin/gh/pearu/56/orig 2025-03-14T05:31:37.6798599Z * [new branch] gh/pearu/97/base -> origin/gh/pearu/97/base 2025-03-14T05:31:37.6800223Z * [new branch] gh/pearu/97/head -> origin/gh/pearu/97/head 2025-03-14T05:31:37.6801867Z * [new branch] gh/pearu/97/orig -> origin/gh/pearu/97/orig 2025-03-14T05:31:37.6804731Z * [new branch] gh/peterbell10/603/base -> origin/gh/peterbell10/603/base 2025-03-14T05:31:37.6806481Z * [new branch] gh/peterbell10/603/head -> origin/gh/peterbell10/603/head 2025-03-14T05:31:37.6808168Z * [new branch] gh/peterbell10/603/orig -> origin/gh/peterbell10/603/orig 2025-03-14T05:31:37.6810484Z * [new branch] gh/peterbell10/635/base -> origin/gh/peterbell10/635/base 2025-03-14T05:31:37.6812238Z * [new branch] gh/peterbell10/635/head -> origin/gh/peterbell10/635/head 2025-03-14T05:31:37.6813872Z * [new branch] gh/peterbell10/635/orig -> origin/gh/peterbell10/635/orig 2025-03-14T05:31:37.6816242Z * [new branch] gh/peterbell10/636/base -> origin/gh/peterbell10/636/base 2025-03-14T05:31:37.6817581Z * [new branch] gh/peterbell10/636/head -> origin/gh/peterbell10/636/head 2025-03-14T05:31:37.6819485Z * [new branch] gh/peterbell10/636/orig -> origin/gh/peterbell10/636/orig 2025-03-14T05:31:37.6822206Z * [new branch] gh/qqaatw/26/base -> origin/gh/qqaatw/26/base 2025-03-14T05:31:37.6823805Z * [new branch] gh/qqaatw/26/head -> origin/gh/qqaatw/26/head 2025-03-14T05:31:37.6825503Z * [new branch] gh/qqaatw/26/orig -> origin/gh/qqaatw/26/orig 2025-03-14T05:31:37.6827890Z * [new branch] gh/raymo/log-graph-breaks -> origin/gh/raymo/log-graph-breaks 2025-03-14T05:31:37.6830514Z * [new branch] gh/rec/115/base -> origin/gh/rec/115/base 2025-03-14T05:31:37.6832216Z * [new branch] gh/rec/115/head -> origin/gh/rec/115/head 2025-03-14T05:31:37.6833856Z * [new branch] gh/rec/115/orig -> origin/gh/rec/115/orig 2025-03-14T05:31:37.6836352Z * [new branch] gh/rec/118/base -> origin/gh/rec/118/base 2025-03-14T05:31:37.6837896Z * [new branch] gh/rec/118/head -> origin/gh/rec/118/head 2025-03-14T05:31:37.6839648Z * [new branch] gh/rec/118/orig -> origin/gh/rec/118/orig 2025-03-14T05:31:37.6841775Z * [new branch] gh/rec/119/base -> origin/gh/rec/119/base 2025-03-14T05:31:37.6843429Z * [new branch] gh/rec/119/head -> origin/gh/rec/119/head 2025-03-14T05:31:37.6845011Z * [new branch] gh/rec/119/orig -> origin/gh/rec/119/orig 
2025-03-14T05:31:37.6847242Z * [new branch] gh/rec/120/base -> origin/gh/rec/120/base 2025-03-14T05:31:37.6848936Z * [new branch] gh/rec/120/head -> origin/gh/rec/120/head 2025-03-14T05:31:37.6850598Z * [new branch] gh/rec/120/orig -> origin/gh/rec/120/orig 2025-03-14T05:31:37.6852776Z * [new branch] gh/rec/124/base -> origin/gh/rec/124/base 2025-03-14T05:31:37.6854453Z * [new branch] gh/rec/124/head -> origin/gh/rec/124/head 2025-03-14T05:31:37.6856126Z * [new branch] gh/rec/124/orig -> origin/gh/rec/124/orig 2025-03-14T05:31:37.6859014Z * [new branch] gh/rec/125/base -> origin/gh/rec/125/base 2025-03-14T05:31:37.6860589Z * [new branch] gh/rec/125/head -> origin/gh/rec/125/head 2025-03-14T05:31:37.6868338Z * [new branch] gh/rec/125/orig -> origin/gh/rec/125/orig 2025-03-14T05:31:37.6869008Z * [new branch] gh/rec/128/base -> origin/gh/rec/128/base 2025-03-14T05:31:37.6869293Z * [new branch] gh/rec/128/head -> origin/gh/rec/128/head 2025-03-14T05:31:37.6869570Z * [new branch] gh/rec/128/orig -> origin/gh/rec/128/orig 2025-03-14T05:31:37.6870051Z * [new branch] gh/rec/129/base -> origin/gh/rec/129/base 2025-03-14T05:31:37.6872032Z * [new branch] gh/rec/129/head -> origin/gh/rec/129/head 2025-03-14T05:31:37.6873753Z * [new branch] gh/rec/129/orig -> origin/gh/rec/129/orig 2025-03-14T05:31:37.6876044Z * [new branch] gh/rec/132/base -> origin/gh/rec/132/base 2025-03-14T05:31:37.6877682Z * [new branch] gh/rec/132/head -> origin/gh/rec/132/head 2025-03-14T05:31:37.6879316Z * [new branch] gh/rec/132/orig -> origin/gh/rec/132/orig 2025-03-14T05:31:37.6881632Z * [new branch] gh/rec/133/base -> origin/gh/rec/133/base 2025-03-14T05:31:37.6883194Z * [new branch] gh/rec/133/head -> origin/gh/rec/133/head 2025-03-14T05:31:37.6884841Z * [new branch] gh/rec/133/orig -> origin/gh/rec/133/orig 2025-03-14T05:31:37.6887988Z * [new branch] gh/rec/134/base -> origin/gh/rec/134/base 2025-03-14T05:31:37.6888996Z * [new branch] gh/rec/134/head -> origin/gh/rec/134/head 2025-03-14T05:31:37.6890891Z * [new branch] gh/rec/134/orig -> origin/gh/rec/134/orig 2025-03-14T05:31:37.6893174Z * [new branch] gh/rec/135/base -> origin/gh/rec/135/base 2025-03-14T05:31:37.6894786Z * [new branch] gh/rec/135/head -> origin/gh/rec/135/head 2025-03-14T05:31:37.6896419Z * [new branch] gh/rec/135/orig -> origin/gh/rec/135/orig 2025-03-14T05:31:37.6898686Z * [new branch] gh/rec/136/base -> origin/gh/rec/136/base 2025-03-14T05:31:37.6900340Z * [new branch] gh/rec/136/head -> origin/gh/rec/136/head 2025-03-14T05:31:37.6902041Z * [new branch] gh/rec/136/orig -> origin/gh/rec/136/orig 2025-03-14T05:31:37.6904284Z * [new branch] gh/rec/137/base -> origin/gh/rec/137/base 2025-03-14T05:31:37.6905981Z * [new branch] gh/rec/137/head -> origin/gh/rec/137/head 2025-03-14T05:31:37.6907655Z * [new branch] gh/rec/137/orig -> origin/gh/rec/137/orig 2025-03-14T05:31:37.6910003Z * [new branch] gh/rec/27/base -> origin/gh/rec/27/base 2025-03-14T05:31:37.6911505Z * [new branch] gh/rec/27/head -> origin/gh/rec/27/head 2025-03-14T05:31:37.6913149Z * [new branch] gh/rec/27/orig -> origin/gh/rec/27/orig 2025-03-14T05:31:37.6916110Z * [new branch] gh/rohan-varma/742/base -> origin/gh/rohan-varma/742/base 2025-03-14T05:31:37.6917731Z * [new branch] gh/rohan-varma/742/head -> origin/gh/rohan-varma/742/head 2025-03-14T05:31:37.6919473Z * [new branch] gh/rohan-varma/742/orig -> origin/gh/rohan-varma/742/orig 2025-03-14T05:31:37.6922134Z * [new branch] gh/seemethere/10/base -> origin/gh/seemethere/10/base 2025-03-14T05:31:37.6923931Z * [new branch] 
gh/seemethere/10/head -> origin/gh/seemethere/10/head 2025-03-14T05:31:37.6925683Z * [new branch] gh/seemethere/10/orig -> origin/gh/seemethere/10/orig 2025-03-14T05:31:37.6927930Z * [new branch] gh/seemethere/11/base -> origin/gh/seemethere/11/base 2025-03-14T05:31:37.6929563Z * [new branch] gh/seemethere/11/head -> origin/gh/seemethere/11/head 2025-03-14T05:31:37.6931211Z * [new branch] gh/seemethere/11/orig -> origin/gh/seemethere/11/orig 2025-03-14T05:31:37.6933441Z * [new branch] gh/seemethere/12/base -> origin/gh/seemethere/12/base 2025-03-14T05:31:37.6935044Z * [new branch] gh/seemethere/12/head -> origin/gh/seemethere/12/head 2025-03-14T05:31:37.6936728Z * [new branch] gh/seemethere/12/orig -> origin/gh/seemethere/12/orig 2025-03-14T05:31:37.6939060Z * [new branch] gh/seemethere/13/base -> origin/gh/seemethere/13/base 2025-03-14T05:31:37.6940692Z * [new branch] gh/seemethere/13/head -> origin/gh/seemethere/13/head 2025-03-14T05:31:37.6942341Z * [new branch] gh/seemethere/13/orig -> origin/gh/seemethere/13/orig 2025-03-14T05:31:37.6944644Z * [new branch] gh/seemethere/14/base -> origin/gh/seemethere/14/base 2025-03-14T05:31:37.6946350Z * [new branch] gh/seemethere/14/head -> origin/gh/seemethere/14/head 2025-03-14T05:31:37.6948003Z * [new branch] gh/seemethere/14/orig -> origin/gh/seemethere/14/orig 2025-03-14T05:31:37.6950247Z * [new branch] gh/seemethere/15/base -> origin/gh/seemethere/15/base 2025-03-14T05:31:37.6951949Z * [new branch] gh/seemethere/15/head -> origin/gh/seemethere/15/head 2025-03-14T05:31:37.6953680Z * [new branch] gh/seemethere/15/orig -> origin/gh/seemethere/15/orig 2025-03-14T05:31:37.6956030Z * [new branch] gh/seemethere/16/base -> origin/gh/seemethere/16/base 2025-03-14T05:31:37.6957627Z * [new branch] gh/seemethere/16/head -> origin/gh/seemethere/16/head 2025-03-14T05:31:37.6959357Z * [new branch] gh/seemethere/16/orig -> origin/gh/seemethere/16/orig 2025-03-14T05:31:37.6961599Z * [new branch] gh/seemethere/17/base -> origin/gh/seemethere/17/base 2025-03-14T05:31:37.6963242Z * [new branch] gh/seemethere/17/head -> origin/gh/seemethere/17/head 2025-03-14T05:31:37.6965032Z * [new branch] gh/seemethere/17/orig -> origin/gh/seemethere/17/orig 2025-03-14T05:31:37.6967234Z * [new branch] gh/seemethere/18/base -> origin/gh/seemethere/18/base 2025-03-14T05:31:37.6969800Z * [new branch] gh/seemethere/18/head -> origin/gh/seemethere/18/head 2025-03-14T05:31:37.6974075Z * [new branch] gh/seemethere/18/orig -> origin/gh/seemethere/18/orig 2025-03-14T05:31:37.6976335Z * [new branch] gh/seemethere/19/base -> origin/gh/seemethere/19/base 2025-03-14T05:31:37.6978068Z * [new branch] gh/seemethere/19/head -> origin/gh/seemethere/19/head 2025-03-14T05:31:37.6980044Z * [new branch] gh/seemethere/19/orig -> origin/gh/seemethere/19/orig 2025-03-14T05:31:37.6982138Z * [new branch] gh/seemethere/20/base -> origin/gh/seemethere/20/base 2025-03-14T05:31:37.6983728Z * [new branch] gh/seemethere/20/head -> origin/gh/seemethere/20/head 2025-03-14T05:31:37.6985466Z * [new branch] gh/seemethere/20/orig -> origin/gh/seemethere/20/orig 2025-03-14T05:31:37.6987695Z * [new branch] gh/seemethere/7/base -> origin/gh/seemethere/7/base 2025-03-14T05:31:37.6989304Z * [new branch] gh/seemethere/7/head -> origin/gh/seemethere/7/head 2025-03-14T05:31:37.6990948Z * [new branch] gh/seemethere/7/orig -> origin/gh/seemethere/7/orig 2025-03-14T05:31:37.6993184Z * [new branch] gh/seemethere/8/base -> origin/gh/seemethere/8/base 2025-03-14T05:31:37.6995304Z * [new branch] gh/seemethere/8/head -> 
origin/gh/seemethere/8/head 2025-03-14T05:31:37.6996800Z * [new branch] gh/seemethere/8/orig -> origin/gh/seemethere/8/orig 2025-03-14T05:31:37.6999018Z * [new branch] gh/seemethere/9/base -> origin/gh/seemethere/9/base 2025-03-14T05:31:37.7000707Z * [new branch] gh/seemethere/9/head -> origin/gh/seemethere/9/head 2025-03-14T05:31:37.7002329Z * [new branch] gh/seemethere/9/orig -> origin/gh/seemethere/9/orig 2025-03-14T05:31:37.7005347Z * [new branch] gh/shunting314/145/base -> origin/gh/shunting314/145/base 2025-03-14T05:31:37.7007119Z * [new branch] gh/shunting314/145/head -> origin/gh/shunting314/145/head 2025-03-14T05:31:37.7008885Z * [new branch] gh/shunting314/145/orig -> origin/gh/shunting314/145/orig 2025-03-14T05:31:37.7011403Z * [new branch] gh/shunting314/151/base -> origin/gh/shunting314/151/base 2025-03-14T05:31:37.7013052Z * [new branch] gh/shunting314/151/head -> origin/gh/shunting314/151/head 2025-03-14T05:31:37.7014778Z * [new branch] gh/shunting314/151/orig -> origin/gh/shunting314/151/orig 2025-03-14T05:31:37.7017256Z * [new branch] gh/shunting314/176/base -> origin/gh/shunting314/176/base 2025-03-14T05:31:37.7018969Z * [new branch] gh/shunting314/176/head -> origin/gh/shunting314/176/head 2025-03-14T05:31:37.7020603Z * [new branch] gh/shunting314/176/orig -> origin/gh/shunting314/176/orig 2025-03-14T05:31:37.7022971Z * [new branch] gh/shunting314/199/base -> origin/gh/shunting314/199/base 2025-03-14T05:31:37.7024830Z * [new branch] gh/shunting314/199/head -> origin/gh/shunting314/199/head 2025-03-14T05:31:37.7026389Z * [new branch] gh/shunting314/199/orig -> origin/gh/shunting314/199/orig 2025-03-14T05:31:37.7028488Z * [new branch] gh/shunting314/200/base -> origin/gh/shunting314/200/base 2025-03-14T05:31:37.7030232Z * [new branch] gh/shunting314/200/head -> origin/gh/shunting314/200/head 2025-03-14T05:31:37.7032377Z * [new branch] gh/shunting314/201/base -> origin/gh/shunting314/201/base 2025-03-14T05:31:37.7033971Z * [new branch] gh/shunting314/201/head -> origin/gh/shunting314/201/head 2025-03-14T05:31:37.7035797Z * [new branch] gh/shunting314/201/orig -> origin/gh/shunting314/201/orig 2025-03-14T05:31:37.7038429Z * [new branch] gh/sijiac/1/base -> origin/gh/sijiac/1/base 2025-03-14T05:31:37.7040180Z * [new branch] gh/sijiac/1/head -> origin/gh/sijiac/1/head 2025-03-14T05:31:37.7042273Z * [new branch] gh/sijiac/2/base -> origin/gh/sijiac/2/base 2025-03-14T05:31:37.7043928Z * [new branch] gh/sijiac/2/head -> origin/gh/sijiac/2/head 2025-03-14T05:31:37.7046174Z * [new branch] gh/sijiac/3/base -> origin/gh/sijiac/3/base 2025-03-14T05:31:37.7047647Z * [new branch] gh/sijiac/3/head -> origin/gh/sijiac/3/head 2025-03-14T05:31:37.7050386Z * [new branch] gh/silverguo/1/base -> origin/gh/silverguo/1/base 2025-03-14T05:31:37.7052024Z * [new branch] gh/silverguo/1/head -> origin/gh/silverguo/1/head 2025-03-14T05:31:37.7054128Z * [new branch] gh/silverguo/2/base -> origin/gh/silverguo/2/base 2025-03-14T05:31:37.7055724Z * [new branch] gh/silverguo/2/head -> origin/gh/silverguo/2/head 2025-03-14T05:31:37.7057981Z * [new branch] gh/silverguo/3/base -> origin/gh/silverguo/3/base 2025-03-14T05:31:37.7060082Z * [new branch] gh/silverguo/3/head -> origin/gh/silverguo/3/head 2025-03-14T05:31:37.7062220Z * [new branch] gh/silverguo/4/base -> origin/gh/silverguo/4/base 2025-03-14T05:31:37.7063842Z * [new branch] gh/silverguo/4/head -> origin/gh/silverguo/4/head 2025-03-14T05:31:37.7066663Z * [new branch] gh/sinhaanhsul/1/base -> origin/gh/sinhaanhsul/1/base 2025-03-14T05:31:37.7068776Z * 
[new branch] gh/sinhaanhsul/1/head -> origin/gh/sinhaanhsul/1/head 2025-03-14T05:31:37.7071602Z * [new branch] gh/soulitzer/269/base -> origin/gh/soulitzer/269/base 2025-03-14T05:31:37.7073205Z * [new branch] gh/soulitzer/269/head -> origin/gh/soulitzer/269/head 2025-03-14T05:31:37.7074934Z * [new branch] gh/soulitzer/269/orig -> origin/gh/soulitzer/269/orig 2025-03-14T05:31:37.7077398Z * [new branch] gh/soulitzer/276/base -> origin/gh/soulitzer/276/base 2025-03-14T05:31:37.7079181Z * [new branch] gh/soulitzer/276/head -> origin/gh/soulitzer/276/head 2025-03-14T05:31:37.7081245Z * [new branch] gh/soulitzer/276/orig -> origin/gh/soulitzer/276/orig 2025-03-14T05:31:37.7083745Z * [new branch] gh/soulitzer/287/base -> origin/gh/soulitzer/287/base 2025-03-14T05:31:37.7085405Z * [new branch] gh/soulitzer/287/head -> origin/gh/soulitzer/287/head 2025-03-14T05:31:37.7087180Z * [new branch] gh/soulitzer/287/orig -> origin/gh/soulitzer/287/orig 2025-03-14T05:31:37.7089468Z * [new branch] gh/soulitzer/296/base -> origin/gh/soulitzer/296/base 2025-03-14T05:31:37.7091137Z * [new branch] gh/soulitzer/296/head -> origin/gh/soulitzer/296/head 2025-03-14T05:31:37.7092778Z * [new branch] gh/soulitzer/296/orig -> origin/gh/soulitzer/296/orig 2025-03-14T05:31:37.7095355Z * [new branch] gh/soulitzer/299/base -> origin/gh/soulitzer/299/base 2025-03-14T05:31:37.7096799Z * [new branch] gh/soulitzer/299/head -> origin/gh/soulitzer/299/head 2025-03-14T05:31:37.7098779Z * [new branch] gh/soulitzer/299/orig -> origin/gh/soulitzer/299/orig 2025-03-14T05:31:37.7101116Z * [new branch] gh/soulitzer/300/base -> origin/gh/soulitzer/300/base 2025-03-14T05:31:37.7102865Z * [new branch] gh/soulitzer/300/head -> origin/gh/soulitzer/300/head 2025-03-14T05:31:37.7104489Z * [new branch] gh/soulitzer/300/orig -> origin/gh/soulitzer/300/orig 2025-03-14T05:31:37.7107383Z * [new branch] gh/soulitzer/301/base -> origin/gh/soulitzer/301/base 2025-03-14T05:31:37.7109122Z * [new branch] gh/soulitzer/301/head -> origin/gh/soulitzer/301/head 2025-03-14T05:31:37.7110795Z * [new branch] gh/soulitzer/301/orig -> origin/gh/soulitzer/301/orig 2025-03-14T05:31:37.7113015Z * [new branch] gh/soulitzer/313/base -> origin/gh/soulitzer/313/base 2025-03-14T05:31:37.7114849Z * [new branch] gh/soulitzer/313/head -> origin/gh/soulitzer/313/head 2025-03-14T05:31:37.7116644Z * [new branch] gh/soulitzer/313/orig -> origin/gh/soulitzer/313/orig 2025-03-14T05:31:37.7119072Z * [new branch] gh/soulitzer/319/base -> origin/gh/soulitzer/319/base 2025-03-14T05:31:37.7120181Z * [new branch] gh/soulitzer/319/head -> origin/gh/soulitzer/319/head 2025-03-14T05:31:37.7122182Z * [new branch] gh/soulitzer/319/orig -> origin/gh/soulitzer/319/orig 2025-03-14T05:31:37.7124495Z * [new branch] gh/soulitzer/320/base -> origin/gh/soulitzer/320/base 2025-03-14T05:31:37.7126185Z * [new branch] gh/soulitzer/320/head -> origin/gh/soulitzer/320/head 2025-03-14T05:31:37.7127776Z * [new branch] gh/soulitzer/320/orig -> origin/gh/soulitzer/320/orig 2025-03-14T05:31:37.7130068Z * [new branch] gh/soulitzer/329/base -> origin/gh/soulitzer/329/base 2025-03-14T05:31:37.7131673Z * [new branch] gh/soulitzer/329/head -> origin/gh/soulitzer/329/head 2025-03-14T05:31:37.7133333Z * [new branch] gh/soulitzer/329/orig -> origin/gh/soulitzer/329/orig 2025-03-14T05:31:37.7135557Z * [new branch] gh/soulitzer/331/base -> origin/gh/soulitzer/331/base 2025-03-14T05:31:37.7137214Z * [new branch] gh/soulitzer/331/head -> origin/gh/soulitzer/331/head 2025-03-14T05:31:37.7138846Z * [new branch] 
gh/soulitzer/331/orig -> origin/gh/soulitzer/331/orig 2025-03-14T05:31:37.7141274Z * [new branch] gh/soulitzer/332/base -> origin/gh/soulitzer/332/base 2025-03-14T05:31:37.7142819Z * [new branch] gh/soulitzer/332/head -> origin/gh/soulitzer/332/head 2025-03-14T05:31:37.7144437Z * [new branch] gh/soulitzer/332/orig -> origin/gh/soulitzer/332/orig 2025-03-14T05:31:37.7146774Z * [new branch] gh/soulitzer/335/base -> origin/gh/soulitzer/335/base 2025-03-14T05:31:37.7148619Z * [new branch] gh/soulitzer/335/head -> origin/gh/soulitzer/335/head 2025-03-14T05:31:37.7150271Z * [new branch] gh/soulitzer/335/orig -> origin/gh/soulitzer/335/orig 2025-03-14T05:31:37.7152647Z * [new branch] gh/soulitzer/336/base -> origin/gh/soulitzer/336/base 2025-03-14T05:31:37.7154374Z * [new branch] gh/soulitzer/336/head -> origin/gh/soulitzer/336/head 2025-03-14T05:31:37.7155965Z * [new branch] gh/soulitzer/336/orig -> origin/gh/soulitzer/336/orig 2025-03-14T05:31:37.7158256Z * [new branch] gh/soulitzer/347/base -> origin/gh/soulitzer/347/base 2025-03-14T05:31:37.7160363Z * [new branch] gh/soulitzer/347/head -> origin/gh/soulitzer/347/head 2025-03-14T05:31:37.7162102Z * [new branch] gh/soulitzer/347/orig -> origin/gh/soulitzer/347/orig 2025-03-14T05:31:37.7165029Z * [new branch] gh/soulitzer/349/base -> origin/gh/soulitzer/349/base 2025-03-14T05:31:37.7166758Z * [new branch] gh/soulitzer/349/head -> origin/gh/soulitzer/349/head 2025-03-14T05:31:37.7168834Z * [new branch] gh/soulitzer/349/orig -> origin/gh/soulitzer/349/orig 2025-03-14T05:31:37.7171191Z * [new branch] gh/soulitzer/350/base -> origin/gh/soulitzer/350/base 2025-03-14T05:31:37.7172825Z * [new branch] gh/soulitzer/350/head -> origin/gh/soulitzer/350/head 2025-03-14T05:31:37.7175068Z * [new branch] gh/soulitzer/350/orig -> origin/gh/soulitzer/350/orig 2025-03-14T05:31:37.7177507Z * [new branch] gh/soulitzer/351/base -> origin/gh/soulitzer/351/base 2025-03-14T05:31:37.7179203Z * [new branch] gh/soulitzer/351/head -> origin/gh/soulitzer/351/head 2025-03-14T05:31:37.7180820Z * [new branch] gh/soulitzer/351/orig -> origin/gh/soulitzer/351/orig 2025-03-14T05:31:37.7183116Z * [new branch] gh/soulitzer/353/base -> origin/gh/soulitzer/353/base 2025-03-14T05:31:37.7185178Z * [new branch] gh/soulitzer/353/head -> origin/gh/soulitzer/353/head 2025-03-14T05:31:37.7186601Z * [new branch] gh/soulitzer/353/orig -> origin/gh/soulitzer/353/orig 2025-03-14T05:31:37.7189412Z * [new branch] gh/suo/619/base -> origin/gh/suo/619/base 2025-03-14T05:31:37.7192669Z * [new branch] gh/swolchok/704/base -> origin/gh/swolchok/704/base 2025-03-14T05:31:37.7194533Z * [new branch] gh/swolchok/704/orig -> origin/gh/swolchok/704/orig 2025-03-14T05:31:37.7197023Z * [new branch] gh/swolchok/722/base -> origin/gh/swolchok/722/base 2025-03-14T05:31:37.7198694Z * [new branch] gh/swolchok/722/head -> origin/gh/swolchok/722/head 2025-03-14T05:31:37.7200360Z * [new branch] gh/swolchok/722/orig -> origin/gh/swolchok/722/orig 2025-03-14T05:31:37.7202534Z * [new branch] gh/swolchok/723/base -> origin/gh/swolchok/723/base 2025-03-14T05:31:37.7204267Z * [new branch] gh/swolchok/723/head -> origin/gh/swolchok/723/head 2025-03-14T05:31:37.7205968Z * [new branch] gh/swolchok/723/orig -> origin/gh/swolchok/723/orig 2025-03-14T05:31:37.7208670Z * [new branch] gh/syed-ahmed/1/base -> origin/gh/syed-ahmed/1/base 2025-03-14T05:31:37.7210389Z * [new branch] gh/syed-ahmed/1/head -> origin/gh/syed-ahmed/1/head 2025-03-14T05:31:37.7211999Z * [new branch] gh/syed-ahmed/1/orig -> origin/gh/syed-ahmed/1/orig 
2025-03-14T05:31:37.7214156Z * [new branch] gh/syed-ahmed/2/base -> origin/gh/syed-ahmed/2/base 2025-03-14T05:31:37.7215946Z * [new branch] gh/syed-ahmed/2/head -> origin/gh/syed-ahmed/2/head 2025-03-14T05:31:37.7217618Z * [new branch] gh/syed-ahmed/2/orig -> origin/gh/syed-ahmed/2/orig 2025-03-14T05:31:37.7220359Z * [new branch] gh/tianyu-l/2/base -> origin/gh/tianyu-l/2/base 2025-03-14T05:31:37.7222021Z * [new branch] gh/tianyu-l/2/head -> origin/gh/tianyu-l/2/head 2025-03-14T05:31:37.7223627Z * [new branch] gh/tianyu-l/2/orig -> origin/gh/tianyu-l/2/orig 2025-03-14T05:31:37.7225956Z * [new branch] gh/tianyu-l/6/base -> origin/gh/tianyu-l/6/base 2025-03-14T05:31:37.7227639Z * [new branch] gh/tianyu-l/6/head -> origin/gh/tianyu-l/6/head 2025-03-14T05:31:37.7229310Z * [new branch] gh/tianyu-l/6/orig -> origin/gh/tianyu-l/6/orig 2025-03-14T05:31:37.7231516Z * [new branch] gh/tianyu-l/7/base -> origin/gh/tianyu-l/7/base 2025-03-14T05:31:37.7233174Z * [new branch] gh/tianyu-l/7/head -> origin/gh/tianyu-l/7/head 2025-03-14T05:31:37.7235413Z * [new branch] gh/tianyu-l/7/orig -> origin/gh/tianyu-l/7/orig 2025-03-14T05:31:37.7238322Z * [new branch] gh/tugsbayasgalan/155/base -> origin/gh/tugsbayasgalan/155/base 2025-03-14T05:31:37.7239992Z * [new branch] gh/tugsbayasgalan/155/head -> origin/gh/tugsbayasgalan/155/head 2025-03-14T05:31:37.7241636Z * [new branch] gh/tugsbayasgalan/155/orig -> origin/gh/tugsbayasgalan/155/orig 2025-03-14T05:31:37.7243953Z * [new branch] gh/tugsbayasgalan/162/base -> origin/gh/tugsbayasgalan/162/base 2025-03-14T05:31:37.7245521Z * [new branch] gh/tugsbayasgalan/162/head -> origin/gh/tugsbayasgalan/162/head 2025-03-14T05:31:37.7247212Z * [new branch] gh/tugsbayasgalan/162/orig -> origin/gh/tugsbayasgalan/162/orig 2025-03-14T05:31:37.7249534Z * [new branch] gh/tugsbayasgalan/277/base -> origin/gh/tugsbayasgalan/277/base 2025-03-14T05:31:37.7251106Z * [new branch] gh/tugsbayasgalan/277/head -> origin/gh/tugsbayasgalan/277/head 2025-03-14T05:31:37.7252717Z * [new branch] gh/tugsbayasgalan/277/orig -> origin/gh/tugsbayasgalan/277/orig 2025-03-14T05:31:37.7255408Z * [new branch] gh/tugsbayasgalan/282/base -> origin/gh/tugsbayasgalan/282/base 2025-03-14T05:31:37.7257168Z * [new branch] gh/tugsbayasgalan/282/head -> origin/gh/tugsbayasgalan/282/head 2025-03-14T05:31:37.7258816Z * [new branch] gh/tugsbayasgalan/282/orig -> origin/gh/tugsbayasgalan/282/orig 2025-03-14T05:31:37.7261263Z * [new branch] gh/tugsbayasgalan/290/base -> origin/gh/tugsbayasgalan/290/base 2025-03-14T05:31:37.7262954Z * [new branch] gh/tugsbayasgalan/290/head -> origin/gh/tugsbayasgalan/290/head 2025-03-14T05:31:37.7264646Z * [new branch] gh/tugsbayasgalan/290/orig -> origin/gh/tugsbayasgalan/290/orig 2025-03-14T05:31:37.7266977Z * [new branch] gh/tugsbayasgalan/291/base -> origin/gh/tugsbayasgalan/291/base 2025-03-14T05:31:37.7268899Z * [new branch] gh/tugsbayasgalan/291/head -> origin/gh/tugsbayasgalan/291/head 2025-03-14T05:31:37.7273547Z * [new branch] gh/tugsbayasgalan/291/orig -> origin/gh/tugsbayasgalan/291/orig 2025-03-14T05:31:37.7276068Z * [new branch] gh/tugsbayasgalan/292/base -> origin/gh/tugsbayasgalan/292/base 2025-03-14T05:31:37.7277633Z * [new branch] gh/tugsbayasgalan/292/head -> origin/gh/tugsbayasgalan/292/head 2025-03-14T05:31:37.7279233Z * [new branch] gh/tugsbayasgalan/292/orig -> origin/gh/tugsbayasgalan/292/orig 2025-03-14T05:31:37.7281499Z * [new branch] gh/tugsbayasgalan/293/base -> origin/gh/tugsbayasgalan/293/base 2025-03-14T05:31:37.7283206Z * [new branch] 
gh/tugsbayasgalan/293/head -> origin/gh/tugsbayasgalan/293/head 2025-03-14T05:31:37.7284844Z * [new branch] gh/tugsbayasgalan/293/orig -> origin/gh/tugsbayasgalan/293/orig 2025-03-14T05:31:37.7287216Z * [new branch] gh/tugsbayasgalan/294/base -> origin/gh/tugsbayasgalan/294/base 2025-03-14T05:31:37.7288806Z * [new branch] gh/tugsbayasgalan/294/head -> origin/gh/tugsbayasgalan/294/head 2025-03-14T05:31:37.7290438Z * [new branch] gh/tugsbayasgalan/294/orig -> origin/gh/tugsbayasgalan/294/orig 2025-03-14T05:31:37.7292726Z * [new branch] gh/tugsbayasgalan/295/base -> origin/gh/tugsbayasgalan/295/base 2025-03-14T05:31:37.7294535Z * [new branch] gh/tugsbayasgalan/295/head -> origin/gh/tugsbayasgalan/295/head 2025-03-14T05:31:37.7296219Z * [new branch] gh/tugsbayasgalan/295/orig -> origin/gh/tugsbayasgalan/295/orig 2025-03-14T05:31:37.7298518Z * [new branch] gh/tugsbayasgalan/296/base -> origin/gh/tugsbayasgalan/296/base 2025-03-14T05:31:37.7300258Z * [new branch] gh/tugsbayasgalan/296/head -> origin/gh/tugsbayasgalan/296/head 2025-03-14T05:31:37.7301897Z * [new branch] gh/tugsbayasgalan/296/orig -> origin/gh/tugsbayasgalan/296/orig 2025-03-14T05:31:37.7304399Z * [new branch] gh/tugsbayasgalan/297/base -> origin/gh/tugsbayasgalan/297/base 2025-03-14T05:31:37.7305993Z * [new branch] gh/tugsbayasgalan/297/head -> origin/gh/tugsbayasgalan/297/head 2025-03-14T05:31:37.7307689Z * [new branch] gh/tugsbayasgalan/297/orig -> origin/gh/tugsbayasgalan/297/orig 2025-03-14T05:31:37.7310063Z * [new branch] gh/tugsbayasgalan/298/base -> origin/gh/tugsbayasgalan/298/base 2025-03-14T05:31:37.7311720Z * [new branch] gh/tugsbayasgalan/298/head -> origin/gh/tugsbayasgalan/298/head 2025-03-14T05:31:37.7313375Z * [new branch] gh/tugsbayasgalan/298/orig -> origin/gh/tugsbayasgalan/298/orig 2025-03-14T05:31:37.7315737Z * [new branch] gh/tugsbayasgalan/299/base -> origin/gh/tugsbayasgalan/299/base 2025-03-14T05:31:37.7317359Z * [new branch] gh/tugsbayasgalan/299/head -> origin/gh/tugsbayasgalan/299/head 2025-03-14T05:31:37.7318978Z * [new branch] gh/tugsbayasgalan/299/orig -> origin/gh/tugsbayasgalan/299/orig 2025-03-14T05:31:37.7321938Z * [new branch] gh/vkuzo/1/head -> origin/gh/vkuzo/1/head 2025-03-14T05:31:37.7322944Z * [new branch] gh/vkuzo/1/next -> origin/gh/vkuzo/1/next 2025-03-14T05:31:37.7324897Z * [new branch] gh/vkuzo/1/orig -> origin/gh/vkuzo/1/orig 2025-03-14T05:31:37.7327123Z * [new branch] gh/vkuzo/2/head -> origin/gh/vkuzo/2/head 2025-03-14T05:31:37.7328622Z * [new branch] gh/vkuzo/2/next -> origin/gh/vkuzo/2/next 2025-03-14T05:31:37.7330362Z * [new branch] gh/vkuzo/2/orig -> origin/gh/vkuzo/2/orig 2025-03-14T05:31:37.7332588Z * [new branch] gh/vkuzo/3/head -> origin/gh/vkuzo/3/head 2025-03-14T05:31:37.7334139Z * [new branch] gh/vkuzo/3/next -> origin/gh/vkuzo/3/next 2025-03-14T05:31:37.7335790Z * [new branch] gh/vkuzo/3/orig -> origin/gh/vkuzo/3/orig 2025-03-14T05:31:37.7338023Z * [new branch] gh/vkuzo/4/base -> origin/gh/vkuzo/4/base 2025-03-14T05:31:37.7339690Z * [new branch] gh/vkuzo/4/head -> origin/gh/vkuzo/4/head 2025-03-14T05:31:37.7341297Z * [new branch] gh/vkuzo/4/orig -> origin/gh/vkuzo/4/orig 2025-03-14T05:31:37.7343719Z * [new branch] gh/vkuzo/5/base -> origin/gh/vkuzo/5/base 2025-03-14T05:31:37.7345351Z * [new branch] gh/vkuzo/5/head -> origin/gh/vkuzo/5/head 2025-03-14T05:31:37.7347108Z * [new branch] gh/vkuzo/5/orig -> origin/gh/vkuzo/5/orig 2025-03-14T05:31:37.7349389Z * [new branch] gh/vkuzo/6/base -> origin/gh/vkuzo/6/base 2025-03-14T05:31:37.7350967Z * [new branch] gh/vkuzo/6/head -> 
origin/gh/vkuzo/6/head 2025-03-14T05:31:37.7352738Z * [new branch] gh/vkuzo/6/orig -> origin/gh/vkuzo/6/orig 2025-03-14T05:31:37.7355085Z * [new branch] gh/vkuzo/7/base -> origin/gh/vkuzo/7/base 2025-03-14T05:31:37.7356763Z * [new branch] gh/vkuzo/7/head -> origin/gh/vkuzo/7/head 2025-03-14T05:31:37.7358392Z * [new branch] gh/vkuzo/7/orig -> origin/gh/vkuzo/7/orig 2025-03-14T05:31:37.7360643Z * [new branch] gh/vkuzo/8/base -> origin/gh/vkuzo/8/base 2025-03-14T05:31:37.7362298Z * [new branch] gh/vkuzo/8/head -> origin/gh/vkuzo/8/head 2025-03-14T05:31:37.7363968Z * [new branch] gh/vkuzo/8/orig -> origin/gh/vkuzo/8/orig 2025-03-14T05:31:37.7366251Z * [new branch] gh/vkuzo/9/base -> origin/gh/vkuzo/9/base 2025-03-14T05:31:37.7368077Z * [new branch] gh/vkuzo/9/head -> origin/gh/vkuzo/9/head 2025-03-14T05:31:37.7370512Z * [new branch] gh/vkuzo/9/orig -> origin/gh/vkuzo/9/orig 2025-03-14T05:31:37.7373322Z * [new branch] gh/vmoens/10/base -> origin/gh/vmoens/10/base 2025-03-14T05:31:37.7375005Z * [new branch] gh/vmoens/10/head -> origin/gh/vmoens/10/head 2025-03-14T05:31:37.7376632Z * [new branch] gh/vmoens/10/orig -> origin/gh/vmoens/10/orig 2025-03-14T05:31:37.7378877Z * [new branch] gh/vmoens/15/base -> origin/gh/vmoens/15/base 2025-03-14T05:31:37.7380561Z * [new branch] gh/vmoens/15/head -> origin/gh/vmoens/15/head 2025-03-14T05:31:37.7382147Z * [new branch] gh/vmoens/15/orig -> origin/gh/vmoens/15/orig 2025-03-14T05:31:37.7384587Z * [new branch] gh/vmoens/16/base -> origin/gh/vmoens/16/base 2025-03-14T05:31:37.7386435Z * [new branch] gh/vmoens/16/head -> origin/gh/vmoens/16/head 2025-03-14T05:31:37.7388158Z * [new branch] gh/vmoens/16/orig -> origin/gh/vmoens/16/orig 2025-03-14T05:31:37.7390540Z * [new branch] gh/vmoens/17/base -> origin/gh/vmoens/17/base 2025-03-14T05:31:37.7391993Z * [new branch] gh/vmoens/17/head -> origin/gh/vmoens/17/head 2025-03-14T05:31:37.7393703Z * [new branch] gh/vmoens/17/orig -> origin/gh/vmoens/17/orig 2025-03-14T05:31:37.7396175Z * [new branch] gh/vmoens/18/base -> origin/gh/vmoens/18/base 2025-03-14T05:31:37.7397782Z * [new branch] gh/vmoens/18/head -> origin/gh/vmoens/18/head 2025-03-14T05:31:37.7399529Z * [new branch] gh/vmoens/18/orig -> origin/gh/vmoens/18/orig 2025-03-14T05:31:37.7402307Z * [new branch] gh/vmoens/19/base -> origin/gh/vmoens/19/base 2025-03-14T05:31:37.7403935Z * [new branch] gh/vmoens/19/head -> origin/gh/vmoens/19/head 2025-03-14T05:31:37.7405574Z * [new branch] gh/vmoens/19/orig -> origin/gh/vmoens/19/orig 2025-03-14T05:31:37.7407832Z * [new branch] gh/vmoens/20/base -> origin/gh/vmoens/20/base 2025-03-14T05:31:37.7409468Z * [new branch] gh/vmoens/20/head -> origin/gh/vmoens/20/head 2025-03-14T05:31:37.7411087Z * [new branch] gh/vmoens/20/orig -> origin/gh/vmoens/20/orig 2025-03-14T05:31:37.7413885Z * [new branch] gh/voznesenskym/231/base -> origin/gh/voznesenskym/231/base 2025-03-14T05:31:37.7415669Z * [new branch] gh/voznesenskym/231/head -> origin/gh/voznesenskym/231/head 2025-03-14T05:31:37.7417303Z * [new branch] gh/voznesenskym/231/orig -> origin/gh/voznesenskym/231/orig 2025-03-14T05:31:37.7419779Z * [new branch] gh/voznesenskym/254/base -> origin/gh/voznesenskym/254/base 2025-03-14T05:31:37.7421530Z * [new branch] gh/voznesenskym/254/head -> origin/gh/voznesenskym/254/head 2025-03-14T05:31:37.7423178Z * [new branch] gh/voznesenskym/254/orig -> origin/gh/voznesenskym/254/orig 2025-03-14T05:31:37.7425971Z * [new branch] gh/wanchaol/360/base -> origin/gh/wanchaol/360/base 2025-03-14T05:31:37.7427838Z * [new branch] 
gh/wanchaol/360/head -> origin/gh/wanchaol/360/head 2025-03-14T05:31:37.7429492Z * [new branch] gh/wanchaol/360/orig -> origin/gh/wanchaol/360/orig 2025-03-14T05:31:37.7431859Z * [new branch] gh/wanchaol/367/base -> origin/gh/wanchaol/367/base 2025-03-14T05:31:37.7433602Z * [new branch] gh/wanchaol/367/head -> origin/gh/wanchaol/367/head 2025-03-14T05:31:37.7435524Z * [new branch] gh/wanchaol/367/orig -> origin/gh/wanchaol/367/orig 2025-03-14T05:31:37.7437894Z * [new branch] gh/wanchaol/368/base -> origin/gh/wanchaol/368/base 2025-03-14T05:31:37.7439568Z * [new branch] gh/wanchaol/368/head -> origin/gh/wanchaol/368/head 2025-03-14T05:31:37.7441248Z * [new branch] gh/wanchaol/368/orig -> origin/gh/wanchaol/368/orig 2025-03-14T05:31:37.7444113Z * [new branch] gh/wconstab/204/base -> origin/gh/wconstab/204/base 2025-03-14T05:31:37.7445979Z * [new branch] gh/wconstab/204/orig -> origin/gh/wconstab/204/orig 2025-03-14T05:31:37.7448204Z * [new branch] gh/wconstab/380/base -> origin/gh/wconstab/380/base 2025-03-14T05:31:37.7450401Z * [new branch] gh/wconstab/380/head -> origin/gh/wconstab/380/head 2025-03-14T05:31:37.7452084Z * [new branch] gh/wconstab/380/orig -> origin/gh/wconstab/380/orig 2025-03-14T05:31:37.7454517Z * [new branch] gh/wconstab/392/base -> origin/gh/wconstab/392/base 2025-03-14T05:31:37.7456088Z * [new branch] gh/wconstab/392/head -> origin/gh/wconstab/392/head 2025-03-14T05:31:37.7457776Z * [new branch] gh/wconstab/392/orig -> origin/gh/wconstab/392/orig 2025-03-14T05:31:37.7460455Z * [new branch] gh/wconstab/395/base -> origin/gh/wconstab/395/base 2025-03-14T05:31:37.7462543Z * [new branch] gh/wconstab/395/head -> origin/gh/wconstab/395/head 2025-03-14T05:31:37.7464233Z * [new branch] gh/wconstab/395/orig -> origin/gh/wconstab/395/orig 2025-03-14T05:31:37.7467402Z * [new branch] gh/wconstab/396/base -> origin/gh/wconstab/396/base 2025-03-14T05:31:37.7469241Z * [new branch] gh/wconstab/396/head -> origin/gh/wconstab/396/head 2025-03-14T05:31:37.7471141Z * [new branch] gh/wconstab/396/orig -> origin/gh/wconstab/396/orig 2025-03-14T05:31:37.7473497Z * [new branch] gh/wconstab/397/base -> origin/gh/wconstab/397/base 2025-03-14T05:31:37.7475237Z * [new branch] gh/wconstab/397/head -> origin/gh/wconstab/397/head 2025-03-14T05:31:37.7476886Z * [new branch] gh/wconstab/397/orig -> origin/gh/wconstab/397/orig 2025-03-14T05:31:37.7479687Z * [new branch] gh/weifengpy/21/base -> origin/gh/weifengpy/21/base 2025-03-14T05:31:37.7481337Z * [new branch] gh/weifengpy/21/head -> origin/gh/weifengpy/21/head 2025-03-14T05:31:37.7483017Z * [new branch] gh/weifengpy/21/orig -> origin/gh/weifengpy/21/orig 2025-03-14T05:31:37.7485284Z * [new branch] gh/weifengpy/22/base -> origin/gh/weifengpy/22/base 2025-03-14T05:31:37.7486968Z * [new branch] gh/weifengpy/22/head -> origin/gh/weifengpy/22/head 2025-03-14T05:31:37.7488778Z * [new branch] gh/weifengpy/22/orig -> origin/gh/weifengpy/22/orig 2025-03-14T05:31:37.7491643Z * [new branch] gh/williamwen42/196/base -> origin/gh/williamwen42/196/base 2025-03-14T05:31:37.7493230Z * [new branch] gh/williamwen42/196/head -> origin/gh/williamwen42/196/head 2025-03-14T05:31:37.7494913Z * [new branch] gh/williamwen42/196/orig -> origin/gh/williamwen42/196/orig 2025-03-14T05:31:37.7498154Z * [new branch] gh/williamwen42/197/base -> origin/gh/williamwen42/197/base 2025-03-14T05:31:37.7499846Z * [new branch] gh/williamwen42/197/head -> origin/gh/williamwen42/197/head 2025-03-14T05:31:37.7501433Z * [new branch] gh/williamwen42/197/orig -> 
origin/gh/williamwen42/197/orig 2025-03-14T05:31:37.7504337Z * [new branch] gh/williamwen42/199/base -> origin/gh/williamwen42/199/base 2025-03-14T05:31:37.7505973Z * [new branch] gh/williamwen42/199/head -> origin/gh/williamwen42/199/head 2025-03-14T05:31:37.7507642Z * [new branch] gh/williamwen42/199/orig -> origin/gh/williamwen42/199/orig 2025-03-14T05:31:37.7510549Z * [new branch] gh/williamwen42/200/base -> origin/gh/williamwen42/200/base 2025-03-14T05:31:37.7512405Z * [new branch] gh/williamwen42/200/head -> origin/gh/williamwen42/200/head 2025-03-14T05:31:37.7514192Z * [new branch] gh/williamwen42/200/orig -> origin/gh/williamwen42/200/orig 2025-03-14T05:31:37.7516606Z * [new branch] gh/williamwen42/201/base -> origin/gh/williamwen42/201/base 2025-03-14T05:31:37.7518263Z * [new branch] gh/williamwen42/201/head -> origin/gh/williamwen42/201/head 2025-03-14T05:31:37.7519947Z * [new branch] gh/williamwen42/201/orig -> origin/gh/williamwen42/201/orig 2025-03-14T05:31:37.7522155Z * [new branch] gh/williamwen42/204/base -> origin/gh/williamwen42/204/base 2025-03-14T05:31:37.7523781Z * [new branch] gh/williamwen42/204/head -> origin/gh/williamwen42/204/head 2025-03-14T05:31:37.7525536Z * [new branch] gh/williamwen42/204/orig -> origin/gh/williamwen42/204/orig 2025-03-14T05:31:37.7527938Z * [new branch] gh/williamwen42/205/base -> origin/gh/williamwen42/205/base 2025-03-14T05:31:37.7529708Z * [new branch] gh/williamwen42/205/head -> origin/gh/williamwen42/205/head 2025-03-14T05:31:37.7531514Z * [new branch] gh/williamwen42/205/orig -> origin/gh/williamwen42/205/orig 2025-03-14T05:31:37.7533811Z * [new branch] gh/williamwen42/206/base -> origin/gh/williamwen42/206/base 2025-03-14T05:31:37.7535640Z * [new branch] gh/williamwen42/206/head -> origin/gh/williamwen42/206/head 2025-03-14T05:31:37.7537299Z * [new branch] gh/williamwen42/206/orig -> origin/gh/williamwen42/206/orig 2025-03-14T05:31:37.7539549Z * [new branch] gh/williamwen42/207/base -> origin/gh/williamwen42/207/base 2025-03-14T05:31:37.7541357Z * [new branch] gh/williamwen42/207/head -> origin/gh/williamwen42/207/head 2025-03-14T05:31:37.7542977Z * [new branch] gh/williamwen42/207/orig -> origin/gh/williamwen42/207/orig 2025-03-14T05:31:37.7545440Z * [new branch] gh/williamwen42/208/base -> origin/gh/williamwen42/208/base 2025-03-14T05:31:37.7547126Z * [new branch] gh/williamwen42/208/head -> origin/gh/williamwen42/208/head 2025-03-14T05:31:37.7548772Z * [new branch] gh/williamwen42/208/orig -> origin/gh/williamwen42/208/orig 2025-03-14T05:31:37.7550905Z * [new branch] gh/williamwen42/209/base -> origin/gh/williamwen42/209/base 2025-03-14T05:31:37.7552627Z * [new branch] gh/williamwen42/209/head -> origin/gh/williamwen42/209/head 2025-03-14T05:31:37.7554315Z * [new branch] gh/williamwen42/209/orig -> origin/gh/williamwen42/209/orig 2025-03-14T05:31:37.7556731Z * [new branch] gh/williamwen42/210/base -> origin/gh/williamwen42/210/base 2025-03-14T05:31:37.7558440Z * [new branch] gh/williamwen42/210/head -> origin/gh/williamwen42/210/head 2025-03-14T05:31:37.7560086Z * [new branch] gh/williamwen42/210/orig -> origin/gh/williamwen42/210/orig 2025-03-14T05:31:37.7562270Z * [new branch] gh/williamwen42/211/base -> origin/gh/williamwen42/211/base 2025-03-14T05:31:37.7563846Z * [new branch] gh/williamwen42/211/head -> origin/gh/williamwen42/211/head 2025-03-14T05:31:37.7565506Z * [new branch] gh/williamwen42/211/orig -> origin/gh/williamwen42/211/orig 2025-03-14T05:31:37.7568074Z * [new branch] gh/williamwen42/212/base -> 
origin/gh/williamwen42/212/base 2025-03-14T05:31:37.7571389Z * [new branch] gh/williamwen42/212/head -> origin/gh/williamwen42/212/head 2025-03-14T05:31:37.7573078Z * [new branch] gh/williamwen42/212/orig -> origin/gh/williamwen42/212/orig 2025-03-14T05:31:37.7575381Z * [new branch] gh/williamwen42/213/base -> origin/gh/williamwen42/213/base 2025-03-14T05:31:37.7577118Z * [new branch] gh/williamwen42/213/head -> origin/gh/williamwen42/213/head 2025-03-14T05:31:37.7578773Z * [new branch] gh/williamwen42/213/orig -> origin/gh/williamwen42/213/orig 2025-03-14T05:31:37.7581212Z * [new branch] gh/williamwen42/214/base -> origin/gh/williamwen42/214/base 2025-03-14T05:31:37.7582909Z * [new branch] gh/williamwen42/214/head -> origin/gh/williamwen42/214/head 2025-03-14T05:31:37.7584519Z * [new branch] gh/williamwen42/214/orig -> origin/gh/williamwen42/214/orig 2025-03-14T05:31:37.7587412Z * [new branch] gh/williamwen42/215/base -> origin/gh/williamwen42/215/base 2025-03-14T05:31:37.7589177Z * [new branch] gh/williamwen42/215/head -> origin/gh/williamwen42/215/head 2025-03-14T05:31:37.7590832Z * [new branch] gh/williamwen42/215/orig -> origin/gh/williamwen42/215/orig 2025-03-14T05:31:37.7593343Z * [new branch] gh/williamwen42/216/base -> origin/gh/williamwen42/216/base 2025-03-14T05:31:37.7595595Z * [new branch] gh/williamwen42/216/head -> origin/gh/williamwen42/216/head 2025-03-14T05:31:37.7597283Z * [new branch] gh/williamwen42/216/orig -> origin/gh/williamwen42/216/orig 2025-03-14T05:31:37.7599864Z * [new branch] gh/williamwen42/217/base -> origin/gh/williamwen42/217/base 2025-03-14T05:31:37.7601363Z * [new branch] gh/williamwen42/217/head -> origin/gh/williamwen42/217/head 2025-03-14T05:31:37.7603024Z * [new branch] gh/williamwen42/217/orig -> origin/gh/williamwen42/217/orig 2025-03-14T05:31:37.7605577Z * [new branch] gh/williamwen42/218/base -> origin/gh/williamwen42/218/base 2025-03-14T05:31:37.7607574Z * [new branch] gh/williamwen42/218/head -> origin/gh/williamwen42/218/head 2025-03-14T05:31:37.7609252Z * [new branch] gh/williamwen42/218/orig -> origin/gh/williamwen42/218/orig 2025-03-14T05:31:37.7611599Z * [new branch] gh/williamwen42/219/base -> origin/gh/williamwen42/219/base 2025-03-14T05:31:37.7613349Z * [new branch] gh/williamwen42/219/head -> origin/gh/williamwen42/219/head 2025-03-14T05:31:37.7615150Z * [new branch] gh/williamwen42/219/orig -> origin/gh/williamwen42/219/orig 2025-03-14T05:31:37.7617812Z * [new branch] gh/wz337/2/base -> origin/gh/wz337/2/base 2025-03-14T05:31:37.7619477Z * [new branch] gh/wz337/2/head -> origin/gh/wz337/2/head 2025-03-14T05:31:37.7621574Z * [new branch] gh/wz337/3/base -> origin/gh/wz337/3/base 2025-03-14T05:31:37.7623186Z * [new branch] gh/wz337/3/head -> origin/gh/wz337/3/head 2025-03-14T05:31:37.7625969Z * [new branch] gh/xmfan/138/base -> origin/gh/xmfan/138/base 2025-03-14T05:31:37.7627752Z * [new branch] gh/xmfan/138/head -> origin/gh/xmfan/138/head 2025-03-14T05:31:37.7629388Z * [new branch] gh/xmfan/138/orig -> origin/gh/xmfan/138/orig 2025-03-14T05:31:37.7631813Z * [new branch] gh/xmfan/140/base -> origin/gh/xmfan/140/base 2025-03-14T05:31:37.7633434Z * [new branch] gh/xmfan/140/head -> origin/gh/xmfan/140/head 2025-03-14T05:31:37.7635281Z * [new branch] gh/xmfan/140/orig -> origin/gh/xmfan/140/orig 2025-03-14T05:31:37.7637479Z * [new branch] gh/xmfan/157/base -> origin/gh/xmfan/157/base 2025-03-14T05:31:37.7639109Z * [new branch] gh/xmfan/157/head -> origin/gh/xmfan/157/head 2025-03-14T05:31:37.7640695Z * [new branch] gh/xmfan/157/orig -> 
origin/gh/xmfan/157/orig 2025-03-14T05:31:37.7643557Z * [new branch] gh/xmfan/166/base -> origin/gh/xmfan/166/base 2025-03-14T05:31:37.7645283Z * [new branch] gh/xmfan/166/head -> origin/gh/xmfan/166/head 2025-03-14T05:31:37.7646994Z * [new branch] gh/xmfan/166/orig -> origin/gh/xmfan/166/orig 2025-03-14T05:31:37.7649202Z * [new branch] gh/xmfan/169/base -> origin/gh/xmfan/169/base 2025-03-14T05:31:37.7650950Z * [new branch] gh/xmfan/169/head -> origin/gh/xmfan/169/head 2025-03-14T05:31:37.7653135Z * [new branch] gh/xmfan/170/base -> origin/gh/xmfan/170/base 2025-03-14T05:31:37.7654760Z * [new branch] gh/xmfan/170/head -> origin/gh/xmfan/170/head 2025-03-14T05:31:37.7656926Z * [new branch] gh/xmfan/173/base -> origin/gh/xmfan/173/base 2025-03-14T05:31:37.7658528Z * [new branch] gh/xmfan/173/head -> origin/gh/xmfan/173/head 2025-03-14T05:31:37.7660215Z * [new branch] gh/xmfan/173/orig -> origin/gh/xmfan/173/orig 2025-03-14T05:31:37.7662437Z * [new branch] gh/xmfan/174/base -> origin/gh/xmfan/174/base 2025-03-14T05:31:37.7664091Z * [new branch] gh/xmfan/174/head -> origin/gh/xmfan/174/head 2025-03-14T05:31:37.7665687Z * [new branch] gh/xmfan/174/orig -> origin/gh/xmfan/174/orig 2025-03-14T05:31:37.7668383Z * [new branch] gh/xmfan/177/base -> origin/gh/xmfan/177/base 2025-03-14T05:31:37.7670233Z * [new branch] gh/xmfan/177/head -> origin/gh/xmfan/177/head 2025-03-14T05:31:37.7671780Z * [new branch] gh/xmfan/177/orig -> origin/gh/xmfan/177/orig 2025-03-14T05:31:37.7673993Z * [new branch] gh/xmfan/178/base -> origin/gh/xmfan/178/base 2025-03-14T05:31:37.7676146Z * [new branch] gh/xmfan/178/head -> origin/gh/xmfan/178/head 2025-03-14T05:31:37.7677718Z * [new branch] gh/xmfan/178/orig -> origin/gh/xmfan/178/orig 2025-03-14T05:31:37.7679990Z * [new branch] gh/xmfan/179/base -> origin/gh/xmfan/179/base 2025-03-14T05:31:37.7681122Z * [new branch] gh/xmfan/179/head -> origin/gh/xmfan/179/head 2025-03-14T05:31:37.7682911Z * [new branch] gh/xmfan/179/orig -> origin/gh/xmfan/179/orig 2025-03-14T05:31:37.7685236Z * [new branch] gh/xmfan/18/base -> origin/gh/xmfan/18/base 2025-03-14T05:31:37.7686877Z * [new branch] gh/xmfan/18/head -> origin/gh/xmfan/18/head 2025-03-14T05:31:37.7689542Z * [new branch] gh/xmfan/180/base -> origin/gh/xmfan/180/base 2025-03-14T05:31:37.7691152Z * [new branch] gh/xmfan/180/head -> origin/gh/xmfan/180/head 2025-03-14T05:31:37.7692797Z * [new branch] gh/xmfan/180/orig -> origin/gh/xmfan/180/orig 2025-03-14T05:31:37.7695175Z * [new branch] gh/xmfan/181/base -> origin/gh/xmfan/181/base 2025-03-14T05:31:37.7696841Z * [new branch] gh/xmfan/181/head -> origin/gh/xmfan/181/head 2025-03-14T05:31:37.7698448Z * [new branch] gh/xmfan/181/orig -> origin/gh/xmfan/181/orig 2025-03-14T05:31:37.7700808Z * [new branch] gh/xmfan/182/base -> origin/gh/xmfan/182/base 2025-03-14T05:31:37.7702408Z * [new branch] gh/xmfan/182/head -> origin/gh/xmfan/182/head 2025-03-14T05:31:37.7704293Z * [new branch] gh/xmfan/182/orig -> origin/gh/xmfan/182/orig 2025-03-14T05:31:37.7706574Z * [new branch] gh/xmfan/183/base -> origin/gh/xmfan/183/base 2025-03-14T05:31:37.7708176Z * [new branch] gh/xmfan/183/head -> origin/gh/xmfan/183/head 2025-03-14T05:31:37.7709884Z * [new branch] gh/xmfan/183/orig -> origin/gh/xmfan/183/orig 2025-03-14T05:31:37.7712112Z * [new branch] gh/xmfan/184/base -> origin/gh/xmfan/184/base 2025-03-14T05:31:37.7713778Z * [new branch] gh/xmfan/184/head -> origin/gh/xmfan/184/head 2025-03-14T05:31:37.7715473Z * [new branch] gh/xmfan/184/orig -> origin/gh/xmfan/184/orig 
2025-03-14T05:31:37.7718283Z * [new branch] gh/xmfan/185/base -> origin/gh/xmfan/185/base 2025-03-14T05:31:37.7719914Z * [new branch] gh/xmfan/185/head -> origin/gh/xmfan/185/head 2025-03-14T05:31:37.7721546Z * [new branch] gh/xmfan/185/orig -> origin/gh/xmfan/185/orig 2025-03-14T05:31:37.7723863Z * [new branch] gh/xmfan/186/base -> origin/gh/xmfan/186/base 2025-03-14T05:31:37.7725474Z * [new branch] gh/xmfan/186/head -> origin/gh/xmfan/186/head 2025-03-14T05:31:37.7727142Z * [new branch] gh/xmfan/186/orig -> origin/gh/xmfan/186/orig 2025-03-14T05:31:37.7729422Z * [new branch] gh/xmfan/187/base -> origin/gh/xmfan/187/base 2025-03-14T05:31:37.7731032Z * [new branch] gh/xmfan/187/head -> origin/gh/xmfan/187/head 2025-03-14T05:31:37.7732653Z * [new branch] gh/xmfan/187/orig -> origin/gh/xmfan/187/orig 2025-03-14T05:31:37.7734990Z * [new branch] gh/xmfan/188/base -> origin/gh/xmfan/188/base 2025-03-14T05:31:37.7736760Z * [new branch] gh/xmfan/188/head -> origin/gh/xmfan/188/head 2025-03-14T05:31:37.7738292Z * [new branch] gh/xmfan/188/orig -> origin/gh/xmfan/188/orig 2025-03-14T05:31:37.7740728Z * [new branch] gh/xmfan/189/base -> origin/gh/xmfan/189/base 2025-03-14T05:31:37.7742355Z * [new branch] gh/xmfan/189/head -> origin/gh/xmfan/189/head 2025-03-14T05:31:37.7744051Z * [new branch] gh/xmfan/189/orig -> origin/gh/xmfan/189/orig 2025-03-14T05:31:37.7746338Z * [new branch] gh/xmfan/190/base -> origin/gh/xmfan/190/base 2025-03-14T05:31:37.7747987Z * [new branch] gh/xmfan/190/head -> origin/gh/xmfan/190/head 2025-03-14T05:31:37.7749611Z * [new branch] gh/xmfan/190/orig -> origin/gh/xmfan/190/orig 2025-03-14T05:31:37.7751989Z * [new branch] gh/xmfan/191/base -> origin/gh/xmfan/191/base 2025-03-14T05:31:37.7753742Z * [new branch] gh/xmfan/191/head -> origin/gh/xmfan/191/head 2025-03-14T05:31:37.7755548Z * [new branch] gh/xmfan/191/orig -> origin/gh/xmfan/191/orig 2025-03-14T05:31:37.7757846Z * [new branch] gh/xmfan/192/base -> origin/gh/xmfan/192/base 2025-03-14T05:31:37.7759516Z * [new branch] gh/xmfan/192/head -> origin/gh/xmfan/192/head 2025-03-14T05:31:37.7761089Z * [new branch] gh/xmfan/192/orig -> origin/gh/xmfan/192/orig 2025-03-14T05:31:37.7763522Z * [new branch] gh/xmfan/193/base -> origin/gh/xmfan/193/base 2025-03-14T05:31:37.7765140Z * [new branch] gh/xmfan/193/head -> origin/gh/xmfan/193/head 2025-03-14T05:31:37.7766846Z * [new branch] gh/xmfan/193/orig -> origin/gh/xmfan/193/orig 2025-03-14T05:31:37.7769573Z * [new branch] gh/xmfan/194/base -> origin/gh/xmfan/194/base 2025-03-14T05:31:37.7771246Z * [new branch] gh/xmfan/194/head -> origin/gh/xmfan/194/head 2025-03-14T05:31:37.7772854Z * [new branch] gh/xmfan/194/orig -> origin/gh/xmfan/194/orig 2025-03-14T05:31:37.7775233Z * [new branch] gh/xmfan/195/base -> origin/gh/xmfan/195/base 2025-03-14T05:31:37.7776864Z * [new branch] gh/xmfan/195/head -> origin/gh/xmfan/195/head 2025-03-14T05:31:37.7778514Z * [new branch] gh/xmfan/195/orig -> origin/gh/xmfan/195/orig 2025-03-14T05:31:37.7780841Z * [new branch] gh/xmfan/196/base -> origin/gh/xmfan/196/base 2025-03-14T05:31:37.7782573Z * [new branch] gh/xmfan/196/head -> origin/gh/xmfan/196/head 2025-03-14T05:31:37.7784248Z * [new branch] gh/xmfan/196/orig -> origin/gh/xmfan/196/orig 2025-03-14T05:31:37.7786659Z * [new branch] gh/xmfan/197/base -> origin/gh/xmfan/197/base 2025-03-14T05:31:37.7788280Z * [new branch] gh/xmfan/197/head -> origin/gh/xmfan/197/head 2025-03-14T05:31:37.7789926Z * [new branch] gh/xmfan/197/orig -> origin/gh/xmfan/197/orig 2025-03-14T05:31:37.7792187Z * [new branch] 
gh/xmfan/198/base -> origin/gh/xmfan/198/base 2025-03-14T05:31:37.7793831Z * [new branch] gh/xmfan/198/head -> origin/gh/xmfan/198/head 2025-03-14T05:31:37.7795685Z * [new branch] gh/xmfan/198/orig -> origin/gh/xmfan/198/orig 2025-03-14T05:31:37.7797903Z * [new branch] gh/xmfan/199/base -> origin/gh/xmfan/199/base 2025-03-14T05:31:37.7799482Z * [new branch] gh/xmfan/199/head -> origin/gh/xmfan/199/head 2025-03-14T05:31:37.7801114Z * [new branch] gh/xmfan/199/orig -> origin/gh/xmfan/199/orig 2025-03-14T05:31:37.7803454Z * [new branch] gh/xmfan/200/base -> origin/gh/xmfan/200/base 2025-03-14T05:31:37.7805258Z * [new branch] gh/xmfan/200/head -> origin/gh/xmfan/200/head 2025-03-14T05:31:37.7806793Z * [new branch] gh/xmfan/200/orig -> origin/gh/xmfan/200/orig 2025-03-14T05:31:37.7809107Z * [new branch] gh/xmfan/201/base -> origin/gh/xmfan/201/base 2025-03-14T05:31:37.7810696Z * [new branch] gh/xmfan/201/head -> origin/gh/xmfan/201/head 2025-03-14T05:31:37.7812378Z * [new branch] gh/xmfan/201/orig -> origin/gh/xmfan/201/orig 2025-03-14T05:31:37.7814681Z * [new branch] gh/xmfan/202/base -> origin/gh/xmfan/202/base 2025-03-14T05:31:37.7816290Z * [new branch] gh/xmfan/202/head -> origin/gh/xmfan/202/head 2025-03-14T05:31:37.7817923Z * [new branch] gh/xmfan/202/orig -> origin/gh/xmfan/202/orig 2025-03-14T05:31:37.7821269Z * [new branch] gh/xuanzhang816/10/base -> origin/gh/xuanzhang816/10/base 2025-03-14T05:31:37.7822884Z * [new branch] gh/xuanzhang816/10/head -> origin/gh/xuanzhang816/10/head 2025-03-14T05:31:37.7824557Z * [new branch] gh/xuanzhang816/10/orig -> origin/gh/xuanzhang816/10/orig 2025-03-14T05:31:37.7826938Z * [new branch] gh/xuanzhang816/11/base -> origin/gh/xuanzhang816/11/base 2025-03-14T05:31:37.7828573Z * [new branch] gh/xuanzhang816/11/head -> origin/gh/xuanzhang816/11/head 2025-03-14T05:31:37.7830179Z * [new branch] gh/xuanzhang816/11/orig -> origin/gh/xuanzhang816/11/orig 2025-03-14T05:31:37.7832629Z * [new branch] gh/xuanzhang816/13/base -> origin/gh/xuanzhang816/13/base 2025-03-14T05:31:37.7834332Z * [new branch] gh/xuanzhang816/13/head -> origin/gh/xuanzhang816/13/head 2025-03-14T05:31:37.7836036Z * [new branch] gh/xuanzhang816/13/orig -> origin/gh/xuanzhang816/13/orig 2025-03-14T05:31:37.7838833Z * [new branch] gh/xuhancn/1/base -> origin/gh/xuhancn/1/base 2025-03-14T05:31:37.7840465Z * [new branch] gh/xuhancn/1/head -> origin/gh/xuhancn/1/head 2025-03-14T05:31:37.7842613Z * [new branch] gh/xuhancn/2/base -> origin/gh/xuhancn/2/base 2025-03-14T05:31:37.7844199Z * [new branch] gh/xuhancn/2/head -> origin/gh/xuhancn/2/head 2025-03-14T05:31:37.7846379Z * [new branch] gh/xuhancn/3/base -> origin/gh/xuhancn/3/base 2025-03-14T05:31:37.7847985Z * [new branch] gh/xuhancn/3/head -> origin/gh/xuhancn/3/head 2025-03-14T05:31:37.7850150Z * [new branch] gh/xuhancn/4/base -> origin/gh/xuhancn/4/base 2025-03-14T05:31:37.7851809Z * [new branch] gh/xuhancn/4/head -> origin/gh/xuhancn/4/head 2025-03-14T05:31:37.7853900Z * [new branch] gh/xuhancn/5/base -> origin/gh/xuhancn/5/base 2025-03-14T05:31:37.7855590Z * [new branch] gh/xuhancn/5/head -> origin/gh/xuhancn/5/head 2025-03-14T05:31:37.7857724Z * [new branch] gh/xuhancn/6/base -> origin/gh/xuhancn/6/base 2025-03-14T05:31:37.7859295Z * [new branch] gh/xuhancn/6/head -> origin/gh/xuhancn/6/head 2025-03-14T05:31:37.7861389Z * [new branch] gh/xuhancn/7/base -> origin/gh/xuhancn/7/base 2025-03-14T05:31:37.7862972Z * [new branch] gh/xuhancn/7/head -> origin/gh/xuhancn/7/head 2025-03-14T05:31:37.7865750Z * [new branch] gh/xunnanxu/1/base -> 
origin/gh/xunnanxu/1/base 2025-03-14T05:31:37.7867247Z * [new branch] gh/xunnanxu/1/head -> origin/gh/xunnanxu/1/head 2025-03-14T05:31:37.7873009Z * [new branch] gh/xunnanxu/1/orig -> origin/gh/xunnanxu/1/orig 2025-03-14T05:31:37.7875529Z * [new branch] gh/xunnanxu/2/base -> origin/gh/xunnanxu/2/base 2025-03-14T05:31:37.7877263Z * [new branch] gh/xunnanxu/2/head -> origin/gh/xunnanxu/2/head 2025-03-14T05:31:37.7878771Z * [new branch] gh/xunnanxu/2/orig -> origin/gh/xunnanxu/2/orig 2025-03-14T05:31:37.7880976Z * [new branch] gh/xunnanxu/3/base -> origin/gh/xunnanxu/3/base 2025-03-14T05:31:37.7882611Z * [new branch] gh/xunnanxu/3/head -> origin/gh/xunnanxu/3/head 2025-03-14T05:31:37.7884311Z * [new branch] gh/xunnanxu/3/orig -> origin/gh/xunnanxu/3/orig 2025-03-14T05:31:37.7886408Z * [new branch] gh/xunnanxu/4/base -> origin/gh/xunnanxu/4/base 2025-03-14T05:31:37.7888035Z * [new branch] gh/xunnanxu/4/head -> origin/gh/xunnanxu/4/head 2025-03-14T05:31:37.7889739Z * [new branch] gh/xunnanxu/4/orig -> origin/gh/xunnanxu/4/orig 2025-03-14T05:31:37.7892596Z * [new branch] gh/yanbing-j/11/base -> origin/gh/yanbing-j/11/base 2025-03-14T05:31:37.7894257Z * [new branch] gh/yanbing-j/11/head -> origin/gh/yanbing-j/11/head 2025-03-14T05:31:37.7895847Z * [new branch] gh/yanbing-j/11/orig -> origin/gh/yanbing-j/11/orig 2025-03-14T05:31:37.7898164Z * [new branch] gh/yanbing-j/12/base -> origin/gh/yanbing-j/12/base 2025-03-14T05:31:37.7899845Z * [new branch] gh/yanbing-j/12/head -> origin/gh/yanbing-j/12/head 2025-03-14T05:31:37.7901426Z * [new branch] gh/yanbing-j/12/orig -> origin/gh/yanbing-j/12/orig 2025-03-14T05:31:37.7903684Z * [new branch] gh/yanbing-j/13/base -> origin/gh/yanbing-j/13/base 2025-03-14T05:31:37.7905407Z * [new branch] gh/yanbing-j/13/head -> origin/gh/yanbing-j/13/head 2025-03-14T05:31:37.7907058Z * [new branch] gh/yanbing-j/13/orig -> origin/gh/yanbing-j/13/orig 2025-03-14T05:31:37.7909267Z * [new branch] gh/yanbing-j/14/base -> origin/gh/yanbing-j/14/base 2025-03-14T05:31:37.7910896Z * [new branch] gh/yanbing-j/14/head -> origin/gh/yanbing-j/14/head 2025-03-14T05:31:37.7913043Z * [new branch] gh/yanbing-j/14/orig -> origin/gh/yanbing-j/14/orig 2025-03-14T05:31:37.7915857Z * [new branch] gh/yanbing-j/15/base -> origin/gh/yanbing-j/15/base 2025-03-14T05:31:37.7917454Z * [new branch] gh/yanbing-j/15/head -> origin/gh/yanbing-j/15/head 2025-03-14T05:31:37.7919139Z * [new branch] gh/yanbing-j/15/orig -> origin/gh/yanbing-j/15/orig 2025-03-14T05:31:37.7921452Z * [new branch] gh/yanbing-j/18/base -> origin/gh/yanbing-j/18/base 2025-03-14T05:31:37.7923157Z * [new branch] gh/yanbing-j/18/head -> origin/gh/yanbing-j/18/head 2025-03-14T05:31:37.7924802Z * [new branch] gh/yanbing-j/18/orig -> origin/gh/yanbing-j/18/orig 2025-03-14T05:31:37.7927040Z * [new branch] gh/yanbing-j/19/base -> origin/gh/yanbing-j/19/base 2025-03-14T05:31:37.7928733Z * [new branch] gh/yanbing-j/19/head -> origin/gh/yanbing-j/19/head 2025-03-14T05:31:37.7930452Z * [new branch] gh/yanbing-j/19/orig -> origin/gh/yanbing-j/19/orig 2025-03-14T05:31:37.7932734Z * [new branch] gh/yanbing-j/20/base -> origin/gh/yanbing-j/20/base 2025-03-14T05:31:37.7934406Z * [new branch] gh/yanbing-j/20/head -> origin/gh/yanbing-j/20/head 2025-03-14T05:31:37.7936077Z * [new branch] gh/yanbing-j/20/orig -> origin/gh/yanbing-j/20/orig 2025-03-14T05:31:37.7938409Z * [new branch] gh/yanbing-j/21/base -> origin/gh/yanbing-j/21/base 2025-03-14T05:31:37.7940072Z * [new branch] gh/yanbing-j/21/head -> origin/gh/yanbing-j/21/head 
2025-03-14T05:31:37.7942380Z * [new branch] gh/yanbing-j/22/base -> origin/gh/yanbing-j/22/base 2025-03-14T05:31:37.7944188Z * [new branch] gh/yanbing-j/22/head -> origin/gh/yanbing-j/22/head 2025-03-14T05:31:37.7945754Z * [new branch] gh/yanbing-j/22/orig -> origin/gh/yanbing-j/22/orig 2025-03-14T05:31:37.7947971Z * [new branch] gh/yanbing-j/23/base -> origin/gh/yanbing-j/23/base 2025-03-14T05:31:37.7949630Z * [new branch] gh/yanbing-j/23/head -> origin/gh/yanbing-j/23/head 2025-03-14T05:31:37.7951259Z * [new branch] gh/yanbing-j/23/orig -> origin/gh/yanbing-j/23/orig 2025-03-14T05:31:37.7953949Z * [new branch] gh/yanbing-j/24/base -> origin/gh/yanbing-j/24/base 2025-03-14T05:31:37.7955759Z * [new branch] gh/yanbing-j/24/head -> origin/gh/yanbing-j/24/head 2025-03-14T05:31:37.7957398Z * [new branch] gh/yanbing-j/24/orig -> origin/gh/yanbing-j/24/orig 2025-03-14T05:31:37.7960125Z * [new branch] gh/yanbing-j/25/base -> origin/gh/yanbing-j/25/base 2025-03-14T05:31:37.7961747Z * [new branch] gh/yanbing-j/25/head -> origin/gh/yanbing-j/25/head 2025-03-14T05:31:37.7963353Z * [new branch] gh/yanbing-j/25/orig -> origin/gh/yanbing-j/25/orig 2025-03-14T05:31:37.7965680Z * [new branch] gh/yanbing-j/26/base -> origin/gh/yanbing-j/26/base 2025-03-14T05:31:37.7967382Z * [new branch] gh/yanbing-j/26/head -> origin/gh/yanbing-j/26/head 2025-03-14T05:31:37.7969261Z * [new branch] gh/yanbing-j/26/orig -> origin/gh/yanbing-j/26/orig 2025-03-14T05:31:37.7971594Z * [new branch] gh/yanbing-j/28/base -> origin/gh/yanbing-j/28/base 2025-03-14T05:31:37.7973767Z * [new branch] gh/yanbing-j/28/head -> origin/gh/yanbing-j/28/head 2025-03-14T05:31:37.7975852Z * [new branch] gh/yanbing-j/28/orig -> origin/gh/yanbing-j/28/orig 2025-03-14T05:31:37.7978576Z * [new branch] gh/yanbing-j/34/base -> origin/gh/yanbing-j/34/base 2025-03-14T05:31:37.7979812Z * [new branch] gh/yanbing-j/34/head -> origin/gh/yanbing-j/34/head 2025-03-14T05:31:37.7981431Z * [new branch] gh/yanbing-j/34/orig -> origin/gh/yanbing-j/34/orig 2025-03-14T05:31:37.7983747Z * [new branch] gh/yanbing-j/35/base -> origin/gh/yanbing-j/35/base 2025-03-14T05:31:37.7985422Z * [new branch] gh/yanbing-j/35/head -> origin/gh/yanbing-j/35/head 2025-03-14T05:31:37.7987055Z * [new branch] gh/yanbing-j/35/orig -> origin/gh/yanbing-j/35/orig 2025-03-14T05:31:37.7989312Z * [new branch] gh/yanbing-j/36/base -> origin/gh/yanbing-j/36/base 2025-03-14T05:31:37.7991122Z * [new branch] gh/yanbing-j/36/head -> origin/gh/yanbing-j/36/head 2025-03-14T05:31:37.7992692Z * [new branch] gh/yanbing-j/36/orig -> origin/gh/yanbing-j/36/orig 2025-03-14T05:31:37.7995628Z * [new branch] gh/yanbing-j/37/base -> origin/gh/yanbing-j/37/base 2025-03-14T05:31:37.7997229Z * [new branch] gh/yanbing-j/37/head -> origin/gh/yanbing-j/37/head 2025-03-14T05:31:37.7998816Z * [new branch] gh/yanbing-j/37/orig -> origin/gh/yanbing-j/37/orig 2025-03-14T05:31:37.8001538Z * [new branch] gh/yanboliang/62/base -> origin/gh/yanboliang/62/base 2025-03-14T05:31:37.8003494Z * [new branch] gh/yanboliang/62/head -> origin/gh/yanboliang/62/head 2025-03-14T05:31:37.8005203Z * [new branch] gh/yanboliang/62/orig -> origin/gh/yanboliang/62/orig 2025-03-14T05:31:37.8008085Z * [new branch] gh/ydwu4/168/base -> origin/gh/ydwu4/168/base 2025-03-14T05:31:37.8009755Z * [new branch] gh/ydwu4/168/head -> origin/gh/ydwu4/168/head 2025-03-14T05:31:37.8012077Z * [new branch] gh/ydwu4/168/orig -> origin/gh/ydwu4/168/orig 2025-03-14T05:31:37.8014421Z * [new branch] gh/ydwu4/179/base -> origin/gh/ydwu4/179/base 
2025-03-14T05:31:37.8016194Z * [new branch] gh/ydwu4/179/head -> origin/gh/ydwu4/179/head 2025-03-14T05:31:37.8017672Z * [new branch] gh/ydwu4/179/orig -> origin/gh/ydwu4/179/orig 2025-03-14T05:31:37.8020182Z * [new branch] gh/ydwu4/180/base -> origin/gh/ydwu4/180/base 2025-03-14T05:31:37.8021929Z * [new branch] gh/ydwu4/180/head -> origin/gh/ydwu4/180/head 2025-03-14T05:31:37.8023605Z * [new branch] gh/ydwu4/180/orig -> origin/gh/ydwu4/180/orig 2025-03-14T05:31:37.8025950Z * [new branch] gh/ydwu4/194/base -> origin/gh/ydwu4/194/base 2025-03-14T05:31:37.8027536Z * [new branch] gh/ydwu4/194/head -> origin/gh/ydwu4/194/head 2025-03-14T05:31:37.8029150Z * [new branch] gh/ydwu4/194/orig -> origin/gh/ydwu4/194/orig 2025-03-14T05:31:37.8031651Z * [new branch] gh/ydwu4/201/base -> origin/gh/ydwu4/201/base 2025-03-14T05:31:37.8033303Z * [new branch] gh/ydwu4/201/head -> origin/gh/ydwu4/201/head 2025-03-14T05:31:37.8035175Z * [new branch] gh/ydwu4/201/orig -> origin/gh/ydwu4/201/orig 2025-03-14T05:31:37.8037668Z * [new branch] gh/ydwu4/208/base -> origin/gh/ydwu4/208/base 2025-03-14T05:31:37.8039350Z * [new branch] gh/ydwu4/208/head -> origin/gh/ydwu4/208/head 2025-03-14T05:31:37.8041034Z * [new branch] gh/ydwu4/208/orig -> origin/gh/ydwu4/208/orig 2025-03-14T05:31:37.8043300Z * [new branch] gh/ydwu4/209/base -> origin/gh/ydwu4/209/base 2025-03-14T05:31:37.8044938Z * [new branch] gh/ydwu4/209/head -> origin/gh/ydwu4/209/head 2025-03-14T05:31:37.8046565Z * [new branch] gh/ydwu4/209/orig -> origin/gh/ydwu4/209/orig 2025-03-14T05:31:37.8048819Z * [new branch] gh/ydwu4/210/base -> origin/gh/ydwu4/210/base 2025-03-14T05:31:37.8050507Z * [new branch] gh/ydwu4/210/head -> origin/gh/ydwu4/210/head 2025-03-14T05:31:37.8052181Z * [new branch] gh/ydwu4/210/orig -> origin/gh/ydwu4/210/orig 2025-03-14T05:31:37.8054446Z * [new branch] gh/ydwu4/211/base -> origin/gh/ydwu4/211/base 2025-03-14T05:31:37.8056050Z * [new branch] gh/ydwu4/211/head -> origin/gh/ydwu4/211/head 2025-03-14T05:31:37.8057829Z * [new branch] gh/ydwu4/211/orig -> origin/gh/ydwu4/211/orig 2025-03-14T05:31:37.8060170Z * [new branch] gh/ydwu4/212/base -> origin/gh/ydwu4/212/base 2025-03-14T05:31:37.8061801Z * [new branch] gh/ydwu4/212/head -> origin/gh/ydwu4/212/head 2025-03-14T05:31:37.8063411Z * [new branch] gh/ydwu4/212/orig -> origin/gh/ydwu4/212/orig 2025-03-14T05:31:37.8065667Z * [new branch] gh/ydwu4/213/base -> origin/gh/ydwu4/213/base 2025-03-14T05:31:37.8067416Z * [new branch] gh/ydwu4/213/head -> origin/gh/ydwu4/213/head 2025-03-14T05:31:37.8071013Z * [new branch] gh/ydwu4/213/orig -> origin/gh/ydwu4/213/orig 2025-03-14T05:31:37.8073341Z * [new branch] gh/ydwu4/214/base -> origin/gh/ydwu4/214/base 2025-03-14T05:31:37.8075154Z * [new branch] gh/ydwu4/214/head -> origin/gh/ydwu4/214/head 2025-03-14T05:31:37.8076822Z * [new branch] gh/ydwu4/214/orig -> origin/gh/ydwu4/214/orig 2025-03-14T05:31:37.8079174Z * [new branch] gh/ydwu4/215/base -> origin/gh/ydwu4/215/base 2025-03-14T05:31:37.8080918Z * [new branch] gh/ydwu4/215/head -> origin/gh/ydwu4/215/head 2025-03-14T05:31:37.8082611Z * [new branch] gh/ydwu4/215/orig -> origin/gh/ydwu4/215/orig 2025-03-14T05:31:37.8085355Z * [new branch] gh/ydwu4/216/base -> origin/gh/ydwu4/216/base 2025-03-14T05:31:37.8087109Z * [new branch] gh/ydwu4/216/head -> origin/gh/ydwu4/216/head 2025-03-14T05:31:37.8088779Z * [new branch] gh/ydwu4/216/orig -> origin/gh/ydwu4/216/orig 2025-03-14T05:31:37.8091015Z * [new branch] gh/ydwu4/217/base -> origin/gh/ydwu4/217/base 2025-03-14T05:31:37.8092855Z * [new branch] 
gh/ydwu4/217/head -> origin/gh/ydwu4/217/head 2025-03-14T05:31:37.8094526Z * [new branch] gh/ydwu4/217/orig -> origin/gh/ydwu4/217/orig 2025-03-14T05:31:37.8096800Z * [new branch] gh/ydwu4/218/base -> origin/gh/ydwu4/218/base 2025-03-14T05:31:37.8098494Z * [new branch] gh/ydwu4/218/head -> origin/gh/ydwu4/218/head 2025-03-14T05:31:37.8100144Z * [new branch] gh/ydwu4/218/orig -> origin/gh/ydwu4/218/orig 2025-03-14T05:31:37.8102984Z * [new branch] gh/ydwu4/219/base -> origin/gh/ydwu4/219/base 2025-03-14T05:31:37.8104781Z * [new branch] gh/ydwu4/219/head -> origin/gh/ydwu4/219/head 2025-03-14T05:31:37.8106526Z * [new branch] gh/ydwu4/219/orig -> origin/gh/ydwu4/219/orig 2025-03-14T05:31:37.8108922Z * [new branch] gh/ydwu4/220/base -> origin/gh/ydwu4/220/base 2025-03-14T05:31:37.8110623Z * [new branch] gh/ydwu4/220/head -> origin/gh/ydwu4/220/head 2025-03-14T05:31:37.8112265Z * [new branch] gh/ydwu4/220/orig -> origin/gh/ydwu4/220/orig 2025-03-14T05:31:37.8114749Z * [new branch] gh/ydwu4/221/base -> origin/gh/ydwu4/221/base 2025-03-14T05:31:37.8116413Z * [new branch] gh/ydwu4/221/head -> origin/gh/ydwu4/221/head 2025-03-14T05:31:37.8118058Z * [new branch] gh/ydwu4/221/orig -> origin/gh/ydwu4/221/orig 2025-03-14T05:31:37.8126275Z * [new branch] gh/ydwu4/222/base -> origin/gh/ydwu4/222/base 2025-03-14T05:31:37.8126833Z * [new branch] gh/ydwu4/222/head -> origin/gh/ydwu4/222/head 2025-03-14T05:31:37.8127061Z * [new branch] gh/ydwu4/222/orig -> origin/gh/ydwu4/222/orig 2025-03-14T05:31:37.8127261Z * [new branch] gh/ydwu4/7/base -> origin/gh/ydwu4/7/base 2025-03-14T05:31:37.8127550Z * [new branch] gh/ydwu4/7/head -> origin/gh/ydwu4/7/head 2025-03-14T05:31:37.8129507Z * [new branch] gh/ydwu4/7/orig -> origin/gh/ydwu4/7/orig 2025-03-14T05:31:37.8132251Z * [new branch] gh/yf225/133/base -> origin/gh/yf225/133/base 2025-03-14T05:31:37.8133932Z * [new branch] gh/yf225/133/head -> origin/gh/yf225/133/head 2025-03-14T05:31:37.8136481Z * [new branch] gh/yf225/158/base -> origin/gh/yf225/158/base 2025-03-14T05:31:37.8138155Z * [new branch] gh/yf225/158/head -> origin/gh/yf225/158/head 2025-03-14T05:31:37.8139831Z * [new branch] gh/yf225/158/orig -> origin/gh/yf225/158/orig 2025-03-14T05:31:37.8142146Z * [new branch] gh/yf225/159/base -> origin/gh/yf225/159/base 2025-03-14T05:31:37.8143857Z * [new branch] gh/yf225/159/head -> origin/gh/yf225/159/head 2025-03-14T05:31:37.8145549Z * [new branch] gh/yf225/159/orig -> origin/gh/yf225/159/orig 2025-03-14T05:31:37.8147957Z * [new branch] gh/yf225/160/base -> origin/gh/yf225/160/base 2025-03-14T05:31:37.8149584Z * [new branch] gh/yf225/160/head -> origin/gh/yf225/160/head 2025-03-14T05:31:37.8151355Z * [new branch] gh/yf225/160/orig -> origin/gh/yf225/160/orig 2025-03-14T05:31:37.8153543Z * [new branch] gh/yf225/162/base -> origin/gh/yf225/162/base 2025-03-14T05:31:37.8155374Z * [new branch] gh/yf225/162/head -> origin/gh/yf225/162/head 2025-03-14T05:31:37.8156971Z * [new branch] gh/yf225/162/orig -> origin/gh/yf225/162/orig 2025-03-14T05:31:37.8159372Z * [new branch] gh/yf225/163/base -> origin/gh/yf225/163/base 2025-03-14T05:31:37.8161040Z * [new branch] gh/yf225/163/head -> origin/gh/yf225/163/head 2025-03-14T05:31:37.8162707Z * [new branch] gh/yf225/163/orig -> origin/gh/yf225/163/orig 2025-03-14T05:31:37.8165017Z * [new branch] gh/yf225/164/base -> origin/gh/yf225/164/base 2025-03-14T05:31:37.8166712Z * [new branch] gh/yf225/164/head -> origin/gh/yf225/164/head 2025-03-14T05:31:37.8168608Z * [new branch] gh/yf225/164/orig -> origin/gh/yf225/164/orig 
2025-03-14T05:31:37.8171022Z * [new branch] gh/yf225/85/base -> origin/gh/yf225/85/base 2025-03-14T05:31:37.8172743Z * [new branch] gh/yf225/85/head -> origin/gh/yf225/85/head 2025-03-14T05:31:37.8174473Z * [new branch] gh/yf225/85/orig -> origin/gh/yf225/85/orig 2025-03-14T05:31:37.8176789Z * [new branch] gh/yf225/93/base -> origin/gh/yf225/93/base 2025-03-14T05:31:37.8178373Z * [new branch] gh/yf225/93/head -> origin/gh/yf225/93/head 2025-03-14T05:31:37.8181656Z * [new branch] gh/yifuwang/152/base -> origin/gh/yifuwang/152/base 2025-03-14T05:31:37.8183542Z * [new branch] gh/yifuwang/152/head -> origin/gh/yifuwang/152/head 2025-03-14T05:31:37.8185290Z * [new branch] gh/yifuwang/152/orig -> origin/gh/yifuwang/152/orig 2025-03-14T05:31:37.8187541Z * [new branch] gh/yifuwang/174/base -> origin/gh/yifuwang/174/base 2025-03-14T05:31:37.8189184Z * [new branch] gh/yifuwang/174/head -> origin/gh/yifuwang/174/head 2025-03-14T05:31:37.8191378Z * [new branch] gh/yifuwang/174/orig -> origin/gh/yifuwang/174/orig 2025-03-14T05:31:37.8193732Z * [new branch] gh/yifuwang/185/base -> origin/gh/yifuwang/185/base 2025-03-14T05:31:37.8195504Z * [new branch] gh/yifuwang/185/head -> origin/gh/yifuwang/185/head 2025-03-14T05:31:37.8197326Z * [new branch] gh/yifuwang/185/orig -> origin/gh/yifuwang/185/orig 2025-03-14T05:31:37.8199429Z * [new branch] gh/yifuwang/186/base -> origin/gh/yifuwang/186/base 2025-03-14T05:31:37.8201084Z * [new branch] gh/yifuwang/186/head -> origin/gh/yifuwang/186/head 2025-03-14T05:31:37.8202689Z * [new branch] gh/yifuwang/186/orig -> origin/gh/yifuwang/186/orig 2025-03-14T05:31:37.8204990Z * [new branch] gh/yifuwang/187/base -> origin/gh/yifuwang/187/base 2025-03-14T05:31:37.8206577Z * [new branch] gh/yifuwang/187/head -> origin/gh/yifuwang/187/head 2025-03-14T05:31:37.8208281Z * [new branch] gh/yifuwang/187/orig -> origin/gh/yifuwang/187/orig 2025-03-14T05:31:37.8210483Z * [new branch] gh/yifuwang/188/base -> origin/gh/yifuwang/188/base 2025-03-14T05:31:37.8212142Z * [new branch] gh/yifuwang/188/head -> origin/gh/yifuwang/188/head 2025-03-14T05:31:37.8213787Z * [new branch] gh/yifuwang/188/orig -> origin/gh/yifuwang/188/orig 2025-03-14T05:31:37.8215931Z * [new branch] gh/yifuwang/189/base -> origin/gh/yifuwang/189/base 2025-03-14T05:31:37.8217511Z * [new branch] gh/yifuwang/189/head -> origin/gh/yifuwang/189/head 2025-03-14T05:31:37.8219228Z * [new branch] gh/yifuwang/189/orig -> origin/gh/yifuwang/189/orig 2025-03-14T05:31:37.8221352Z * [new branch] gh/yifuwang/190/base -> origin/gh/yifuwang/190/base 2025-03-14T05:31:37.8223009Z * [new branch] gh/yifuwang/190/head -> origin/gh/yifuwang/190/head 2025-03-14T05:31:37.8224892Z * [new branch] gh/yifuwang/190/orig -> origin/gh/yifuwang/190/orig 2025-03-14T05:31:37.8226859Z * [new branch] gh/yifuwang/191/base -> origin/gh/yifuwang/191/base 2025-03-14T05:31:37.8228549Z * [new branch] gh/yifuwang/191/head -> origin/gh/yifuwang/191/head 2025-03-14T05:31:37.8230163Z * [new branch] gh/yifuwang/191/orig -> origin/gh/yifuwang/191/orig 2025-03-14T05:31:37.8232273Z * [new branch] gh/yifuwang/192/base -> origin/gh/yifuwang/192/base 2025-03-14T05:31:37.8233883Z * [new branch] gh/yifuwang/192/head -> origin/gh/yifuwang/192/head 2025-03-14T05:31:37.8235741Z * [new branch] gh/yifuwang/192/orig -> origin/gh/yifuwang/192/orig 2025-03-14T05:31:37.8238012Z * [new branch] gh/yifuwang/194/base -> origin/gh/yifuwang/194/base 2025-03-14T05:31:37.8239645Z * [new branch] gh/yifuwang/194/head -> origin/gh/yifuwang/194/head 2025-03-14T05:31:37.8241388Z * [new 
branch] gh/yifuwang/194/orig -> origin/gh/yifuwang/194/orig 2025-03-14T05:31:37.8243523Z * [new branch] gh/yifuwang/195/base -> origin/gh/yifuwang/195/base 2025-03-14T05:31:37.8245252Z * [new branch] gh/yifuwang/195/head -> origin/gh/yifuwang/195/head 2025-03-14T05:31:37.8246893Z * [new branch] gh/yifuwang/195/orig -> origin/gh/yifuwang/195/orig 2025-03-14T05:31:37.8249129Z * [new branch] gh/yifuwang/196/base -> origin/gh/yifuwang/196/base 2025-03-14T05:31:37.8251290Z * [new branch] gh/yifuwang/196/head -> origin/gh/yifuwang/196/head 2025-03-14T05:31:37.8253516Z * [new branch] gh/yifuwang/196/orig -> origin/gh/yifuwang/196/orig 2025-03-14T05:31:37.8255969Z * [new branch] gh/yiming0416/1/base -> origin/gh/yiming0416/1/base 2025-03-14T05:31:37.8257641Z * [new branch] gh/yiming0416/1/head -> origin/gh/yiming0416/1/head 2025-03-14T05:31:37.8259750Z * [new branch] gh/yiming0416/2/base -> origin/gh/yiming0416/2/base 2025-03-14T05:31:37.8261299Z * [new branch] gh/yiming0416/2/head -> origin/gh/yiming0416/2/head 2025-03-14T05:31:37.8264108Z * [new branch] gh/ysiraichi/78/base -> origin/gh/ysiraichi/78/base 2025-03-14T05:31:37.8265823Z * [new branch] gh/ysiraichi/78/head -> origin/gh/ysiraichi/78/head 2025-03-14T05:31:37.8267609Z * [new branch] gh/ysiraichi/78/orig -> origin/gh/ysiraichi/78/orig 2025-03-14T05:31:37.8270233Z * [new branch] gh/ysiraichi/79/base -> origin/gh/ysiraichi/79/base 2025-03-14T05:31:37.8271924Z * [new branch] gh/ysiraichi/79/head -> origin/gh/ysiraichi/79/head 2025-03-14T05:31:37.8273678Z * [new branch] gh/ysiraichi/79/orig -> origin/gh/ysiraichi/79/orig 2025-03-14T05:31:37.8276104Z * [new branch] gh/ysiraichi/80/base -> origin/gh/ysiraichi/80/base 2025-03-14T05:31:37.8277733Z * [new branch] gh/ysiraichi/80/head -> origin/gh/ysiraichi/80/head 2025-03-14T05:31:37.8279532Z * [new branch] gh/ysiraichi/80/orig -> origin/gh/ysiraichi/80/orig 2025-03-14T05:31:37.8281806Z * [new branch] gh/ysiraichi/81/base -> origin/gh/ysiraichi/81/base 2025-03-14T05:31:37.8283417Z * [new branch] gh/ysiraichi/81/head -> origin/gh/ysiraichi/81/head 2025-03-14T05:31:37.8285185Z * [new branch] gh/ysiraichi/81/orig -> origin/gh/ysiraichi/81/orig 2025-03-14T05:31:37.8287426Z * [new branch] gh/ysiraichi/82/base -> origin/gh/ysiraichi/82/base 2025-03-14T05:31:37.8289045Z * [new branch] gh/ysiraichi/82/head -> origin/gh/ysiraichi/82/head 2025-03-14T05:31:37.8290761Z * [new branch] gh/ysiraichi/82/orig -> origin/gh/ysiraichi/82/orig 2025-03-14T05:31:37.8293164Z * [new branch] gh/ysiraichi/83/base -> origin/gh/ysiraichi/83/base 2025-03-14T05:31:37.8294755Z * [new branch] gh/ysiraichi/83/head -> origin/gh/ysiraichi/83/head 2025-03-14T05:31:37.8296895Z * [new branch] gh/ysiraichi/83/orig -> origin/gh/ysiraichi/83/orig 2025-03-14T05:31:37.8299725Z * [new branch] gh/zhuhaozhe/28/base -> origin/gh/zhuhaozhe/28/base 2025-03-14T05:31:37.8301732Z * [new branch] gh/zhuhaozhe/28/head -> origin/gh/zhuhaozhe/28/head 2025-03-14T05:31:37.8303375Z * [new branch] gh/zhuhaozhe/28/orig -> origin/gh/zhuhaozhe/28/orig 2025-03-14T05:31:37.8305633Z * [new branch] gh/zhuhaozhe/29/base -> origin/gh/zhuhaozhe/29/base 2025-03-14T05:31:37.8307470Z * [new branch] gh/zhuhaozhe/29/head -> origin/gh/zhuhaozhe/29/head 2025-03-14T05:31:37.8309134Z * [new branch] gh/zhuhaozhe/29/orig -> origin/gh/zhuhaozhe/29/orig 2025-03-14T05:31:37.8311429Z * [new branch] gh/zhuhaozhe/31/base -> origin/gh/zhuhaozhe/31/base 2025-03-14T05:31:37.8313071Z * [new branch] gh/zhuhaozhe/31/head -> origin/gh/zhuhaozhe/31/head 2025-03-14T05:31:37.8314927Z * [new 
branch] gh/zhuhaozhe/31/orig -> origin/gh/zhuhaozhe/31/orig 2025-03-14T05:31:37.8317109Z * [new branch] gh/zhuhaozhe/32/base -> origin/gh/zhuhaozhe/32/base 2025-03-14T05:31:37.8319223Z * [new branch] gh/zhuhaozhe/32/head -> origin/gh/zhuhaozhe/32/head 2025-03-14T05:31:37.8320877Z * [new branch] gh/zhuhaozhe/32/orig -> origin/gh/zhuhaozhe/32/orig 2025-03-14T05:31:37.8323119Z * [new branch] gh/zhuhaozhe/33/base -> origin/gh/zhuhaozhe/33/base 2025-03-14T05:31:37.8324810Z * [new branch] gh/zhuhaozhe/33/head -> origin/gh/zhuhaozhe/33/head 2025-03-14T05:31:37.8326409Z * [new branch] gh/zhuhaozhe/33/orig -> origin/gh/zhuhaozhe/33/orig 2025-03-14T05:31:37.8329920Z * [new branch] gh/zou3519/1106/base -> origin/gh/zou3519/1106/base 2025-03-14T05:31:37.8331735Z * [new branch] gh/zou3519/1106/head -> origin/gh/zou3519/1106/head 2025-03-14T05:31:37.8333528Z * [new branch] gh/zou3519/1106/orig -> origin/gh/zou3519/1106/orig 2025-03-14T05:31:37.8335849Z * [new branch] gh/zou3519/1107/base -> origin/gh/zou3519/1107/base 2025-03-14T05:31:37.8337835Z * [new branch] gh/zou3519/1107/head -> origin/gh/zou3519/1107/head 2025-03-14T05:31:37.8339288Z * [new branch] gh/zou3519/1107/orig -> origin/gh/zou3519/1107/orig 2025-03-14T05:31:37.8341689Z * [new branch] gh/zou3519/1108/base -> origin/gh/zou3519/1108/base 2025-03-14T05:31:37.8343372Z * [new branch] gh/zou3519/1108/head -> origin/gh/zou3519/1108/head 2025-03-14T05:31:37.8345126Z * [new branch] gh/zou3519/1108/orig -> origin/gh/zou3519/1108/orig 2025-03-14T05:31:37.8347671Z * [new branch] gh/zou3519/1109/base -> origin/gh/zou3519/1109/base 2025-03-14T05:31:37.8349294Z * [new branch] gh/zou3519/1109/head -> origin/gh/zou3519/1109/head 2025-03-14T05:31:37.8351013Z * [new branch] gh/zou3519/1109/orig -> origin/gh/zou3519/1109/orig 2025-03-14T05:31:37.8353582Z * [new branch] gh/zou3519/1110/base -> origin/gh/zou3519/1110/base 2025-03-14T05:31:37.8355985Z * [new branch] gh/zou3519/1110/head -> origin/gh/zou3519/1110/head 2025-03-14T05:31:37.8357648Z * [new branch] gh/zou3519/1110/orig -> origin/gh/zou3519/1110/orig 2025-03-14T05:31:37.8360077Z * [new branch] gh/zou3519/1111/base -> origin/gh/zou3519/1111/base 2025-03-14T05:31:37.8361827Z * [new branch] gh/zou3519/1111/head -> origin/gh/zou3519/1111/head 2025-03-14T05:31:37.8363598Z * [new branch] gh/zou3519/1111/orig -> origin/gh/zou3519/1111/orig 2025-03-14T05:31:37.8365830Z * [new branch] gh/zou3519/1112/base -> origin/gh/zou3519/1112/base 2025-03-14T05:31:37.8367489Z * [new branch] gh/zou3519/1112/head -> origin/gh/zou3519/1112/head 2025-03-14T05:31:37.8369614Z * [new branch] gh/zou3519/1112/orig -> origin/gh/zou3519/1112/orig 2025-03-14T05:31:37.8371911Z * [new branch] gh/zou3519/1129/base -> origin/gh/zou3519/1129/base 2025-03-14T05:31:37.8373626Z * [new branch] gh/zou3519/1129/head -> origin/gh/zou3519/1129/head 2025-03-14T05:31:37.8375345Z * [new branch] gh/zou3519/1129/orig -> origin/gh/zou3519/1129/orig 2025-03-14T05:31:37.8377702Z * [new branch] gh/zou3519/1130/base -> origin/gh/zou3519/1130/base 2025-03-14T05:31:37.8379354Z * [new branch] gh/zou3519/1130/head -> origin/gh/zou3519/1130/head 2025-03-14T05:31:37.8381106Z * [new branch] gh/zou3519/1130/orig -> origin/gh/zou3519/1130/orig 2025-03-14T05:31:37.8383688Z * [new branch] gh/zou3519/1134/base -> origin/gh/zou3519/1134/base 2025-03-14T05:31:37.8385338Z * [new branch] gh/zou3519/1134/head -> origin/gh/zou3519/1134/head 2025-03-14T05:31:37.8387787Z * [new branch] gh/zou3519/1135/base -> origin/gh/zou3519/1135/base 2025-03-14T05:31:37.8389436Z * [new 
branch] gh/zou3519/1135/head -> origin/gh/zou3519/1135/head 2025-03-14T05:31:37.8391094Z * [new branch] gh/zou3519/1135/orig -> origin/gh/zou3519/1135/orig 2025-03-14T05:31:37.8393288Z * [new branch] gh/zou3519/1136/base -> origin/gh/zou3519/1136/base 2025-03-14T05:31:37.8395009Z * [new branch] gh/zou3519/1136/head -> origin/gh/zou3519/1136/head 2025-03-14T05:31:37.8396635Z * [new branch] gh/zou3519/1136/orig -> origin/gh/zou3519/1136/orig 2025-03-14T05:31:37.8398918Z * [new branch] gh/zou3519/1137/base -> origin/gh/zou3519/1137/base 2025-03-14T05:31:37.8401105Z * [new branch] gh/zou3519/1137/head -> origin/gh/zou3519/1137/head 2025-03-14T05:31:37.8402738Z * [new branch] gh/zou3519/1137/orig -> origin/gh/zou3519/1137/orig 2025-03-14T05:31:37.8405025Z * [new branch] gh/zou3519/1138/base -> origin/gh/zou3519/1138/base 2025-03-14T05:31:37.8406641Z * [new branch] gh/zou3519/1138/head -> origin/gh/zou3519/1138/head 2025-03-14T05:31:37.8408341Z * [new branch] gh/zou3519/1138/orig -> origin/gh/zou3519/1138/orig 2025-03-14T05:31:37.8410717Z * [new branch] gh/zou3519/1139/base -> origin/gh/zou3519/1139/base 2025-03-14T05:31:37.8412272Z * [new branch] gh/zou3519/1139/head -> origin/gh/zou3519/1139/head 2025-03-14T05:31:37.8413885Z * [new branch] gh/zou3519/1139/orig -> origin/gh/zou3519/1139/orig 2025-03-14T05:31:37.8416831Z * [new branch] gh/zou3519/1140/base -> origin/gh/zou3519/1140/base 2025-03-14T05:31:37.8418607Z * [new branch] gh/zou3519/1140/head -> origin/gh/zou3519/1140/head 2025-03-14T05:31:37.8420318Z * [new branch] gh/zou3519/1140/orig -> origin/gh/zou3519/1140/orig 2025-03-14T05:31:37.8422859Z * [new branch] gh/zou3519/1141/base -> origin/gh/zou3519/1141/base 2025-03-14T05:31:37.8424553Z * [new branch] gh/zou3519/1141/head -> origin/gh/zou3519/1141/head 2025-03-14T05:31:37.8426217Z * [new branch] gh/zou3519/1141/orig -> origin/gh/zou3519/1141/orig 2025-03-14T05:31:37.8428593Z * [new branch] gh/zou3519/1142/base -> origin/gh/zou3519/1142/base 2025-03-14T05:31:37.8430252Z * [new branch] gh/zou3519/1142/head -> origin/gh/zou3519/1142/head 2025-03-14T05:31:37.8432668Z * [new branch] gh/zou3519/1142/orig -> origin/gh/zou3519/1142/orig 2025-03-14T05:31:37.8435528Z * [new branch] gh/zou3519/1143/base -> origin/gh/zou3519/1143/base 2025-03-14T05:31:37.8437214Z * [new branch] gh/zou3519/1143/head -> origin/gh/zou3519/1143/head 2025-03-14T05:31:37.8438941Z * [new branch] gh/zou3519/1143/orig -> origin/gh/zou3519/1143/orig 2025-03-14T05:31:37.8441417Z * [new branch] gh/zou3519/1144/base -> origin/gh/zou3519/1144/base 2025-03-14T05:31:37.8443095Z * [new branch] gh/zou3519/1144/head -> origin/gh/zou3519/1144/head 2025-03-14T05:31:37.8444879Z * [new branch] gh/zou3519/1144/orig -> origin/gh/zou3519/1144/orig 2025-03-14T05:31:37.8447217Z * [new branch] gh/zou3519/1145/base -> origin/gh/zou3519/1145/base 2025-03-14T05:31:37.8449025Z * [new branch] gh/zou3519/1145/head -> origin/gh/zou3519/1145/head 2025-03-14T05:31:37.8450753Z * [new branch] gh/zou3519/1145/orig -> origin/gh/zou3519/1145/orig 2025-03-14T05:31:37.8452987Z * [new branch] gh/zou3519/1146/base -> origin/gh/zou3519/1146/base 2025-03-14T05:31:37.8454684Z * [new branch] gh/zou3519/1146/head -> origin/gh/zou3519/1146/head 2025-03-14T05:31:37.8456403Z * [new branch] gh/zou3519/1146/orig -> origin/gh/zou3519/1146/orig 2025-03-14T05:31:37.8458575Z * [new branch] gh/zou3519/1147/base -> origin/gh/zou3519/1147/base 2025-03-14T05:31:37.8460201Z * [new branch] gh/zou3519/1147/head -> origin/gh/zou3519/1147/head 2025-03-14T05:31:37.8461887Z * [new 
branch] gh/zou3519/1147/orig -> origin/gh/zou3519/1147/orig 2025-03-14T05:31:37.8464160Z * [new branch] gh/zou3519/1148/base -> origin/gh/zou3519/1148/base 2025-03-14T05:31:37.8465887Z * [new branch] gh/zou3519/1148/head -> origin/gh/zou3519/1148/head 2025-03-14T05:31:37.8468340Z * [new branch] gh/zou3519/1149/base -> origin/gh/zou3519/1149/base 2025-03-14T05:31:37.8470395Z * [new branch] gh/zou3519/1149/head -> origin/gh/zou3519/1149/head 2025-03-14T05:31:37.8472498Z * [new branch] gh/zou3519/1149/orig -> origin/gh/zou3519/1149/orig 2025-03-14T05:31:37.8474690Z * [new branch] gh/zou3519/754/base -> origin/gh/zou3519/754/base 2025-03-14T05:31:37.8476336Z * [new branch] gh/zou3519/754/head -> origin/gh/zou3519/754/head 2025-03-14T05:31:37.8478070Z * [new branch] gh/zou3519/754/orig -> origin/gh/zou3519/754/orig 2025-03-14T05:31:37.8481018Z * [new branch] gh/zou3519/916/base -> origin/gh/zou3519/916/base 2025-03-14T05:31:37.8482629Z * [new branch] gh/zou3519/916/head -> origin/gh/zou3519/916/head 2025-03-14T05:31:37.8483903Z * [new branch] google-main -> origin/google-main 2025-03-14T05:31:37.8486144Z * [new branch] guangyey/external_stream -> origin/guangyey/external_stream 2025-03-14T05:31:37.8487633Z * [new branch] guangyey/host_alloc -> origin/guangyey/host_alloc 2025-03-14T05:31:37.8489283Z * [new branch] guangyey/test_2025 -> origin/guangyey/test_2025 2025-03-14T05:31:37.8490911Z * [new branch] guard_system -> origin/guard_system 2025-03-14T05:31:37.8493326Z * [new branch] guilhermeleobas/cherry-pick-55d87d9dfd9 -> origin/guilhermeleobas/cherry-pick-55d87d9dfd9 2025-03-14T05:31:37.8495495Z * [new branch] haozhe/bf16-dynamic-shape -> origin/haozhe/bf16-dynamic-shape 2025-03-14T05:31:37.8497120Z * [new branch] hhh_rand -> origin/hhh_rand 2025-03-14T05:31:37.8498814Z * [new branch] hoy-update-wheel -> origin/hoy-update-wheel 2025-03-14T05:31:37.8501639Z * [new branch] hoy/autofdo/xblock -> origin/hoy/autofdo/xblock 2025-03-14T05:31:37.8503640Z * [new branch] hoy/autotune/nreg -> origin/hoy/autotune/nreg 2025-03-14T05:31:37.8505649Z * [new branch] hoy/autotune/numwarps -> origin/hoy/autotune/numwarps 2025-03-14T05:31:37.8506877Z * [new branch] hoy/mmsplitk -> origin/hoy/mmsplitk 2025-03-14T05:31:37.8508472Z * [new branch] hoy/triton-PR3973 -> origin/hoy/triton-PR3973 2025-03-14T05:31:37.8510135Z * [new branch] hoy/triton-coalescing-baseline -> origin/hoy/triton-coalescing-baseline 2025-03-14T05:31:37.8511721Z * [new branch] hoy/triton-coalescing-min -> origin/hoy/triton-coalescing-min 2025-03-14T05:31:37.8513583Z * [new branch] hoy/triton-coalescing-new -> origin/hoy/triton-coalescing-new 2025-03-14T05:31:37.8515772Z * [new branch] hoy/triton-coalescing-vec -> origin/hoy/triton-coalescing-vec 2025-03-14T05:31:37.8517463Z * [new branch] improve_vec_log -> origin/improve_vec_log 2025-03-14T05:31:37.8519279Z * [new branch] inductor_layout_opt_rocm_disable -> origin/inductor_layout_opt_rocm_disable 2025-03-14T05:31:37.8520946Z * [new branch] inline -> origin/inline 2025-03-14T05:31:37.8522654Z * [new branch] inlining -> origin/inlining 2025-03-14T05:31:37.8524390Z * [new branch] inlining-ezyang -> origin/inlining-ezyang 2025-03-14T05:31:37.8526102Z * [new branch] int8_sdpa -> origin/int8_sdpa 2025-03-14T05:31:37.8527772Z * [new branch] int8_sdpa_template -> origin/int8_sdpa_template 2025-03-14T05:31:37.8529422Z * [new branch] invoke-subgraph -> origin/invoke-subgraph 2025-03-14T05:31:37.8531035Z * [new branch] ios-mac-m1 -> origin/ios-mac-m1 2025-03-14T05:31:37.8533250Z * [new branch] 
ipiszy/fix -> origin/ipiszy/fix 2025-03-14T05:31:37.8534854Z * [new branch] ipiszy/fp8_test -> origin/ipiszy/fp8_test 2025-03-14T05:31:37.8536381Z * [new branch] ipiszy/mypy -> origin/ipiszy/mypy 2025-03-14T05:31:37.8538144Z * [new branch] issue#58739 -> origin/issue#58739 2025-03-14T05:31:37.8540473Z * [new branch] ivanov/cherry-pick-ckpt-fixes -> origin/ivanov/cherry-pick-ckpt-fixes 2025-03-14T05:31:37.8542232Z * [new branch] jataylo-nvfuser_blocklist -> origin/jataylo-nvfuser_blocklist 2025-03-14T05:31:37.8544665Z * [new branch] jcaip/test-cusparselt-version-0.6.2 -> origin/jcaip/test-cusparselt-version-0.6.2 2025-03-14T05:31:37.8546139Z * [new branch] jcaip/torch-compile-sparse -> origin/jcaip/torch-compile-sparse 2025-03-14T05:31:37.8547882Z * [new branch] jcaip/update-benchmarks -> origin/jcaip/update-benchmarks 2025-03-14T05:31:37.8549541Z * [new branch] jcaip/update-cusparselt-0.6.2 -> origin/jcaip/update-cusparselt-0.6.2 2025-03-14T05:31:37.8552226Z * [new branch] jeanschmidt/manywheel_memory -> origin/jeanschmidt/manywheel_memory 2025-03-14T05:31:37.8553859Z * [new branch] jeanschmidt/pull_ephemeral_runners -> origin/jeanschmidt/pull_ephemeral_runners 2025-03-14T05:31:37.8556308Z * [new branch] jnair/mi300_docker_caching_workflow -> origin/jnair/mi300_docker_caching_workflow 2025-03-14T05:31:37.8558430Z * [new branch] jon-chuang/compile-config-hash -> origin/jon-chuang/compile-config-hash 2025-03-14T05:31:37.8559913Z * [new branch] jon-chuang/compile-ignored -> origin/jon-chuang/compile-ignored 2025-03-14T05:31:37.8562234Z * [new branch] justinchu/onnxscript-0.2.2 -> origin/justinchu/onnxscript-0.2.2 2025-03-14T05:31:37.8564158Z * [new branch] justinchu/redundant-move -> origin/justinchu/redundant-move 2025-03-14T05:31:37.8565617Z * [new branch] justinchu/retrace-jit -> origin/justinchu/retrace-jit 2025-03-14T05:31:37.8567338Z * [new branch] justinchuby-patch-1 -> origin/justinchuby-patch-1 2025-03-14T05:31:37.8570100Z * [new branch] jwagantall/migrate-checkout -> origin/jwagantall/migrate-checkout 2025-03-14T05:31:37.8572290Z * [new branch] jz/istft -> origin/jz/istft 2025-03-14T05:31:37.8573930Z * [new branch] jz/stft-old-fc -> origin/jz/stft-old-fc 2025-03-14T05:31:37.8576101Z * [new branch] kadeng/dev-1 -> origin/kadeng/dev-1 2025-03-14T05:31:37.8578665Z * [new branch] kadeng/inductor-backend/cutlass-evt-fusion-1 -> origin/kadeng/inductor-backend/cutlass-evt-fusion-1 2025-03-14T05:31:37.8580303Z * [new branch] kadeng/inductor-cutlass-epilogue -> origin/kadeng/inductor-cutlass-epilogue 2025-03-14T05:31:37.8582464Z * [new branch] kenjin/call_method_userdefined -> origin/kenjin/call_method_userdefined 2025-03-14T05:31:37.8583919Z * [new branch] kenjin/lambdas -> origin/kenjin/lambdas 2025-03-14T05:31:37.8585596Z * [new branch] kenjin/norefcycles -> origin/kenjin/norefcycles 2025-03-14T05:31:37.8587328Z * [new branch] kit1980-patch-2 -> origin/kit1980-patch-2 2025-03-14T05:31:37.8589160Z * [new branch] kleidiai_bf16_issue_fix -> origin/kleidiai_bf16_issue_fix 2025-03-14T05:31:37.8590948Z * [new branch] kleidiai_submodule_update -> origin/kleidiai_submodule_update 2025-03-14T05:31:37.8592795Z * [new branch] larryliu0820-patch-1 -> origin/larryliu0820-patch-1 2025-03-14T05:31:37.8595318Z * [new branch] leslie/enable_poc_reduction_fusion -> origin/leslie/enable_poc_reduction_fusion 2025-03-14T05:31:37.8596872Z * [new branch] leslie/test_group_gemm_epilogues -> origin/leslie/test_group_gemm_epilogues 2025-03-14T05:31:37.8599812Z * [new branch] lts/release/1.8 -> 
origin/lts/release/1.8 2025-03-14T05:31:37.8601711Z * [new branch] main -> origin/main 2025-03-14T05:31:37.8603419Z * [new branch] main_dev_hhh -> origin/main_dev_hhh 2025-03-14T05:31:37.8605181Z * [new branch] malfet-patch-1 -> origin/malfet-patch-1 2025-03-14T05:31:37.8607031Z * [new branch] malfet-patch-10 -> origin/malfet-patch-10 2025-03-14T05:31:37.8608791Z * [new branch] malfet-patch-19 -> origin/malfet-patch-19 2025-03-14T05:31:37.8610724Z * [new branch] malfet-patch-2 -> origin/malfet-patch-2 2025-03-14T05:31:37.8612490Z * [new branch] malfet-patch-23 -> origin/malfet-patch-23 2025-03-14T05:31:37.8614253Z * [new branch] malfet-patch-3 -> origin/malfet-patch-3 2025-03-14T05:31:37.8615916Z * [new branch] malfet-patch-32 -> origin/malfet-patch-32 2025-03-14T05:31:37.8617684Z * [new branch] malfet-patch-42 -> origin/malfet-patch-42 2025-03-14T05:31:37.8619442Z * [new branch] malfet-patch-5 -> origin/malfet-patch-5 2025-03-14T05:31:37.8621257Z * [new branch] malfet-patch-6 -> origin/malfet-patch-6 2025-03-14T05:31:37.8623045Z * [new branch] malfet-patch-8 -> origin/malfet-patch-8 2025-03-14T05:31:37.8625407Z * [new branch] malfet/add-benchmark-func -> origin/malfet/add-benchmark-func 2025-03-14T05:31:37.8627192Z * [new branch] malfet/delete-find-openmp -> origin/malfet/delete-find-openmp 2025-03-14T05:31:37.8628800Z * [new branch] malfet/mps-fix-rand-5d -> origin/malfet/mps-fix-rand-5d 2025-03-14T05:31:37.8630575Z * [new branch] malfet/mps-fix-strided-logic -> origin/malfet/mps-fix-strided-logic 2025-03-14T05:31:37.8635193Z * [new branch] malfet/mps-implement-col2im -> origin/malfet/mps-implement-col2im 2025-03-14T05:31:37.8636871Z * [new branch] maxautotune_big_gpu -> origin/maxautotune_big_gpu 2025-03-14T05:31:37.8638559Z * [new branch] mem-leak -> origin/mem-leak 2025-03-14T05:31:37.8640300Z * [new branch] mem-leak1 -> origin/mem-leak1 2025-03-14T05:31:37.8642046Z * [new branch] migrate_map -> origin/migrate_map 2025-03-14T05:31:37.8643937Z * [new branch] missing_gloo_causes_deadlock -> origin/missing_gloo_causes_deadlock 2025-03-14T05:31:37.8646118Z * [new branch] mlazos/S429861-debug -> origin/mlazos/S429861-debug 2025-03-14T05:31:37.8647661Z * [new branch] mlazos/aa -> origin/mlazos/aa 2025-03-14T05:31:37.8649273Z * [new branch] mlazos/adam-compiled -> origin/mlazos/adam-compiled 2025-03-14T05:31:37.8650848Z * [new branch] mlazos/adam-fused-bench -> origin/mlazos/adam-fused-bench 2025-03-14T05:31:37.8652373Z * [new branch] mlazos/adam-fused-bench2 -> origin/mlazos/adam-fused-bench2 2025-03-14T05:31:37.8653916Z * [new branch] mlazos/adam-test2 -> origin/mlazos/adam-test2 2025-03-14T05:31:37.8655426Z * [new branch] mlazos/aux-vars -> origin/mlazos/aux-vars 2025-03-14T05:31:37.8657361Z * [new branch] mlazos/backup-test-branch -> origin/mlazos/backup-test-branch 2025-03-14T05:31:37.8659289Z * [new branch] mlazos/bad-cudagraphs -> origin/mlazos/bad-cudagraphs 2025-03-14T05:31:37.8661484Z * [new branch] mlazos/baseline -> origin/mlazos/baseline 2025-03-14T05:31:37.8663176Z * [new branch] mlazos/baseline-graph-breaks -> origin/mlazos/baseline-graph-breaks 2025-03-14T05:31:37.8664920Z * [new branch] mlazos/batch-fuse-opt -> origin/mlazos/batch-fuse-opt 2025-03-14T05:31:37.8666604Z * [new branch] mlazos/beta-tensor -> origin/mlazos/beta-tensor 2025-03-14T05:31:37.8668560Z * [new branch] mlazos/buff-opt2 -> origin/mlazos/buff-opt2 2025-03-14T05:31:37.8670269Z * [new branch] mlazos/buffers -> origin/mlazos/buffers 2025-03-14T05:31:37.8671780Z * [new branch] mlazos/buffers2 -> 
origin/mlazos/buffers2 2025-03-14T05:31:37.8673431Z * [new branch] mlazos/buffers3 -> origin/mlazos/buffers3 2025-03-14T05:31:37.8675735Z * [new branch] mlazos/ck2 -> origin/mlazos/ck2 2025-03-14T05:31:37.8677936Z * [new branch] mlazos/combokernels -> origin/mlazos/combokernels 2025-03-14T05:31:37.8679652Z * [new branch] mlazos/compiled-nadam -> origin/mlazos/compiled-nadam 2025-03-14T05:31:37.8681286Z * [new branch] mlazos/concat2 -> origin/mlazos/concat2 2025-03-14T05:31:37.8682892Z * [new branch] mlazos/copy2 -> origin/mlazos/copy2 2025-03-14T05:31:37.8684737Z * [new branch] mlazos/cudagraph-tests -> origin/mlazos/cudagraph-tests 2025-03-14T05:31:37.8686453Z * [new branch] mlazos/cudagraphs-measurement -> origin/mlazos/cudagraphs-measurement 2025-03-14T05:31:37.8688131Z * [new branch] mlazos/data-gather -> origin/mlazos/data-gather 2025-03-14T05:31:37.8689797Z * [new branch] mlazos/data-ptrs2 -> origin/mlazos/data-ptrs2 2025-03-14T05:31:37.8691899Z * [new branch] mlazos/data-ptrs3 -> origin/mlazos/data-ptrs3 2025-03-14T05:31:37.8693608Z * [new branch] mlazos/dataclass-proxy -> origin/mlazos/dataclass-proxy 2025-03-14T05:31:37.8695507Z * [new branch] mlazos/disable-closures -> origin/mlazos/disable-closures 2025-03-14T05:31:37.8697297Z * [new branch] mlazos/disabled-opt -> origin/mlazos/disabled-opt 2025-03-14T05:31:37.8698774Z * [new branch] mlazos/evt -> origin/mlazos/evt 2025-03-14T05:31:37.8700543Z * [new branch] mlazos/exp_disable -> origin/mlazos/exp_disable 2025-03-14T05:31:37.8702226Z * [new branch] mlazos/faster -> origin/mlazos/faster 2025-03-14T05:31:37.8703885Z * [new branch] mlazos/faster2 -> origin/mlazos/faster2 2025-03-14T05:31:37.8705715Z * [new branch] mlazos/fe-copy -> origin/mlazos/fe-copy 2025-03-14T05:31:37.8707394Z * [new branch] mlazos/foreach-op -> origin/mlazos/foreach-op 2025-03-14T05:31:37.8709138Z * [new branch] mlazos/foreach-reds -> origin/mlazos/foreach-reds 2025-03-14T05:31:37.8710814Z * [new branch] mlazos/freezing -> origin/mlazos/freezing 2025-03-14T05:31:37.8712497Z * [new branch] mlazos/gen-foreach -> origin/mlazos/gen-foreach 2025-03-14T05:31:37.8714151Z * [new branch] mlazos/h-comp -> origin/mlazos/h-comp 2025-03-14T05:31:37.8715915Z * [new branch] mlazos/h-comp2 -> origin/mlazos/h-comp2 2025-03-14T05:31:37.8717545Z * [new branch] mlazos/hc-hf -> origin/mlazos/hc-hf 2025-03-14T05:31:37.8719258Z * [new branch] mlazos/init-per-param -> origin/mlazos/init-per-param 2025-03-14T05:31:37.8720895Z * [new branch] mlazos/init_per_param -> origin/mlazos/init_per_param 2025-03-14T05:31:37.8722661Z * [new branch] mlazos/less-guards -> origin/mlazos/less-guards 2025-03-14T05:31:37.8724367Z * [new branch] mlazos/lr-composibility -> origin/mlazos/lr-composibility 2025-03-14T05:31:37.8726040Z * [new branch] mlazos/main-test-enablement -> origin/mlazos/main-test-enablement 2025-03-14T05:31:37.8727640Z * [new branch] mlazos/main2 -> origin/mlazos/main2 2025-03-14T05:31:37.8729616Z * [new branch] mlazos/main_test -> origin/mlazos/main_test 2025-03-14T05:31:37.8731255Z * [new branch] mlazos/mcg -> origin/mlazos/mcg 2025-03-14T05:31:37.8733027Z * [new branch] mlazos/mcg2 -> origin/mlazos/mcg2 2025-03-14T05:31:37.8734843Z * [new branch] mlazos/meta-guards -> origin/mlazos/meta-guards 2025-03-14T05:31:37.8736934Z * [new branch] mlazos/mlazos/ck2 -> origin/mlazos/mlazos/ck2 2025-03-14T05:31:37.8738440Z * [new branch] mlazos/mlazos/clean -> origin/mlazos/mlazos/clean 2025-03-14T05:31:37.8740052Z * [new branch] mlazos/mlazos/faster2 -> origin/mlazos/mlazos/faster2 
2025-03-14T05:31:37.8741770Z * [new branch] mlazos/mlazos/foreach-map-adam -> origin/mlazos/mlazos/foreach-map-adam 2025-03-14T05:31:37.8743360Z * [new branch] mlazos/mlazos/subclass-test -> origin/mlazos/mlazos/subclass-test 2025-03-14T05:31:37.8745062Z * [new branch] mlazos/mlazos/tf-mode-backup -> origin/mlazos/mlazos/tf-mode-backup 2025-03-14T05:31:37.8746615Z * [new branch] mlazos/mlazos/tf-trace-full -> origin/mlazos/mlazos/tf-trace-full 2025-03-14T05:31:37.8748420Z * [new branch] mlazos/mod-fix -> origin/mlazos/mod-fix 2025-03-14T05:31:37.8750110Z * [new branch] mlazos/more-tests -> origin/mlazos/more-tests 2025-03-14T05:31:37.8751795Z * [new branch] mlazos/mutable-backup -> origin/mlazos/mutable-backup 2025-03-14T05:31:37.8753429Z * [new branch] mlazos/mv-tfo -> origin/mlazos/mv-tfo 2025-03-14T05:31:37.8755165Z * [new branch] mlazos/no-cpp -> origin/mlazos/no-cpp 2025-03-14T05:31:37.8757180Z * [new branch] mlazos/no-init-group-handling -> origin/mlazos/no-init-group-handling 2025-03-14T05:31:37.8758721Z * [new branch] mlazos/op-investigation -> origin/mlazos/op-investigation 2025-03-14T05:31:37.8760372Z * [new branch] mlazos/opt-bench-exp2 -> origin/mlazos/opt-bench-exp2 2025-03-14T05:31:37.8762017Z * [new branch] mlazos/opt-bench2 -> origin/mlazos/opt-bench2 2025-03-14T05:31:37.8763689Z * [new branch] mlazos/opt-bench3 -> origin/mlazos/opt-bench3 2025-03-14T05:31:37.8765272Z * [new branch] mlazos/opt-incr -> origin/mlazos/opt-incr 2025-03-14T05:31:37.8767149Z * [new branch] mlazos/opt-recipe -> origin/mlazos/opt-recipe 2025-03-14T05:31:37.8769118Z * [new branch] mlazos/opt-slowdown -> origin/mlazos/opt-slowdown 2025-03-14T05:31:37.8773832Z * [new branch] mlazos/proxy-ctors -> origin/mlazos/proxy-ctors 2025-03-14T05:31:37.8775495Z * [new branch] mlazos/proxy-opt -> origin/mlazos/proxy-opt 2025-03-14T05:31:37.8777113Z * [new branch] mlazos/pt -> origin/mlazos/pt 2025-03-14T05:31:37.8778900Z * [new branch] mlazos/restart -> origin/mlazos/restart 2025-03-14T05:31:37.8780506Z * [new branch] mlazos/rtp -> origin/mlazos/rtp 2025-03-14T05:31:37.8782155Z * [new branch] mlazos/sdpa-driss -> origin/mlazos/sdpa-driss 2025-03-14T05:31:37.8783853Z * [new branch] mlazos/static-inputs-log -> origin/mlazos/static-inputs-log 2025-03-14T05:31:37.8785395Z * [new branch] mlazos/subclass-test -> origin/mlazos/subclass-test 2025-03-14T05:31:37.8787099Z * [new branch] mlazos/td-fix2 -> origin/mlazos/td-fix2 2025-03-14T05:31:37.8788757Z * [new branch] mlazos/tensor-hasattr2 -> origin/mlazos/tensor-hasattr2 2025-03-14T05:31:37.8790424Z * [new branch] mlazos/tensor-inherit-backup -> origin/mlazos/tensor-inherit-backup 2025-03-14T05:31:37.8792062Z * [new branch] mlazos/tensor-like-fix -> origin/mlazos/tensor-like-fix 2025-03-14T05:31:37.8793664Z * [new branch] mlazos/tensor-lr -> origin/mlazos/tensor-lr 2025-03-14T05:31:37.8796050Z * [new branch] mlazos/tensor-lr2 -> origin/mlazos/tensor-lr2 2025-03-14T05:31:37.8797623Z * [new branch] mlazos/tf-inherit -> origin/mlazos/tf-inherit 2025-03-14T05:31:37.8799273Z * [new branch] mlazos/tf-mode -> origin/mlazos/tf-mode 2025-03-14T05:31:37.8801002Z * [new branch] mlazos/tf-mode-backup2 -> origin/mlazos/tf-mode-backup2 2025-03-14T05:31:37.8803199Z * [new branch] mlazos/tf-mode-reland -> origin/mlazos/tf-mode-reland 2025-03-14T05:31:37.8805052Z * [new branch] mlazos/tf-mode-reland2 -> origin/mlazos/tf-mode-reland2 2025-03-14T05:31:37.8806812Z * [new branch] mlazos/tf-mode-reland3 -> origin/mlazos/tf-mode-reland3 2025-03-14T05:31:37.8808473Z * [new branch] 
mlazos/tf-refactor -> origin/mlazos/tf-refactor 2025-03-14T05:31:37.8810181Z * [new branch] mlazos/tf-subclass-stack -> origin/mlazos/tf-subclass-stack 2025-03-14T05:31:37.8811855Z * [new branch] mlazos/tf-trace-full -> origin/mlazos/tf-trace-full 2025-03-14T05:31:37.8813505Z * [new branch] mlazos/th -> origin/mlazos/th 2025-03-14T05:31:37.8815200Z * [new branch] mlazos/tune-proto -> origin/mlazos/tune-proto 2025-03-14T05:31:37.8817377Z * [new branch] mlazos/vary-beta -> origin/mlazos/vary-beta 2025-03-14T05:31:37.8819087Z * [new branch] mlazos/vary-beta2 -> origin/mlazos/vary-beta2 2025-03-14T05:31:37.8820788Z * [new branch] mlazos/weird-perf1 -> origin/mlazos/weird-perf1 2025-03-14T05:31:37.8822638Z * [new branch] mod_guards1 -> origin/mod_guards1 2025-03-14T05:31:37.8824239Z * [new branch] mod_guards3 -> origin/mod_guards3 2025-03-14T05:31:37.8825905Z * [new branch] moderniz29_cyy -> origin/moderniz29_cyy 2025-03-14T05:31:37.8827663Z * [new branch] mps-linear-1d -> origin/mps-linear-1d 2025-03-14T05:31:37.8829904Z * [new branch] mradmila/host_stats -> origin/mradmila/host_stats 2025-03-14T05:31:37.8832084Z * [new branch] msaroufim-patch-10 -> origin/msaroufim-patch-10 2025-03-14T05:31:37.8833922Z * [new branch] msaroufim-patch-11 -> origin/msaroufim-patch-11 2025-03-14T05:31:37.8835803Z * [new branch] msaroufim-patch-12 -> origin/msaroufim-patch-12 2025-03-14T05:31:37.8837577Z * [new branch] msaroufim-patch-13 -> origin/msaroufim-patch-13 2025-03-14T05:31:37.8839324Z * [new branch] msaroufim-patch-14 -> origin/msaroufim-patch-14 2025-03-14T05:31:37.8841532Z * [new branch] msaroufim/cache -> origin/msaroufim/cache 2025-03-14T05:31:37.8843289Z * [new branch] msaroufim/dtensorfusedadam -> origin/msaroufim/dtensorfusedadam 2025-03-14T05:31:37.8844777Z * [new branch] msaroufim/warn_once -> origin/msaroufim/warn_once 2025-03-14T05:31:37.8846460Z * [new branch] mypy_fix -> origin/mypy_fix 2025-03-14T05:31:37.8848328Z * [new branch] myst_nb_trial -> origin/myst_nb_trial 2025-03-14T05:31:37.8850318Z * [new branch] nWEIdia-patch-1 -> origin/nWEIdia-patch-1 2025-03-14T05:31:37.8852014Z * [new branch] nestedfairseq2ops1 -> origin/nestedfairseq2ops1 2025-03-14T05:31:37.8853847Z * [new branch] new-batch-norm -> origin/new-batch-norm 2025-03-14T05:31:37.8855602Z * [new branch] new_guard_system -> origin/new_guard_system 2025-03-14T05:31:37.8857893Z * [new branch] ngimel/bits -> origin/ngimel/bits 2025-03-14T05:31:37.8859589Z * [new branch] ngimel/copy2d -> origin/ngimel/copy2d 2025-03-14T05:31:37.8861124Z * [new branch] ngimel/gg -> origin/ngimel/gg 2025-03-14T05:31:37.8862686Z * [new branch] ngimel/gg_new -> origin/ngimel/gg_new 2025-03-14T05:31:37.8864614Z * [new branch] nightly -> origin/nightly 2025-03-14T05:31:37.8866960Z * [new branch] nikitaved/solve_doc_update -> origin/nikitaved/solve_doc_update 2025-03-14T05:31:37.8868533Z * [new branch] nikitaved/tensordot -> origin/nikitaved/tensordot 2025-03-14T05:31:37.8870593Z * [new branch] offline -> origin/offline 2025-03-14T05:31:37.8872540Z * [new branch] openblas_gemv -> origin/openblas_gemv 2025-03-14T05:31:37.8875504Z * [new branch] orig/release/1.10 -> origin/orig/release/1.10 2025-03-14T05:31:37.8877379Z * [new branch] orig/release/1.11 -> origin/orig/release/1.11 2025-03-14T05:31:37.8879115Z * [new branch] orig/release/1.12 -> origin/orig/release/1.12 2025-03-14T05:31:37.8881011Z * [new branch] orig/release/1.13 -> origin/orig/release/1.13 2025-03-14T05:31:37.8882669Z * [new branch] orig/release/1.6 -> origin/orig/release/1.6 
2025-03-14T05:31:37.8884606Z * [new branch] orig/release/1.7 -> origin/orig/release/1.7 2025-03-14T05:31:37.8886309Z * [new branch] orig/release/1.8 -> origin/orig/release/1.8 2025-03-14T05:31:37.8888126Z * [new branch] orig/release/1.9 -> origin/orig/release/1.9 2025-03-14T05:31:37.8889901Z * [new branch] orig/release/2.0 -> origin/orig/release/2.0 2025-03-14T05:31:37.8891492Z * [new branch] orig/release/2.1 -> origin/orig/release/2.1 2025-03-14T05:31:37.8893710Z * [new branch] orig/release/2.2 -> origin/orig/release/2.2 2025-03-14T05:31:37.8895420Z * [new branch] orig/release/2.3 -> origin/orig/release/2.3 2025-03-14T05:31:37.8897059Z * [new branch] orig/release/2.4 -> origin/orig/release/2.4 2025-03-14T05:31:37.8898703Z * [new branch] orig/release/2.5 -> origin/orig/release/2.5 2025-03-14T05:31:37.8900368Z * [new branch] orig/release/2.6 -> origin/orig/release/2.6 2025-03-14T05:31:37.8902421Z * [new branch] orig/release/2.7 -> origin/orig/release/2.7 2025-03-14T05:31:37.8906208Z * [new branch] origin/gh/stroxler/1/head -> origin/origin/gh/stroxler/1/head 2025-03-14T05:31:37.8908488Z * [new branch] origin/voz/serde -> origin/origin/voz/serde 2025-03-14T05:31:37.8910737Z * [new branch] oulgen/fx_graph -> origin/oulgen/fx_graph 2025-03-14T05:31:37.8912479Z * [new branch] padded-tensor -> origin/padded-tensor 2025-03-14T05:31:37.8914534Z * [new branch] palic_hotfix -> origin/palic_hotfix 2025-03-14T05:31:37.8916945Z * [new branch] parallel_cat -> origin/parallel_cat 2025-03-14T05:31:37.8918656Z * [new branch] parallel_reduce -> origin/parallel_reduce 2025-03-14T05:31:37.8920421Z * [new branch] pca2 -> origin/pca2 2025-03-14T05:31:37.8922507Z * [new branch] pianpwk/backed_size_oblivious -> origin/pianpwk/backed_size_oblivious 2025-03-14T05:31:37.8924076Z * [new branch] pianpwk/backed_size_oblivious_global -> origin/pianpwk/backed_size_oblivious_global 2025-03-14T05:31:37.8925593Z * [new branch] pianpwk/backed_symint_endofbounds -> origin/pianpwk/backed_symint_endofbounds 2025-03-14T05:31:37.8927128Z * [new branch] pianpwk/clear_pending_unbacked -> origin/pianpwk/clear_pending_unbacked 2025-03-14T05:31:37.8928642Z * [new branch] pianpwk/draft_strict_stack -> origin/pianpwk/draft_strict_stack 2025-03-14T05:31:37.8930666Z * [new branch] pianpwk/inductor_unbacked_symint -> origin/pianpwk/inductor_unbacked_symint 2025-03-14T05:31:37.8932723Z * [new branch] pianpwk/pre_forward_hook -> origin/pianpwk/pre_forward_hook 2025-03-14T05:31:37.8934407Z * [new branch] pianpwk/symbol_provenance_v1 -> origin/pianpwk/symbol_provenance_v1 2025-03-14T05:31:37.8936211Z * [new branch] pianpwk/torchbench_combine_args -> origin/pianpwk/torchbench_combine_args 2025-03-14T05:31:37.8937863Z * [new branch] pianpwk/treat_sizes_as_size_like -> origin/pianpwk/treat_sizes_as_size_like 2025-03-14T05:31:37.8939611Z * [new branch] pianpwk/unbacked_bindings -> origin/pianpwk/unbacked_bindings 2025-03-14T05:31:37.8941350Z * [new branch] plain-metal-mul-kernel -> origin/plain-metal-mul-kernel 2025-03-14T05:31:37.8943046Z * [new branch] polyfill-class -> origin/polyfill-class 2025-03-14T05:31:37.8945369Z * [new branch] pr/131860 -> origin/pr/131860 2025-03-14T05:31:37.8947246Z * [new branch] pr149164 -> origin/pr149164 2025-03-14T05:31:37.8949110Z * [new branch] prepare-android-artifacts -> origin/prepare-android-artifacts 2025-03-14T05:31:37.8951038Z * [new branch] print_hostname_rocm_runners -> origin/print_hostname_rocm_runners 2025-03-14T05:31:37.8952801Z * [new branch] pt-debug-cpu0 -> origin/pt-debug-cpu0 
2025-03-14T05:31:37.8954697Z * [new branch] pt-opt-cuda3 -> origin/pt-opt-cuda3 2025-03-14T05:31:37.8956758Z * [new branch] python_compiled_autograd -> origin/python_compiled_autograd 2025-03-14T05:31:37.8958412Z * [new branch] qat-conv-bn-1d -> origin/qat-conv-bn-1d 2025-03-14T05:31:37.8960351Z * [new branch] qat-remove-bias-temp -> origin/qat-remove-bias-temp 2025-03-14T05:31:37.8962085Z * [new branch] qat_cudnn_batchnorm -> origin/qat_cudnn_batchnorm 2025-03-14T05:31:37.8963904Z * [new branch] qat_preserve_source_fn_stack -> origin/qat_preserve_source_fn_stack 2025-03-14T05:31:37.8966627Z * [new branch] qchip/export-D54134695 -> origin/qchip/export-D54134695 2025-03-14T05:31:37.8968591Z * [new branch] raggedsdpa -> origin/raggedsdpa 2025-03-14T05:31:37.8970532Z * [new branch] reenable-sgd-benchmark -> origin/reenable-sgd-benchmark 2025-03-14T05:31:37.8972313Z * [new branch] refactor-adamw -> origin/refactor-adamw 2025-03-14T05:31:37.8974797Z * [new branch] release/1.10 -> origin/release/1.10 2025-03-14T05:31:37.8976559Z * [new branch] release/1.11 -> origin/release/1.11 2025-03-14T05:31:37.8978269Z * [new branch] release/1.12 -> origin/release/1.12 2025-03-14T05:31:37.8979974Z * [new branch] release/1.13 -> origin/release/1.13 2025-03-14T05:31:37.8981633Z * [new branch] release/1.4 -> origin/release/1.4 2025-03-14T05:31:37.8983095Z * [new branch] release/1.4.1 -> origin/release/1.4.1 2025-03-14T05:31:37.8984802Z * [new branch] release/1.5 -> origin/release/1.5 2025-03-14T05:31:37.8986696Z * [new branch] release/1.6 -> origin/release/1.6 2025-03-14T05:31:37.8988407Z * [new branch] release/1.7 -> origin/release/1.7 2025-03-14T05:31:37.8990280Z * [new branch] release/1.8 -> origin/release/1.8 2025-03-14T05:31:37.8991909Z * [new branch] release/1.9 -> origin/release/1.9 2025-03-14T05:31:37.8993563Z * [new branch] release/2.0 -> origin/release/2.0 2025-03-14T05:31:37.8995497Z * [new branch] release/2.1 -> origin/release/2.1 2025-03-14T05:31:37.8997189Z * [new branch] release/2.2 -> origin/release/2.2 2025-03-14T05:31:37.8999254Z * [new branch] release/2.3 -> origin/release/2.3 2025-03-14T05:31:37.9001433Z * [new branch] release/2.4 -> origin/release/2.4 2025-03-14T05:31:37.9003790Z * [new branch] release/2.5 -> origin/release/2.5 2025-03-14T05:31:37.9005669Z * [new branch] release/2.6 -> origin/release/2.6 2025-03-14T05:31:37.9007367Z * [new branch] release/2.7 -> origin/release/2.7 2025-03-14T05:31:37.9009248Z * [new branch] release_notes -> origin/release_notes 2025-03-14T05:31:37.9011405Z * [new branch] remove-edit-on-github -> origin/remove-edit-on-github 2025-03-14T05:31:37.9013022Z * [new branch] remove-link-survey -> origin/remove-link-survey 2025-03-14T05:31:37.9014917Z * [new branch] remove_global_ns -> origin/remove_global_ns 2025-03-14T05:31:37.9016848Z * [new branch] requires_grad_fix -> origin/requires_grad_fix 2025-03-14T05:31:37.9019570Z * [new branch] revert-111036-skylion007/backport-2-1-1-2023-10-11-0 -> origin/revert-111036-skylion007/backport-2-1-1-2023-10-11-0 2025-03-14T05:31:37.9021165Z * [new branch] revert-112125 -> origin/revert-112125 2025-03-14T05:31:37.9024756Z * [new branch] revert-131069-gh/krzysztofjordan/1/head -> origin/revert-131069-gh/krzysztofjordan/1/head 2025-03-14T05:31:37.9028609Z * [new branch] revert-131469-gh/andrewor14/51/head -> origin/revert-131469-gh/andrewor14/51/head 2025-03-14T05:31:37.9030897Z * [new branch] revert_realize_input_ExternKernel -> origin/revert_realize_input_ExternKernel 2025-03-14T05:31:37.9032797Z * [new branch] 
rohan-varma-patch-13 -> origin/rohan-varma-patch-13 2025-03-14T05:31:37.9034885Z * [new branch] rohan-varma-patch-14 -> origin/rohan-varma-patch-14 2025-03-14T05:31:37.9036747Z * [new branch] rohan-varma-patch-15 -> origin/rohan-varma-patch-15 2025-03-14T05:31:37.9038737Z * [new branch] rohan-varma-patch-16 -> origin/rohan-varma-patch-16 2025-03-14T05:31:37.9040578Z * [new branch] rprop-playground -> origin/rprop-playground 2025-03-14T05:31:37.9042386Z * [new branch] run-ios-test-device-farm -> origin/run-ios-test-device-farm 2025-03-14T05:31:37.9044976Z * [new branch] ryanguo99/cleanup-dynamo-expected-failures -> origin/ryanguo99/cleanup-dynamo-expected-failures 2025-03-14T05:31:37.9046462Z * [new branch] ryanguo99/fix-closure-var -> origin/ryanguo99/fix-closure-var 2025-03-14T05:31:37.9048837Z * [new branch] rzou/cache_name -> origin/rzou/cache_name 2025-03-14T05:31:37.9050450Z * [new branch] rzou/faketensor_bench -> origin/rzou/faketensor_bench 2025-03-14T05:31:37.9052070Z * [new branch] rzou/fix -> origin/rzou/fix 2025-03-14T05:31:37.9053642Z * [new branch] rzou/fix2 -> origin/rzou/fix2 2025-03-14T05:31:37.9055339Z * [new branch] rzou/njt -> origin/rzou/njt 2025-03-14T05:31:37.9056952Z * [new branch] rzou/operator -> origin/rzou/operator 2025-03-14T05:31:37.9058781Z * [new branch] rzou/pca -> origin/rzou/pca 2025-03-14T05:31:37.9060411Z * [new branch] rzou/pipe_split -> origin/rzou/pipe_split 2025-03-14T05:31:37.9062054Z * [new branch] rzou/realprop -> origin/rzou/realprop 2025-03-14T05:31:37.9063701Z * [new branch] rzou/setup_context -> origin/rzou/setup_context 2025-03-14T05:31:37.9066326Z * [new branch] sanchitintel/fix_llama_da8w8_corner_case -> origin/sanchitintel/fix_llama_da8w8_corner_case 2025-03-14T05:31:37.9068856Z * [new branch] sanchitintel/gemm_template_avoid_malloc_lock_contention -> origin/sanchitintel/gemm_template_avoid_malloc_lock_contention 2025-03-14T05:31:37.9073447Z * [new branch] sanchitintel/modify_fp32_micro_gemm -> origin/sanchitintel/modify_fp32_micro_gemm 2025-03-14T05:31:37.9075715Z * [new branch] sanchitintel/refactor_aten_int8_woq_gemm -> origin/sanchitintel/refactor_aten_int8_woq_gemm 2025-03-14T05:31:37.9078015Z * [new branch] sanchitintel/weird_thing_with_test_cpu_select_algorithm -> origin/sanchitintel/weird_thing_with_test_cpu_select_algorithm 2025-03-14T05:31:37.9079610Z * [new branch] sanchitintel/woq_gemm_buf_size_patch -> origin/sanchitintel/woq_gemm_buf_size_patch 2025-03-14T05:31:37.9082137Z * [new branch] sanchitj/remove_duplicate_line_from_freezing.py -> origin/sanchitj/remove_duplicate_line_from_freezing.py 2025-03-14T05:31:37.9084049Z * [new branch] sapling-pr-archive-SS-JIA -> origin/sapling-pr-archive-SS-JIA 2025-03-14T05:31:37.9086391Z * [new branch] sdpa_autocast_cpu -> origin/sdpa_autocast_cpu 2025-03-14T05:31:37.9088730Z * [new branch] sdym/2.5.1 -> origin/sdym/2.5.1 2025-03-14T05:31:37.9090449Z * [new branch] sdym/docker-python-3.8 -> origin/sdym/docker-python-3.8 2025-03-14T05:31:37.9091995Z * [new branch] sdym/revert-107846 -> origin/sdym/revert-107846 2025-03-14T05:31:37.9093899Z * [new branch] sdym/revert-109859 -> origin/sdym/revert-109859 2025-03-14T05:31:37.9095343Z * [new branch] sdym/skip-asan -> origin/sdym/skip-asan 2025-03-14T05:31:37.9096980Z * [new branch] sdym/todo-docstring -> origin/sdym/todo-docstring 2025-03-14T05:31:37.9098571Z * [new branch] sdym/torchfix -> origin/sdym/torchfix 2025-03-14T05:31:37.9101078Z * [new branch] sdym/torchvision-pretrained -> origin/sdym/torchvision-pretrained 
2025-03-14T05:31:37.9103058Z * [new branch] sdym/typed-storage -> origin/sdym/typed-storage 2025-03-14T05:31:37.9105202Z * [new branch] sdym/wno -> origin/sdym/wno 2025-03-14T05:31:37.9107829Z * [new branch] seemethere/add_h100_nightly_perf_benchmarks -> origin/seemethere/add_h100_nightly_perf_benchmarks 2025-03-14T05:31:37.9109465Z * [new branch] share_and_pin_fork -> origin/share_and_pin_fork 2025-03-14T05:31:37.9111818Z * [new branch] shengf/fx-xform-perf -> origin/shengf/fx-xform-perf 2025-03-14T05:31:37.9113608Z * [new branch] shikaili_fp8_allgather -> origin/shikaili_fp8_allgather 2025-03-14T05:31:37.9115834Z * [new branch] shunting-multi-kernel-2 -> origin/shunting-multi-kernel-2 2025-03-14T05:31:37.9117676Z * [new branch] shunting-multi-kernel-3 -> origin/shunting-multi-kernel-3 2025-03-14T05:31:37.9119862Z * [new branch] shunting-scale-down-rblock -> origin/shunting-scale-down-rblock 2025-03-14T05:31:37.9122013Z * [new branch] shunting-tigher-upperbound -> origin/shunting-tigher-upperbound 2025-03-14T05:31:37.9124104Z * [new branch] shunting-triton-pin-update-5 -> origin/shunting-triton-pin-update-5 2025-03-14T05:31:37.9126012Z * [new branch] simplify-fq-per-channel -> origin/simplify-fq-per-channel 2025-03-14T05:31:37.9128205Z * [new branch] source_fn_stack -> origin/source_fn_stack 2025-03-14T05:31:37.9130067Z * [new branch] speedup-mps-string-key -> origin/speedup-mps-string-key 2025-03-14T05:31:37.9133027Z * [new branch] sqzhang/flight4 -> origin/sqzhang/flight4 2025-03-14T05:31:37.9135123Z * [new branch] sqzhang/flight4plus -> origin/sqzhang/flight4plus 2025-03-14T05:31:37.9137173Z * [new branch] sraikund/record_funct_test -> origin/sraikund/record_funct_test 2025-03-14T05:31:37.9139403Z * [new branch] sraikund16/test -> origin/sraikund16/test 2025-03-14T05:31:37.9141233Z * [new branch] stable-library -> origin/stable-library 2025-03-14T05:31:37.9143261Z * [new branch] subscribe_codeowners_lucasllc -> origin/subscribe_codeowners_lucasllc 2025-03-14T05:31:37.9144961Z * [new branch] super -> origin/super 2025-03-14T05:31:37.9147309Z * [new branch] svekars-patch-7 -> origin/svekars-patch-7 2025-03-14T05:31:37.9149241Z * [new branch] switch-bn -> origin/switch-bn 2025-03-14T05:31:37.9151189Z * [new branch] sympy-bottleneck-repro -> origin/sympy-bottleneck-repro 2025-03-14T05:31:37.9153456Z * [new branch] teja/dcp_poc -> origin/teja/dcp_poc 2025-03-14T05:31:37.9155380Z * [new branch] tensor_life -> origin/tensor_life 2025-03-14T05:31:37.9157280Z * [new branch] tensordict_integration -> origin/tensordict_integration 2025-03-14T05:31:37.9159151Z * [new branch] test-move-conda-builds -> origin/test-move-conda-builds 2025-03-14T05:31:37.9161018Z * [new branch] test-torchvision-install-ci -> origin/test-torchvision-install-ci 2025-03-14T05:31:37.9163288Z * [new branch] test/inductor -> origin/test/inductor 2025-03-14T05:31:37.9165355Z * [new branch] test_od_cudnn_bn_qat_fusion -> origin/test_od_cudnn_bn_qat_fusion 2025-03-14T05:31:37.9167107Z * [new branch] tidy_performance_cyy -> origin/tidy_performance_cyy 2025-03-14T05:31:37.9169114Z * [new branch] torch-abi-version -> origin/torch-abi-version 2025-03-14T05:31:37.9171072Z * [new branch] torchgen_ns -> origin/torchgen_ns 2025-03-14T05:31:37.9172920Z * [new branch] trace_fsdp_torchtune_lora -> origin/trace_fsdp_torchtune_lora 2025-03-14T05:31:37.9174702Z * [new branch] traceable_fsdp_unit_tests -> origin/traceable_fsdp_unit_tests 2025-03-14T05:31:37.9176635Z * [new branch] tree_loop_vec_base -> origin/tree_loop_vec_base 
2025-03-14T05:31:37.9178560Z * [new branch] tree_vec_base -> origin/tree_vec_base 2025-03-14T05:31:37.9180834Z * [new branch] triton-cpu-arm-expriment -> origin/triton-cpu-arm-expriment 2025-03-14T05:31:37.9182614Z * [new branch] triton-update -> origin/triton-update 2025-03-14T05:31:37.9184558Z * [new branch] triton_kernel -> origin/triton_kernel 2025-03-14T05:31:37.9186158Z * [new branch] triton_kernel_perf -> origin/triton_kernel_perf 2025-03-14T05:31:37.9187929Z * [new branch] try-speedup-docbuild -> origin/try-speedup-docbuild 2025-03-14T05:31:37.9189763Z * [new branch] type_dec -> origin/type_dec 2025-03-14T05:31:37.9191576Z * [new branch] unbreak_cpp_builder_clang -> origin/unbreak_cpp_builder_clang 2025-03-14T05:31:37.9194043Z * [new branch] update-audio-commit-hash/13210264744-1454-1 -> origin/update-audio-commit-hash/13210264744-1454-1 2025-03-14T05:31:37.9195882Z * [new branch] update-audio-commit-hash/13402729107-1466-1 -> origin/update-audio-commit-hash/13402729107-1466-1 2025-03-14T05:31:37.9198180Z * [new branch] update-executorch-commit-hash/12838938822-1425-1 -> origin/update-executorch-commit-hash/12838938822-1425-1 2025-03-14T05:31:37.9199754Z * [new branch] update-executorch-commit-hash/13319730828-1460-1 -> origin/update-executorch-commit-hash/13319730828-1460-1 2025-03-14T05:31:37.9201363Z * [new branch] update-executorch-commit-hash/13339750520-1461-1 -> origin/update-executorch-commit-hash/13339750520-1461-1 2025-03-14T05:31:37.9203090Z * [new branch] update-executorch-commit-hash/13349943940-1462-1 -> origin/update-executorch-commit-hash/13349943940-1462-1 2025-03-14T05:31:37.9204724Z * [new branch] update-executorch-commit-hash/13360269739-1463-1 -> origin/update-executorch-commit-hash/13360269739-1463-1 2025-03-14T05:31:37.9206647Z * [new branch] update-executorch-commit-hash/13380672687-1464-1 -> origin/update-executorch-commit-hash/13380672687-1464-1 2025-03-14T05:31:37.9208659Z * [new branch] update-executorch-commit-hash/13402729107-1466-1 -> origin/update-executorch-commit-hash/13402729107-1466-1 2025-03-14T05:31:37.9210999Z * [new branch] update-triton-commit-hash/13663274526-1487-2 -> origin/update-triton-commit-hash/13663274526-1487-2 2025-03-14T05:31:37.9213316Z * [new branch] update-vision-commit-hash/6210383723-710-1 -> origin/update-vision-commit-hash/6210383723-710-1 2025-03-14T05:31:37.9214963Z * [new branch] update-vision-commit-hash/6319671985-721-1 -> origin/update-vision-commit-hash/6319671985-721-1 2025-03-14T05:31:37.9216563Z * [new branch] update-vision-commit-hash/6345577305-723-1 -> origin/update-vision-commit-hash/6345577305-723-1 2025-03-14T05:31:37.9218204Z * [new branch] update-vision-commit-hash/6366568705-725-1 -> origin/update-vision-commit-hash/6366568705-725-1 2025-03-14T05:31:37.9219737Z * [new branch] update-vision-commit-hash/6386942932-727-1 -> origin/update-vision-commit-hash/6386942932-727-1 2025-03-14T05:31:37.9221516Z * [new branch] update-vision-commit-hash/6399845260-728-1 -> origin/update-vision-commit-hash/6399845260-728-1 2025-03-14T05:31:37.9223380Z * [new branch] update-vision-commit-hash/6412969951-729-1 -> origin/update-vision-commit-hash/6412969951-729-1 2025-03-14T05:31:37.9225476Z * [new branch] update-vision-commit-hash/6425844356-730-1 -> origin/update-vision-commit-hash/6425844356-730-1 2025-03-14T05:31:37.9227231Z * [new branch] update-vision-commit-hash/6463026337-734-1 -> origin/update-vision-commit-hash/6463026337-734-1 2025-03-14T05:31:37.9229043Z * [new branch] update-vision-commit-hash/6489506557-736-1 
-> origin/update-vision-commit-hash/6489506557-736-1 2025-03-14T05:31:37.9230694Z * [new branch] update-vision-commit-hash/6520762621-739-1 -> origin/update-vision-commit-hash/6520762621-739-1 2025-03-14T05:31:37.9232544Z * [new branch] update-vision-commit-hash/6581672893-744-1 -> origin/update-vision-commit-hash/6581672893-744-1 2025-03-14T05:31:37.9234307Z * [new branch] update-vision-commit-hash/6593929043-745-1 -> origin/update-vision-commit-hash/6593929043-745-1 2025-03-14T05:31:37.9236033Z * [new branch] update-vision-commit-hash/6634009725-750-1 -> origin/update-vision-commit-hash/6634009725-750-1 2025-03-14T05:31:37.9237705Z * [new branch] update-vision-commit-hash/6673463792-754-1 -> origin/update-vision-commit-hash/6673463792-754-1 2025-03-14T05:31:37.9239439Z * [new branch] update-vision-commit-hash/6700258936-758-1 -> origin/update-vision-commit-hash/6700258936-758-1 2025-03-14T05:31:37.9241046Z * [new branch] update-vision-commit-hash/6805589684-770-1 -> origin/update-vision-commit-hash/6805589684-770-1 2025-03-14T05:31:37.9242711Z * [new branch] update-vision-commit-hash/6818989957-773-1 -> origin/update-vision-commit-hash/6818989957-773-1 2025-03-14T05:31:37.9244467Z * [new branch] update-vision-commit-hash/6830864778-774-1 -> origin/update-vision-commit-hash/6830864778-774-1 2025-03-14T05:31:37.9246145Z * [new branch] update-vision-commit-hash/6857388096-777-1 -> origin/update-vision-commit-hash/6857388096-777-1 2025-03-14T05:31:37.9247860Z * [new branch] update-vision-commit-hash/6871122584-778-1 -> origin/update-vision-commit-hash/6871122584-778-1 2025-03-14T05:31:37.9249516Z * [new branch] update-vision-commit-hash/6884505667-779-1 -> origin/update-vision-commit-hash/6884505667-779-1 2025-03-14T05:31:37.9251201Z * [new branch] update-vision-commit-hash/9010274985-1089-1 -> origin/update-vision-commit-hash/9010274985-1089-1 2025-03-14T05:31:37.9253539Z * [new branch] update-xla-commit-hash/10140112669-125-1 -> origin/update-xla-commit-hash/10140112669-125-1 2025-03-14T05:31:37.9255238Z * [new branch] update-xla-commit-hash/6219563710-79-1 -> origin/update-xla-commit-hash/6219563710-79-1 2025-03-14T05:31:37.9256840Z * [new branch] update-xla-commit-hash/6296332542-80-1 -> origin/update-xla-commit-hash/6296332542-80-1 2025-03-14T05:31:37.9258433Z * [new branch] update-xla-commit-hash/6377302016-81-1 -> origin/update-xla-commit-hash/6377302016-81-1 2025-03-14T05:31:37.9259994Z * [new branch] update-xla-commit-hash/6453689944-82-1 -> origin/update-xla-commit-hash/6453689944-82-1 2025-03-14T05:31:37.9262062Z * [new branch] update-xla-commit-hash/6530489691-83-1 -> origin/update-xla-commit-hash/6530489691-83-1 2025-03-14T05:31:37.9264006Z * [new branch] update-xla-commit-hash/6610159969-84-1 -> origin/update-xla-commit-hash/6610159969-84-1 2025-03-14T05:31:37.9266037Z * [new branch] update-xla-commit-hash/6689695021-85-1 -> origin/update-xla-commit-hash/6689695021-85-1 2025-03-14T05:31:37.9268378Z * [new branch] update-xla-commit-hash/6767672412-86-1 -> origin/update-xla-commit-hash/6767672412-86-1 2025-03-14T05:31:37.9270399Z * [new branch] update-xla-commit-hash/6846986487-87-1 -> origin/update-xla-commit-hash/6846986487-87-1 2025-03-14T05:31:37.9272017Z * [new branch] update_docs_torch_multinomial_issue#125388 -> origin/update_docs_torch_multinomial_issue#125388 2025-03-14T05:31:37.9273780Z * [new branch] update_kineto_0212_3 -> origin/update_kineto_0212_3 2025-03-14T05:31:37.9275730Z * [new branch] update_kineto_0214 -> origin/update_kineto_0214 
2025-03-14T05:31:37.9277624Z * [new branch] update_slow_tests_1722488736 -> origin/update_slow_tests_1722488736 2025-03-14T05:31:37.9279412Z * [new branch] update_slow_tests_1722879173 -> origin/update_slow_tests_1722879173 2025-03-14T05:31:37.9281191Z * [new branch] update_slow_tests_1739173241 -> origin/update_slow_tests_1739173241 2025-03-14T05:31:37.9282901Z * [new branch] update_slow_tests_1739777990 -> origin/update_slow_tests_1739777990 2025-03-14T05:31:37.9285009Z * [new branch] update_slow_tests_1740382789 -> origin/update_slow_tests_1740382789 2025-03-14T05:31:37.9286942Z * [new branch] update_slow_tests_1741592409 -> origin/update_slow_tests_1741592409 2025-03-14T05:31:37.9288701Z * [new branch] update_submodule_FBGEMM -> origin/update_submodule_FBGEMM 2025-03-14T05:31:37.9290497Z * [new branch] update_submodule_kineto -> origin/update_submodule_kineto 2025-03-14T05:31:37.9299591Z * [new branch] use-better-label-for-dcp -> origin/use-better-label-for-dcp 2025-03-14T05:31:37.9300220Z * [new branch] v0.1.2 -> origin/v0.1.2 2025-03-14T05:31:37.9300687Z * [new branch] v1.0.1 -> origin/v1.0.1 2025-03-14T05:31:37.9301139Z * [new branch] v1.0.3 -> origin/v1.0.3 2025-03-14T05:31:37.9301581Z * [new branch] v1.1.0 -> origin/v1.1.0 2025-03-14T05:31:37.9302122Z * [new branch] v1.2.0 -> origin/v1.2.0 2025-03-14T05:31:37.9303985Z * [new branch] v1.3.0 -> origin/v1.3.0 2025-03-14T05:31:37.9306748Z * [new branch] v1.3.1 -> origin/v1.3.1 2025-03-14T05:31:37.9308147Z * [new branch] validate_fn -> origin/validate_fn 2025-03-14T05:31:37.9310969Z * [new branch] validations_2.6 -> origin/validations_2.6 2025-03-14T05:31:37.9312377Z * [new branch] vfdev-5-patch-2 -> origin/vfdev-5-patch-2 2025-03-14T05:31:37.9315512Z * [new branch] viable/strict -> origin/viable/strict 2025-03-14T05:31:37.9317381Z * [new branch] voz/fsdp_autograd2 -> origin/voz/fsdp_autograd2 2025-03-14T05:31:37.9318912Z * [new branch] voz/fsdp_autograd4 -> origin/voz/fsdp_autograd4 2025-03-14T05:31:37.9320546Z * [new branch] voz/fsdp_autograd_merge -> origin/voz/fsdp_autograd_merge 2025-03-14T05:31:37.9322128Z * [new branch] voz/fsdp_autograd_merge2 -> origin/voz/fsdp_autograd_merge2 2025-03-14T05:31:37.9323533Z * [new branch] voz/serde2 -> origin/voz/serde2 2025-03-14T05:31:37.9325800Z * [new branch] voz/soft_fork_autograd_fsdp -> origin/voz/soft_fork_autograd_fsdp 2025-03-14T05:31:37.9327581Z * [new branch] wdvr/add_boto3 -> origin/wdvr/add_boto3 2025-03-14T05:31:37.9329242Z * [new branch] wdvr/iss145259_alt -> origin/wdvr/iss145259_alt 2025-03-14T05:31:37.9330767Z * [new branch] wdvr/iss_145259 -> origin/wdvr/iss_145259 2025-03-14T05:31:37.9332423Z * [new branch] wdvr/sccache_nvcc -> origin/wdvr/sccache_nvcc 2025-03-14T05:31:37.9334069Z * [new branch] wdvr/sccache_simplified -> origin/wdvr/sccache_simplified 2025-03-14T05:31:37.9335834Z * [new branch] wdvr/xpu_sccache_fix -> origin/wdvr/xpu_sccache_fix 2025-03-14T05:31:37.9338660Z * [new branch] whc/flight -> origin/whc/flight 2025-03-14T05:31:37.9340138Z * [new branch] whc/flight4 -> origin/whc/flight4 2025-03-14T05:31:37.9341707Z * [new branch] whc/flight51 -> origin/whc/flight51 2025-03-14T05:31:37.9343401Z * [new branch] whc/flight53 -> origin/whc/flight53 2025-03-14T05:31:37.9345106Z * [new branch] whc/flight_full -> origin/whc/flight_full 2025-03-14T05:31:37.9346747Z * [new branch] whc/flightbase -> origin/whc/flightbase 2025-03-14T05:31:37.9348330Z * [new branch] whc/p2phang -> origin/whc/p2phang 2025-03-14T05:31:37.9350071Z * [new branch] whc/stage2 -> origin/whc/stage2 
2025-03-14T05:31:37.9352956Z * [new branch] xmfan/ca_5a2be192d1 -> origin/xmfan/ca_5a2be192d1 2025-03-14T05:31:37.9354036Z * [new branch] xmfan/ca_api -> origin/xmfan/ca_api 2025-03-14T05:31:37.9355805Z * [new branch] xmfan/ca_base -> origin/xmfan/ca_base 2025-03-14T05:31:37.9357466Z * [new branch] xmfan/ca_cudagraphs -> origin/xmfan/ca_cudagraphs 2025-03-14T05:31:37.9359002Z * [new branch] xmfan/ca_dynamic -> origin/xmfan/ca_dynamic 2025-03-14T05:31:37.9360567Z * [new branch] xmfan/ca_fix_dyn -> origin/xmfan/ca_fix_dyn 2025-03-14T05:31:37.9362019Z * [new branch] xmfan/ca_jan3 -> origin/xmfan/ca_jan3 2025-03-14T05:31:37.9363716Z * [new branch] xmfan/ca_jun18 -> origin/xmfan/ca_jun18 2025-03-14T05:31:37.9365353Z * [new branch] xmfan/ca_jun24 -> origin/xmfan/ca_jun24 2025-03-14T05:31:37.9368016Z * [new branch] xmfan/ca_mem_base -> origin/xmfan/ca_mem_base 2025-03-14T05:31:37.9370237Z * [new branch] xmfan/ca_mem_fix -> origin/xmfan/ca_mem_fix 2025-03-14T05:31:37.9371627Z * [new branch] xmfan/ca_memory_fix -> origin/xmfan/ca_memory_fix 2025-03-14T05:31:37.9373421Z * [new branch] xmfan/ca_memory_fix_rebased -> origin/xmfan/ca_memory_fix_rebased 2025-03-14T05:31:37.9375211Z * [new branch] xmfan/ca_memory_fix_rebased2 -> origin/xmfan/ca_memory_fix_rebased2 2025-03-14T05:31:37.9376844Z * [new branch] xmfan/ca_move_to_cuda -> origin/xmfan/ca_move_to_cuda 2025-03-14T05:31:37.9378537Z * [new branch] xmfan/ca_overhead -> origin/xmfan/ca_overhead 2025-03-14T05:31:37.9381347Z * [new branch] xmfan/ca_overhead_0eba7e5451 -> origin/xmfan/ca_overhead_0eba7e5451 2025-03-14T05:31:37.9382510Z * [new branch] xmfan/ca_scalar -> origin/xmfan/ca_scalar 2025-03-14T05:31:37.9384234Z * [new branch] xmfan/ca_subclass_mem_fix -> origin/xmfan/ca_subclass_mem_fix 2025-03-14T05:31:37.9385936Z * [new branch] xmfan/ca_warm_mem -> origin/xmfan/ca_warm_mem 2025-03-14T05:31:37.9388112Z * [new branch] xmfan/ca_warm_mem_base -> origin/xmfan/ca_warm_mem_base 2025-03-14T05:31:37.9389450Z * [new branch] xmfan/cacu_jun18 -> origin/xmfan/cacu_jun18 2025-03-14T05:31:37.9391029Z * [new branch] xmfan/cacu_jun19 -> origin/xmfan/cacu_jun19 2025-03-14T05:31:37.9393339Z * [new branch] xmfan/cacu_jun4 -> origin/xmfan/cacu_jun4 2025-03-14T05:31:37.9394872Z * [new branch] xmfan/cacu_may27 -> origin/xmfan/cacu_may27 2025-03-14T05:31:37.9396478Z * [new branch] xmfan/circular_dep -> origin/xmfan/circular_dep 2025-03-14T05:31:37.9398274Z * [new branch] xmfan/compiled_autograd_bench -> origin/xmfan/compiled_autograd_bench 2025-03-14T05:31:37.9400324Z * [new branch] xmfan/compiled_autograd_bench_base -> origin/xmfan/compiled_autograd_bench_base 2025-03-14T05:31:37.9401806Z * [new branch] xmfan/compiled_autograd_benchmark -> origin/xmfan/compiled_autograd_benchmark 2025-03-14T05:31:37.9403503Z * [new branch] xmfan/compiled_autograd_ddp -> origin/xmfan/compiled_autograd_ddp 2025-03-14T05:31:37.9405241Z * [new branch] xmfan/compiled_autograd_feb_29 -> origin/xmfan/compiled_autograd_feb_29 2025-03-14T05:31:37.9407017Z * [new branch] xmfan/compiled_autograd_graph_breaks -> origin/xmfan/compiled_autograd_graph_breaks 2025-03-14T05:31:37.9408742Z * [new branch] xmfan/compiled_autograd_hud -> origin/xmfan/compiled_autograd_hud 2025-03-14T05:31:37.9410536Z * [new branch] xmfan/compiled_autograd_hypothetical_perf -> origin/xmfan/compiled_autograd_hypothetical_perf 2025-03-14T05:31:37.9412205Z * [new branch] xmfan/compiled_autograd_perf_no_reuse -> origin/xmfan/compiled_autograd_perf_no_reuse 2025-03-14T05:31:37.9413753Z * [new branch] xmfan/disable_duck_shape 
-> origin/xmfan/disable_duck_shape 2025-03-14T05:31:37.9415589Z * [new branch] xmfan/distributed_torchbench -> origin/xmfan/distributed_torchbench 2025-03-14T05:31:37.9417228Z * [new branch] xmfan/fca_cpp_node_passthrough -> origin/xmfan/fca_cpp_node_passthrough 2025-03-14T05:31:37.9418903Z * [new branch] xmfan/feb_10_compiled_autograd -> origin/xmfan/feb_10_compiled_autograd 2025-03-14T05:31:37.9420919Z * [new branch] xmfan/feb_10_compiled_autograd_cudagraph -> origin/xmfan/feb_10_compiled_autograd_cudagraph 2025-03-14T05:31:37.9422453Z * [new branch] xmfan/fsdp_wraps -> origin/xmfan/fsdp_wraps 2025-03-14T05:31:37.9424215Z * [new branch] xmfan/issue_123374 -> origin/xmfan/issue_123374 2025-03-14T05:31:37.9426060Z * [new branch] xmfan/oss_benchmark_script -> origin/xmfan/oss_benchmark_script 2025-03-14T05:31:37.9427725Z * [new branch] xmfan/rename_nanogpt -> origin/xmfan/rename_nanogpt 2025-03-14T05:31:37.9430451Z * [new branch] xmfan/retains_grad_hooks -> origin/xmfan/retains_grad_hooks 2025-03-14T05:31:37.9431718Z * [new branch] xmfan/segfault_test -> origin/xmfan/segfault_test 2025-03-14T05:31:37.9433356Z * [new branch] xmfan/single_step -> origin/xmfan/single_step 2025-03-14T05:31:37.9435140Z * [new branch] xmfan/sth_0829 -> origin/xmfan/sth_0829 2025-03-14T05:31:37.9437523Z * [new branch] xmfan/test -> origin/xmfan/test 2025-03-14T05:31:37.9438816Z * [new branch] xmfan/yolov3_oom -> origin/xmfan/yolov3_oom 2025-03-14T05:31:37.9441639Z * [new branch] yguo/debug-0226-constexpr -> origin/yguo/debug-0226-constexpr 2025-03-14T05:31:37.9442711Z * [new branch] yguo/fix-remaining-cpp-wrapper -> origin/yguo/fix-remaining-cpp-wrapper 2025-03-14T05:31:37.9444321Z * [new branch] yguo/new_latest_changes -> origin/yguo/new_latest_changes 2025-03-14T05:31:37.9446008Z * [new branch] yguo/patch_constexpr_changes -> origin/yguo/patch_constexpr_changes 2025-03-14T05:31:37.9448738Z * [new branch] yguo/repro-segfault-triton-aoti-cpp-wrapper -> origin/yguo/repro-segfault-triton-aoti-cpp-wrapper 2025-03-14T05:31:37.9450129Z * [new branch] yihan_quantization -> origin/yihan_quantization 2025-03-14T05:31:37.9452329Z * [new branch] yiming/bootcamp -> origin/yiming/bootcamp 2025-03-14T05:31:37.9454721Z * [new branch] zainr/canary-test -> origin/zainr/canary-test 2025-03-14T05:31:37.9456031Z * [new branch] zainr/historical-correlation-fix -> origin/zainr/historical-correlation-fix 2025-03-14T05:31:37.9457768Z * [new branch] zainr/lint-fix -> origin/zainr/lint-fix 2025-03-14T05:31:37.9459886Z * [new branch] zainr/make-unstable -> origin/zainr/make-unstable 2025-03-14T05:31:37.9461032Z * [new branch] zainr/metrics-job-id -> origin/zainr/metrics-job-id 2025-03-14T05:31:37.9462865Z * [new branch] zainr/metrics-pr -> origin/zainr/metrics-pr 2025-03-14T05:31:37.9464168Z * [new branch] zainr/mypy-break-test -> origin/zainr/mypy-break-test 2025-03-14T05:31:37.9466454Z * [new branch] zainr/mypy-break-test2 -> origin/zainr/mypy-break-test2 2025-03-14T05:31:37.9468748Z * [new branch] zainr/mypy-break-test3 -> origin/zainr/mypy-break-test3 2025-03-14T05:31:37.9470580Z * [new branch] zainr/mypy-update -> origin/zainr/mypy-update 2025-03-14T05:31:37.9472294Z * [new branch] zainr/pull-migration-c -> origin/zainr/pull-migration-c 2025-03-14T05:31:37.9474315Z * [new branch] zainr/revert-60576419a2a-make-dynamic -> origin/zainr/revert-60576419a2a-make-dynamic 2025-03-14T05:31:37.9475653Z * [new branch] zainr/sha-checking -> origin/zainr/sha-checking 2025-03-14T05:31:37.9477716Z * [new branch] zainr/td-baseline-stats -> 
origin/zainr/td-baseline-stats 2025-03-14T05:31:37.9479904Z * [new branch] zainr/td-class -> origin/zainr/td-class 2025-03-14T05:31:37.9481603Z * [new branch] zainr/td-class-metrics -> origin/zainr/td-class-metrics 2025-03-14T05:31:37.9482864Z * [new branch] zainr/td-downgrade -> origin/zainr/td-downgrade 2025-03-14T05:31:37.9484855Z * [new branch] zainr/td-file-pass -> origin/zainr/td-file-pass 2025-03-14T05:31:37.9486623Z * [new branch] zainr/td-metrics-v2 -> origin/zainr/td-metrics-v2 2025-03-14T05:31:37.9488337Z * [new branch] zainr/td-pass-class-times -> origin/zainr/td-pass-class-times 2025-03-14T05:31:37.9489804Z * [new branch] zainr/td-shard-info -> origin/zainr/td-shard-info 2025-03-14T05:31:37.9491632Z * [new branch] zainr/td-trial -> origin/zainr/td-trial 2025-03-14T05:31:37.9493401Z * [new branch] zainr/unstable -> origin/zainr/unstable 2025-03-14T05:31:37.9495892Z * [new branch] zainrizvi/testing1 -> origin/zainrizvi/testing1 2025-03-14T05:31:37.9497840Z * [new branch] zasdfgbnm-patch-3 -> origin/zasdfgbnm-patch-3 2025-03-14T05:31:37.9499613Z * [new branch] zb2p -> origin/zb2p 2025-03-14T05:31:37.9501703Z * [new branch] zdevito-patch-1 -> origin/zdevito-patch-1 2025-03-14T05:31:37.9503574Z * [new branch] zdevito-patch-2 -> origin/zdevito-patch-2 2025-03-14T05:31:37.9505404Z * [new branch] zeros-and-scatter-part2 -> origin/zeros-and-scatter-part2 2025-03-14T05:31:37.9508168Z * [new branch] zhxchen17/scratch/0 -> origin/zhxchen17/scratch/0 2025-03-14T05:31:37.9510575Z * [new branch] zhxchen17/sticky_cache/0 -> origin/zhxchen17/sticky_cache/0 2025-03-14T05:31:37.9511775Z * [new tag] bc2caa7fdf006894eff7af936babde69ab5a40f8-huydhn-debug -> bc2caa7fdf006894eff7af936babde69ab5a40f8-huydhn-debug 2025-03-14T05:31:37.9513173Z * [new tag] ci/binaries/77164 -> ci/binaries/77164 2025-03-14T05:31:37.9515316Z * [new tag] ciflow/all/70978 -> ciflow/all/70978 2025-03-14T05:31:37.9516323Z * [new tag] ciflow/all/70979 -> ciflow/all/70979 2025-03-14T05:31:37.9518009Z * [new tag] ciflow/all/70989 -> ciflow/all/70989 2025-03-14T05:31:37.9519245Z * [new tag] ciflow/binaries/120076 -> ciflow/binaries/120076 2025-03-14T05:31:37.9520418Z * [new tag] ciflow/binaries/138996 -> ciflow/binaries/138996 2025-03-14T05:31:37.9522255Z * [new tag] ciflow/binaries/143416 -> ciflow/binaries/143416 2025-03-14T05:31:37.9523030Z * [new tag] ciflow/binaries/144127 -> ciflow/binaries/144127 2025-03-14T05:31:37.9524110Z * [new tag] ciflow/binaries/145119 -> ciflow/binaries/145119 2025-03-14T05:31:37.9525789Z * [new tag] ciflow/binaries/145224 -> ciflow/binaries/145224 2025-03-14T05:31:37.9526811Z * [new tag] ciflow/binaries/146717 -> ciflow/binaries/146717 2025-03-14T05:31:37.9528003Z * [new tag] ciflow/binaries/147498 -> ciflow/binaries/147498 2025-03-14T05:31:37.9529578Z * [new tag] ciflow/binaries/147664 -> ciflow/binaries/147664 2025-03-14T05:31:37.9530514Z * [new tag] ciflow/binaries/147917 -> ciflow/binaries/147917 2025-03-14T05:31:37.9531686Z * [new tag] ciflow/binaries/148163 -> ciflow/binaries/148163 2025-03-14T05:31:37.9533736Z * [new tag] ciflow/binaries/148173 -> ciflow/binaries/148173 2025-03-14T05:31:37.9535039Z * [new tag] ciflow/binaries_wheel/138834 -> ciflow/binaries_wheel/138834 2025-03-14T05:31:37.9536078Z * [new tag] ciflow/binaries_wheel/143388 -> ciflow/binaries_wheel/143388 2025-03-14T05:31:37.9537471Z * [new tag] ciflow/binaries_wheel/144049 -> ciflow/binaries_wheel/144049 2025-03-14T05:31:37.9538483Z * [new tag] ciflow/binaries_wheel/146055 -> ciflow/binaries_wheel/146055 
2025-03-14T05:31:37.9539719Z * [new tag] ciflow/binaries_wheel/146573 -> ciflow/binaries_wheel/146573 2025-03-14T05:31:37.9541370Z * [new tag] ciflow/binaries_wheel/147074 -> ciflow/binaries_wheel/147074 2025-03-14T05:31:37.9542315Z * [new tag] ciflow/binaries_wheel/147455 -> ciflow/binaries_wheel/147455 2025-03-14T05:31:37.9543587Z * [new tag] ciflow/binaries_wheel/148070 -> ciflow/binaries_wheel/148070 2025-03-14T05:31:37.9545320Z * [new tag] ciflow/binaries_wheel/148197 -> ciflow/binaries_wheel/148197 2025-03-14T05:31:37.9546232Z * [new tag] ciflow/binaries_wheel/148320 -> ciflow/binaries_wheel/148320 2025-03-14T05:31:37.9547828Z * [new tag] ciflow/cuda/70978 -> ciflow/cuda/70978 2025-03-14T05:31:37.9548635Z * [new tag] ciflow/cuda/70979 -> ciflow/cuda/70979 2025-03-14T05:31:37.9549702Z * [new tag] ciflow/cuda/70989 -> ciflow/cuda/70989 2025-03-14T05:31:37.9551564Z * [new tag] ciflow/inductor-micro-benchmark/141910 -> ciflow/inductor-micro-benchmark/141910 2025-03-14T05:31:37.9552895Z * [new tag] ciflow/inductor-perf-test-nightly-rocm/148672 -> ciflow/inductor-perf-test-nightly-rocm/148672 2025-03-14T05:31:37.9554167Z * [new tag] ciflow/inductor-perf-test-nightly-rocm/149039 -> ciflow/inductor-perf-test-nightly-rocm/149039 2025-03-14T05:31:37.9555580Z * [new tag] ciflow/inductor-periodic/145612 -> ciflow/inductor-periodic/145612 2025-03-14T05:31:37.9556898Z * [new tag] ciflow/inductor-periodic/147315 -> ciflow/inductor-periodic/147315 2025-03-14T05:31:37.9558709Z * [new tag] ciflow/inductor-rocm/140989 -> ciflow/inductor-rocm/140989 2025-03-14T05:31:37.9559752Z * [new tag] ciflow/inductor-rocm/141309 -> ciflow/inductor-rocm/141309 2025-03-14T05:31:37.9561400Z * [new tag] ciflow/inductor-rocm/146264 -> ciflow/inductor-rocm/146264 2025-03-14T05:31:37.9562448Z * [new tag] ciflow/inductor-rocm/146903 -> ciflow/inductor-rocm/146903 2025-03-14T05:31:37.9563489Z * [new tag] ciflow/inductor-rocm/147315 -> ciflow/inductor-rocm/147315 2025-03-14T05:31:37.9565258Z * [new tag] ciflow/inductor-rocm/147452 -> ciflow/inductor-rocm/147452 2025-03-14T05:31:37.9566464Z * [new tag] ciflow/inductor-rocm/147583 -> ciflow/inductor-rocm/147583 2025-03-14T05:31:37.9567675Z * [new tag] ciflow/inductor-rocm/148327 -> ciflow/inductor-rocm/148327 2025-03-14T05:31:37.9569175Z * [new tag] ciflow/inductor-rocm/149041 -> ciflow/inductor-rocm/149041 2025-03-14T05:31:37.9571145Z * [new tag] ciflow/inductor/110155 -> ciflow/inductor/110155 2025-03-14T05:31:37.9572069Z * [new tag] ciflow/inductor/113257 -> ciflow/inductor/113257 2025-03-14T05:31:37.9573187Z * [new tag] ciflow/inductor/119496 -> ciflow/inductor/119496 2025-03-14T05:31:37.9574256Z * [new tag] ciflow/inductor/119977 -> ciflow/inductor/119977 2025-03-14T05:31:37.9575377Z * [new tag] ciflow/inductor/120076 -> ciflow/inductor/120076 2025-03-14T05:31:37.9576487Z * [new tag] ciflow/inductor/121445 -> ciflow/inductor/121445 2025-03-14T05:31:37.9577672Z * [new tag] ciflow/inductor/124490 -> ciflow/inductor/124490 2025-03-14T05:31:37.9578744Z * [new tag] ciflow/inductor/125270 -> ciflow/inductor/125270 2025-03-14T05:31:37.9579869Z * [new tag] ciflow/inductor/125326 -> ciflow/inductor/125326 2025-03-14T05:31:37.9581000Z * [new tag] ciflow/inductor/125428 -> ciflow/inductor/125428 2025-03-14T05:31:37.9582117Z * [new tag] ciflow/inductor/125806 -> ciflow/inductor/125806 2025-03-14T05:31:37.9584408Z * [new tag] ciflow/inductor/125888 -> ciflow/inductor/125888 2025-03-14T05:31:37.9586097Z * [new tag] ciflow/inductor/125995 -> ciflow/inductor/125995 
2025-03-14T05:31:37.9587195Z * [new tag] ciflow/inductor/126348 -> ciflow/inductor/126348 2025-03-14T05:31:37.9588832Z * [new tag] ciflow/inductor/127171 -> ciflow/inductor/127171 2025-03-14T05:31:37.9589735Z * [new tag] ciflow/inductor/127293 -> ciflow/inductor/127293 2025-03-14T05:31:37.9590908Z * [new tag] ciflow/inductor/127294 -> ciflow/inductor/127294 2025-03-14T05:31:37.9592613Z * [new tag] ciflow/inductor/129352 -> ciflow/inductor/129352 2025-03-14T05:31:37.9593540Z * [new tag] ciflow/inductor/129420 -> ciflow/inductor/129420 2025-03-14T05:31:37.9594790Z * [new tag] ciflow/inductor/130141 -> ciflow/inductor/130141 2025-03-14T05:31:37.9596421Z * [new tag] ciflow/inductor/130499 -> ciflow/inductor/130499 2025-03-14T05:31:37.9597287Z * [new tag] ciflow/inductor/130887 -> ciflow/inductor/130887 2025-03-14T05:31:37.9598436Z * [new tag] ciflow/inductor/131354 -> ciflow/inductor/131354 2025-03-14T05:31:37.9599567Z * [new tag] ciflow/inductor/132021 -> ciflow/inductor/132021 2025-03-14T05:31:37.9601185Z * [new tag] ciflow/inductor/132414 -> ciflow/inductor/132414 2025-03-14T05:31:37.9602068Z * [new tag] ciflow/inductor/133044 -> ciflow/inductor/133044 2025-03-14T05:31:37.9603150Z * [new tag] ciflow/inductor/133121 -> ciflow/inductor/133121 2025-03-14T05:31:37.9604299Z * [new tag] ciflow/inductor/133287 -> ciflow/inductor/133287 2025-03-14T05:31:37.9605472Z * [new tag] ciflow/inductor/133289 -> ciflow/inductor/133289 2025-03-14T05:31:37.9606633Z * [new tag] ciflow/inductor/133296 -> ciflow/inductor/133296 2025-03-14T05:31:37.9607834Z * [new tag] ciflow/inductor/133297 -> ciflow/inductor/133297 2025-03-14T05:31:37.9608986Z * [new tag] ciflow/inductor/133315 -> ciflow/inductor/133315 2025-03-14T05:31:37.9610143Z * [new tag] ciflow/inductor/133392 -> ciflow/inductor/133392 2025-03-14T05:31:37.9611294Z * [new tag] ciflow/inductor/133419 -> ciflow/inductor/133419 2025-03-14T05:31:37.9612614Z * [new tag] ciflow/inductor/133423 -> ciflow/inductor/133423 2025-03-14T05:31:37.9613630Z * [new tag] ciflow/inductor/133667 -> ciflow/inductor/133667 2025-03-14T05:31:37.9614773Z * [new tag] ciflow/inductor/133753 -> ciflow/inductor/133753 2025-03-14T05:31:37.9616537Z * [new tag] ciflow/inductor/134592 -> ciflow/inductor/134592 2025-03-14T05:31:37.9617392Z * [new tag] ciflow/inductor/134681 -> ciflow/inductor/134681 2025-03-14T05:31:37.9618510Z * [new tag] ciflow/inductor/135708 -> ciflow/inductor/135708 2025-03-14T05:31:37.9619708Z * [new tag] ciflow/inductor/135792 -> ciflow/inductor/135792 2025-03-14T05:31:37.9620862Z * [new tag] ciflow/inductor/136355 -> ciflow/inductor/136355 2025-03-14T05:31:37.9622020Z * [new tag] ciflow/inductor/136702 -> ciflow/inductor/136702 2025-03-14T05:31:37.9623192Z * [new tag] ciflow/inductor/137400 -> ciflow/inductor/137400 2025-03-14T05:31:37.9624365Z * [new tag] ciflow/inductor/137568 -> ciflow/inductor/137568 2025-03-14T05:31:37.9625557Z * [new tag] ciflow/inductor/137583 -> ciflow/inductor/137583 2025-03-14T05:31:37.9627303Z * [new tag] ciflow/inductor/137846 -> ciflow/inductor/137846 2025-03-14T05:31:37.9628151Z * [new tag] ciflow/inductor/137884 -> ciflow/inductor/137884 2025-03-14T05:31:37.9629287Z * [new tag] ciflow/inductor/138185 -> ciflow/inductor/138185 2025-03-14T05:31:37.9630843Z * [new tag] ciflow/inductor/138202 -> ciflow/inductor/138202 2025-03-14T05:31:37.9631714Z * [new tag] ciflow/inductor/138214 -> ciflow/inductor/138214 2025-03-14T05:31:37.9633301Z * [new tag] ciflow/inductor/138388 -> ciflow/inductor/138388 2025-03-14T05:31:37.9634197Z * [new tag] 
ciflow/inductor/138513 -> ciflow/inductor/138513 2025-03-14T05:31:37.9636408Z * [new tag] ciflow/inductor/138519 -> ciflow/inductor/138519 2025-03-14T05:31:37.9637385Z * [new tag] ciflow/inductor/138555 -> ciflow/inductor/138555 2025-03-14T05:31:37.9639067Z * [new tag] ciflow/inductor/138626 -> ciflow/inductor/138626 2025-03-14T05:31:37.9640096Z * [new tag] ciflow/inductor/139094 -> ciflow/inductor/139094 2025-03-14T05:31:37.9641299Z * [new tag] ciflow/inductor/139271 -> ciflow/inductor/139271 2025-03-14T05:31:37.9642473Z * [new tag] ciflow/inductor/139561 -> ciflow/inductor/139561 2025-03-14T05:31:37.9643673Z * [new tag] ciflow/inductor/139975 -> ciflow/inductor/139975 2025-03-14T05:31:37.9645300Z * [new tag] ciflow/inductor/140032 -> ciflow/inductor/140032 2025-03-14T05:31:37.9646474Z * [new tag] ciflow/inductor/140159 -> ciflow/inductor/140159 2025-03-14T05:31:37.9647591Z * [new tag] ciflow/inductor/140756 -> ciflow/inductor/140756 2025-03-14T05:31:37.9649276Z * [new tag] ciflow/inductor/140979 -> ciflow/inductor/140979 2025-03-14T05:31:37.9650180Z * [new tag] ciflow/inductor/141096 -> ciflow/inductor/141096 2025-03-14T05:31:37.9651380Z * [new tag] ciflow/inductor/141097 -> ciflow/inductor/141097 2025-03-14T05:31:37.9652530Z * [new tag] ciflow/inductor/141309 -> ciflow/inductor/141309 2025-03-14T05:31:37.9653726Z * [new tag] ciflow/inductor/141641 -> ciflow/inductor/141641 2025-03-14T05:31:37.9654978Z * [new tag] ciflow/inductor/141684 -> ciflow/inductor/141684 2025-03-14T05:31:37.9656581Z * [new tag] ciflow/inductor/141700 -> ciflow/inductor/141700 2025-03-14T05:31:37.9657716Z * [new tag] ciflow/inductor/141730 -> ciflow/inductor/141730 2025-03-14T05:31:37.9658741Z * [new tag] ciflow/inductor/141842 -> ciflow/inductor/141842 2025-03-14T05:31:37.9659941Z * [new tag] ciflow/inductor/141940 -> ciflow/inductor/141940 2025-03-14T05:31:37.9661134Z * [new tag] ciflow/inductor/141944 -> ciflow/inductor/141944 2025-03-14T05:31:37.9662324Z * [new tag] ciflow/inductor/141961 -> ciflow/inductor/141961 2025-03-14T05:31:37.9664168Z * [new tag] ciflow/inductor/142272 -> ciflow/inductor/142272 2025-03-14T05:31:37.9665142Z * [new tag] ciflow/inductor/142295 -> ciflow/inductor/142295 2025-03-14T05:31:37.9666735Z * [new tag] ciflow/inductor/142309 -> ciflow/inductor/142309 2025-03-14T05:31:37.9667621Z * [new tag] ciflow/inductor/142372 -> ciflow/inductor/142372 2025-03-14T05:31:37.9670695Z * [new tag] ciflow/inductor/142851 -> ciflow/inductor/142851 2025-03-14T05:31:37.9671643Z * [new tag] ciflow/inductor/143103 -> ciflow/inductor/143103 2025-03-14T05:31:37.9673291Z * [new tag] ciflow/inductor/143256 -> ciflow/inductor/143256 2025-03-14T05:31:37.9674171Z * [new tag] ciflow/inductor/143275 -> ciflow/inductor/143275 2025-03-14T05:31:37.9675833Z * [new tag] ciflow/inductor/143313 -> ciflow/inductor/143313 2025-03-14T05:31:37.9676738Z * [new tag] ciflow/inductor/143411 -> ciflow/inductor/143411 2025-03-14T05:31:37.9677901Z * [new tag] ciflow/inductor/143457 -> ciflow/inductor/143457 2025-03-14T05:31:37.9679774Z * [new tag] ciflow/inductor/143464 -> ciflow/inductor/143464 2025-03-14T05:31:37.9680790Z * [new tag] ciflow/inductor/143475 -> ciflow/inductor/143475 2025-03-14T05:31:37.9682377Z * [new tag] ciflow/inductor/143525 -> ciflow/inductor/143525 2025-03-14T05:31:37.9683525Z * [new tag] ciflow/inductor/143527 -> ciflow/inductor/143527 2025-03-14T05:31:37.9684671Z * [new tag] ciflow/inductor/143533 -> ciflow/inductor/143533 2025-03-14T05:31:37.9685863Z * [new tag] ciflow/inductor/143534 -> 
ciflow/inductor/143534 2025-03-14T05:31:37.9687531Z * [new tag] ciflow/inductor/143544 -> ciflow/inductor/143544 2025-03-14T05:31:37.9688743Z * [new tag] ciflow/inductor/143666 -> ciflow/inductor/143666 2025-03-14T05:31:37.9689903Z * [new tag] ciflow/inductor/143671 -> ciflow/inductor/143671 2025-03-14T05:31:37.9691124Z * [new tag] ciflow/inductor/143712 -> ciflow/inductor/143712 2025-03-14T05:31:37.9692268Z * [new tag] ciflow/inductor/143812 -> ciflow/inductor/143812 2025-03-14T05:31:37.9693964Z * [new tag] ciflow/inductor/143833 -> ciflow/inductor/143833 2025-03-14T05:31:37.9694880Z * [new tag] ciflow/inductor/143961 -> ciflow/inductor/143961 2025-03-14T05:31:37.9696505Z * [new tag] ciflow/inductor/143987 -> ciflow/inductor/143987 2025-03-14T05:31:37.9697426Z * [new tag] ciflow/inductor/144008 -> ciflow/inductor/144008 2025-03-14T05:31:37.9699073Z * [new tag] ciflow/inductor/144017 -> ciflow/inductor/144017 2025-03-14T05:31:37.9699973Z * [new tag] ciflow/inductor/144073 -> ciflow/inductor/144073 2025-03-14T05:31:37.9701126Z * [new tag] ciflow/inductor/144120 -> ciflow/inductor/144120 2025-03-14T05:31:37.9702909Z * [new tag] ciflow/inductor/144172 -> ciflow/inductor/144172 2025-03-14T05:31:37.9703941Z * [new tag] ciflow/inductor/144234 -> ciflow/inductor/144234 2025-03-14T05:31:37.9705667Z * [new tag] ciflow/inductor/144272 -> ciflow/inductor/144272 2025-03-14T05:31:37.9706771Z * [new tag] ciflow/inductor/144288 -> ciflow/inductor/144288 2025-03-14T05:31:37.9707763Z * [new tag] ciflow/inductor/144293 -> ciflow/inductor/144293 2025-03-14T05:31:37.9709752Z * [new tag] ciflow/inductor/144294 -> ciflow/inductor/144294 2025-03-14T05:31:37.9710624Z * [new tag] ciflow/inductor/144332 -> ciflow/inductor/144332 2025-03-14T05:31:37.9711765Z * [new tag] ciflow/inductor/144333 -> ciflow/inductor/144333 2025-03-14T05:31:37.9713383Z * [new tag] ciflow/inductor/144353 -> ciflow/inductor/144353 2025-03-14T05:31:37.9714427Z * [new tag] ciflow/inductor/144365 -> ciflow/inductor/144365 2025-03-14T05:31:37.9715615Z * [new tag] ciflow/inductor/144366 -> ciflow/inductor/144366 2025-03-14T05:31:37.9716800Z * [new tag] ciflow/inductor/144405 -> ciflow/inductor/144405 2025-03-14T05:31:37.9718389Z * [new tag] ciflow/inductor/144438 -> ciflow/inductor/144438 2025-03-14T05:31:37.9719675Z * [new tag] ciflow/inductor/144452 -> ciflow/inductor/144452 2025-03-14T05:31:37.9720836Z * [new tag] ciflow/inductor/144458 -> ciflow/inductor/144458 2025-03-14T05:31:37.9722462Z * [new tag] ciflow/inductor/144501 -> ciflow/inductor/144501 2025-03-14T05:31:37.9723536Z * [new tag] ciflow/inductor/144505 -> ciflow/inductor/144505 2025-03-14T05:31:37.9724663Z * [new tag] ciflow/inductor/144507 -> ciflow/inductor/144507 2025-03-14T05:31:37.9725844Z * [new tag] ciflow/inductor/144516 -> ciflow/inductor/144516 2025-03-14T05:31:37.9727049Z * [new tag] ciflow/inductor/144542 -> ciflow/inductor/144542 2025-03-14T05:31:37.9728642Z * [new tag] ciflow/inductor/144548 -> ciflow/inductor/144548 2025-03-14T05:31:37.9729520Z * [new tag] ciflow/inductor/144551 -> ciflow/inductor/144551 2025-03-14T05:31:37.9730673Z * [new tag] ciflow/inductor/144553 -> ciflow/inductor/144553 2025-03-14T05:31:37.9731869Z * [new tag] ciflow/inductor/144555 -> ciflow/inductor/144555 2025-03-14T05:31:37.9733455Z * [new tag] ciflow/inductor/144556 -> ciflow/inductor/144556 2025-03-14T05:31:37.9734337Z * [new tag] ciflow/inductor/144579 -> ciflow/inductor/144579 2025-03-14T05:31:37.9736044Z * [new tag] ciflow/inductor/144598 -> ciflow/inductor/144598 
2025-03-14T05:31:37.9736963Z * [new tag] ciflow/inductor/144712 -> ciflow/inductor/144712 2025-03-14T05:31:37.9738183Z * [new tag] ciflow/inductor/144721 -> ciflow/inductor/144721 2025-03-14T05:31:37.9739909Z * [new tag] ciflow/inductor/144724 -> ciflow/inductor/144724 2025-03-14T05:31:37.9740865Z * [new tag] ciflow/inductor/144733 -> ciflow/inductor/144733 2025-03-14T05:31:37.9741984Z * [new tag] ciflow/inductor/144765 -> ciflow/inductor/144765 2025-03-14T05:31:37.9743697Z * [new tag] ciflow/inductor/144771 -> ciflow/inductor/144771 2025-03-14T05:31:37.9744841Z * [new tag] ciflow/inductor/144880 -> ciflow/inductor/144880 2025-03-14T05:31:37.9745973Z * [new tag] ciflow/inductor/144905 -> ciflow/inductor/144905 2025-03-14T05:31:37.9747547Z * [new tag] ciflow/inductor/144925 -> ciflow/inductor/144925 2025-03-14T05:31:37.9748486Z * [new tag] ciflow/inductor/144943 -> ciflow/inductor/144943 2025-03-14T05:31:37.9750138Z * [new tag] ciflow/inductor/144953 -> ciflow/inductor/144953 2025-03-14T05:31:37.9751037Z * [new tag] ciflow/inductor/144975 -> ciflow/inductor/144975 2025-03-14T05:31:37.9752190Z * [new tag] ciflow/inductor/144979 -> ciflow/inductor/144979 2025-03-14T05:31:37.9753517Z * [new tag] ciflow/inductor/144986 -> ciflow/inductor/144986 2025-03-14T05:31:37.9755255Z * [new tag] ciflow/inductor/144992 -> ciflow/inductor/144992 2025-03-14T05:31:37.9756181Z * [new tag] ciflow/inductor/145024 -> ciflow/inductor/145024 2025-03-14T05:31:37.9757336Z * [new tag] ciflow/inductor/145061 -> ciflow/inductor/145061 2025-03-14T05:31:37.9758553Z * [new tag] ciflow/inductor/145117 -> ciflow/inductor/145117 2025-03-14T05:31:37.9759748Z * [new tag] ciflow/inductor/145119 -> ciflow/inductor/145119 2025-03-14T05:31:37.9761446Z * [new tag] ciflow/inductor/145130 -> ciflow/inductor/145130 2025-03-14T05:31:37.9762377Z * [new tag] ciflow/inductor/145150 -> ciflow/inductor/145150 2025-03-14T05:31:37.9764080Z * [new tag] ciflow/inductor/145153 -> ciflow/inductor/145153 2025-03-14T05:31:37.9765061Z * [new tag] ciflow/inductor/145254 -> ciflow/inductor/145254 2025-03-14T05:31:37.9766233Z * [new tag] ciflow/inductor/145331 -> ciflow/inductor/145331 2025-03-14T05:31:37.9767979Z * [new tag] ciflow/inductor/145353 -> ciflow/inductor/145353 2025-03-14T05:31:37.9769032Z * [new tag] ciflow/inductor/145475 -> ciflow/inductor/145475 2025-03-14T05:31:37.9770309Z * [new tag] ciflow/inductor/145523 -> ciflow/inductor/145523 2025-03-14T05:31:37.9771543Z * [new tag] ciflow/inductor/145540 -> ciflow/inductor/145540 2025-03-14T05:31:37.9772817Z * [new tag] ciflow/inductor/145559 -> ciflow/inductor/145559 2025-03-14T05:31:37.9773973Z * [new tag] ciflow/inductor/145562 -> ciflow/inductor/145562 2025-03-14T05:31:37.9775153Z * [new tag] ciflow/inductor/145594 -> ciflow/inductor/145594 2025-03-14T05:31:37.9776503Z * [new tag] ciflow/inductor/145595 -> ciflow/inductor/145595 2025-03-14T05:31:37.9777745Z * [new tag] ciflow/inductor/145605 -> ciflow/inductor/145605 2025-03-14T05:31:37.9786589Z * [new tag] ciflow/inductor/145612 -> ciflow/inductor/145612 2025-03-14T05:31:37.9787092Z * [new tag] ciflow/inductor/145636 -> ciflow/inductor/145636 2025-03-14T05:31:37.9787572Z * [new tag] ciflow/inductor/145647 -> ciflow/inductor/145647 2025-03-14T05:31:37.9788054Z * [new tag] ciflow/inductor/145681 -> ciflow/inductor/145681 2025-03-14T05:31:37.9788552Z * [new tag] ciflow/inductor/145865 -> ciflow/inductor/145865 2025-03-14T05:31:37.9789222Z * [new tag] ciflow/inductor/145885 -> ciflow/inductor/145885 2025-03-14T05:31:37.9789698Z * [new tag] 
ciflow/inductor/145911 -> ciflow/inductor/145911 2025-03-14T05:31:37.9790176Z * [new tag] ciflow/inductor/145922 -> ciflow/inductor/145922 2025-03-14T05:31:37.9790648Z * [new tag] ciflow/inductor/145936 -> ciflow/inductor/145936 2025-03-14T05:31:37.9791126Z * [new tag] ciflow/inductor/145969 -> ciflow/inductor/145969 2025-03-14T05:31:37.9791605Z * [new tag] ciflow/inductor/145979 -> ciflow/inductor/145979 2025-03-14T05:31:37.9792365Z * [new tag] ciflow/inductor/145992 -> ciflow/inductor/145992 2025-03-14T05:31:37.9793960Z * [new tag] ciflow/inductor/146051 -> ciflow/inductor/146051 2025-03-14T05:31:37.9795136Z * [new tag] ciflow/inductor/146063 -> ciflow/inductor/146063 2025-03-14T05:31:37.9796334Z * [new tag] ciflow/inductor/146101 -> ciflow/inductor/146101 2025-03-14T05:31:37.9798066Z * [new tag] ciflow/inductor/146115 -> ciflow/inductor/146115 2025-03-14T05:31:37.9799208Z * [new tag] ciflow/inductor/146135 -> ciflow/inductor/146135 2025-03-14T05:31:37.9801015Z * [new tag] ciflow/inductor/146171 -> ciflow/inductor/146171 2025-03-14T05:31:37.9802007Z * [new tag] ciflow/inductor/146172 -> ciflow/inductor/146172 2025-03-14T05:31:37.9803195Z * [new tag] ciflow/inductor/146176 -> ciflow/inductor/146176 2025-03-14T05:31:37.9804797Z * [new tag] ciflow/inductor/146180 -> ciflow/inductor/146180 2025-03-14T05:31:37.9805799Z * [new tag] ciflow/inductor/146218 -> ciflow/inductor/146218 2025-03-14T05:31:37.9807561Z * [new tag] ciflow/inductor/146228 -> ciflow/inductor/146228 2025-03-14T05:31:37.9808561Z * [new tag] ciflow/inductor/146264 -> ciflow/inductor/146264 2025-03-14T05:31:37.9810104Z * [new tag] ciflow/inductor/146267 -> ciflow/inductor/146267 2025-03-14T05:31:37.9811116Z * [new tag] ciflow/inductor/146275 -> ciflow/inductor/146275 2025-03-14T05:31:37.9812293Z * [new tag] ciflow/inductor/146280 -> ciflow/inductor/146280 2025-03-14T05:31:37.9813495Z * [new tag] ciflow/inductor/146288 -> ciflow/inductor/146288 2025-03-14T05:31:37.9815061Z * [new tag] ciflow/inductor/146319 -> ciflow/inductor/146319 2025-03-14T05:31:37.9816008Z * [new tag] ciflow/inductor/146335 -> ciflow/inductor/146335 2025-03-14T05:31:37.9817181Z * [new tag] ciflow/inductor/146341 -> ciflow/inductor/146341 2025-03-14T05:31:37.9818705Z * [new tag] ciflow/inductor/146395 -> ciflow/inductor/146395 2025-03-14T05:31:37.9819703Z * [new tag] ciflow/inductor/146415 -> ciflow/inductor/146415 2025-03-14T05:31:37.9820882Z * [new tag] ciflow/inductor/146421 -> ciflow/inductor/146421 2025-03-14T05:31:37.9822098Z * [new tag] ciflow/inductor/146436 -> ciflow/inductor/146436 2025-03-14T05:31:37.9823606Z * [new tag] ciflow/inductor/146500 -> ciflow/inductor/146500 2025-03-14T05:31:37.9824700Z * [new tag] ciflow/inductor/146501 -> ciflow/inductor/146501 2025-03-14T05:31:37.9825862Z * [new tag] ciflow/inductor/146505 -> ciflow/inductor/146505 2025-03-14T05:31:37.9827341Z * [new tag] ciflow/inductor/146506 -> ciflow/inductor/146506 2025-03-14T05:31:37.9828886Z * [new tag] ciflow/inductor/146526 -> ciflow/inductor/146526 2025-03-14T05:31:37.9830388Z * [new tag] ciflow/inductor/146530 -> ciflow/inductor/146530 2025-03-14T05:31:37.9831427Z * [new tag] ciflow/inductor/146535 -> ciflow/inductor/146535 2025-03-14T05:31:37.9832615Z * [new tag] ciflow/inductor/146558 -> ciflow/inductor/146558 2025-03-14T05:31:37.9834185Z * [new tag] ciflow/inductor/146561 -> ciflow/inductor/146561 2025-03-14T05:31:37.9837337Z * [new tag] ciflow/inductor/146562 -> ciflow/inductor/146562 2025-03-14T05:31:37.9837831Z * [new tag] ciflow/inductor/146661 -> 
ciflow/inductor/146661 2025-03-14T05:31:37.9838307Z * [new tag] ciflow/inductor/146678 -> ciflow/inductor/146678 2025-03-14T05:31:37.9838803Z * [new tag] ciflow/inductor/146706 -> ciflow/inductor/146706 2025-03-14T05:31:37.9839783Z * [new tag] ciflow/inductor/146718 -> ciflow/inductor/146718 2025-03-14T05:31:37.9840924Z * [new tag] ciflow/inductor/146779 -> ciflow/inductor/146779 2025-03-14T05:31:37.9842646Z * [new tag] ciflow/inductor/146781 -> ciflow/inductor/146781 2025-03-14T05:31:37.9843782Z * [new tag] ciflow/inductor/146823 -> ciflow/inductor/146823 2025-03-14T05:31:37.9845420Z * [new tag] ciflow/inductor/146826 -> ciflow/inductor/146826 2025-03-14T05:31:37.9846558Z * [new tag] ciflow/inductor/146827 -> ciflow/inductor/146827 2025-03-14T05:31:37.9848101Z * [new tag] ciflow/inductor/146844 -> ciflow/inductor/146844 2025-03-14T05:31:37.9849055Z * [new tag] ciflow/inductor/146845 -> ciflow/inductor/146845 2025-03-14T05:31:37.9850198Z * [new tag] ciflow/inductor/146850 -> ciflow/inductor/146850 2025-03-14T05:31:37.9851417Z * [new tag] ciflow/inductor/146864 -> ciflow/inductor/146864 2025-03-14T05:31:37.9853297Z * [new tag] ciflow/inductor/146874 -> ciflow/inductor/146874 2025-03-14T05:31:37.9854364Z * [new tag] ciflow/inductor/146894 -> ciflow/inductor/146894 2025-03-14T05:31:37.9855907Z * [new tag] ciflow/inductor/146895 -> ciflow/inductor/146895 2025-03-14T05:31:37.9857005Z * [new tag] ciflow/inductor/146919 -> ciflow/inductor/146919 2025-03-14T05:31:37.9858220Z * [new tag] ciflow/inductor/146921 -> ciflow/inductor/146921 2025-03-14T05:31:37.9859709Z * [new tag] ciflow/inductor/146928 -> ciflow/inductor/146928 2025-03-14T05:31:37.9860690Z * [new tag] ciflow/inductor/146935 -> ciflow/inductor/146935 2025-03-14T05:31:37.9861864Z * [new tag] ciflow/inductor/146942 -> ciflow/inductor/146942 2025-03-14T05:31:37.9863487Z * [new tag] ciflow/inductor/146962 -> ciflow/inductor/146962 2025-03-14T05:31:37.9864612Z * [new tag] ciflow/inductor/146983 -> ciflow/inductor/146983 2025-03-14T05:31:37.9866202Z * [new tag] ciflow/inductor/146989 -> ciflow/inductor/146989 2025-03-14T05:31:37.9867676Z * [new tag] ciflow/inductor/147007 -> ciflow/inductor/147007 2025-03-14T05:31:37.9868934Z * [new tag] ciflow/inductor/147014 -> ciflow/inductor/147014 2025-03-14T05:31:37.9870500Z * [new tag] ciflow/inductor/147021 -> ciflow/inductor/147021 2025-03-14T05:31:37.9871500Z * [new tag] ciflow/inductor/147036 -> ciflow/inductor/147036 2025-03-14T05:31:37.9872705Z * [new tag] ciflow/inductor/147049 -> ciflow/inductor/147049 2025-03-14T05:31:37.9874308Z * [new tag] ciflow/inductor/147105 -> ciflow/inductor/147105 2025-03-14T05:31:37.9875393Z * [new tag] ciflow/inductor/147146 -> ciflow/inductor/147146 2025-03-14T05:31:37.9876877Z * [new tag] ciflow/inductor/147155 -> ciflow/inductor/147155 2025-03-14T05:31:37.9877916Z * [new tag] ciflow/inductor/147178 -> ciflow/inductor/147178 2025-03-14T05:31:37.9879136Z * [new tag] ciflow/inductor/147205 -> ciflow/inductor/147205 2025-03-14T05:31:37.9880641Z * [new tag] ciflow/inductor/147225 -> ciflow/inductor/147225 2025-03-14T05:31:37.9881640Z * [new tag] ciflow/inductor/147229 -> ciflow/inductor/147229 2025-03-14T05:31:37.9883539Z * [new tag] ciflow/inductor/147269 -> ciflow/inductor/147269 2025-03-14T05:31:37.9884620Z * [new tag] ciflow/inductor/147272 -> ciflow/inductor/147272 2025-03-14T05:31:37.9886200Z * [new tag] ciflow/inductor/147314 -> ciflow/inductor/147314 2025-03-14T05:31:37.9887298Z * [new tag] ciflow/inductor/147315 -> ciflow/inductor/147315 
2025-03-14T05:31:37.9888769Z * [new tag] ciflow/inductor/147341 -> ciflow/inductor/147341 2025-03-14T05:31:37.9889755Z * [new tag] ciflow/inductor/147360 -> ciflow/inductor/147360 2025-03-14T05:31:37.9891346Z * [new tag] ciflow/inductor/147368 -> ciflow/inductor/147368 2025-03-14T05:31:37.9892357Z * [new tag] ciflow/inductor/147410 -> ciflow/inductor/147410 2025-03-14T05:31:37.9893676Z * [new tag] ciflow/inductor/147414 -> ciflow/inductor/147414 2025-03-14T05:31:37.9894894Z * [new tag] ciflow/inductor/147415 -> ciflow/inductor/147415 2025-03-14T05:31:37.9896094Z * [new tag] ciflow/inductor/147422 -> ciflow/inductor/147422 2025-03-14T05:31:37.9897699Z * [new tag] ciflow/inductor/147445 -> ciflow/inductor/147445 2025-03-14T05:31:37.9898708Z * [new tag] ciflow/inductor/147452 -> ciflow/inductor/147452 2025-03-14T05:31:37.9899910Z * [new tag] ciflow/inductor/147481 -> ciflow/inductor/147481 2025-03-14T05:31:37.9901418Z * [new tag] ciflow/inductor/147498 -> ciflow/inductor/147498 2025-03-14T05:31:37.9902437Z * [new tag] ciflow/inductor/147514 -> ciflow/inductor/147514 2025-03-14T05:31:37.9903603Z * [new tag] ciflow/inductor/147528 -> ciflow/inductor/147528 2025-03-14T05:31:37.9905099Z * [new tag] ciflow/inductor/147562 -> ciflow/inductor/147562 2025-03-14T05:31:37.9906108Z * [new tag] ciflow/inductor/147583 -> ciflow/inductor/147583 2025-03-14T05:31:37.9907298Z * [new tag] ciflow/inductor/147592 -> ciflow/inductor/147592 2025-03-14T05:31:37.9908795Z * [new tag] ciflow/inductor/147603 -> ciflow/inductor/147603 2025-03-14T05:31:37.9910244Z * [new tag] ciflow/inductor/147656 -> ciflow/inductor/147656 2025-03-14T05:31:37.9911278Z * [new tag] ciflow/inductor/147745 -> ciflow/inductor/147745 2025-03-14T05:31:37.9912442Z * [new tag] ciflow/inductor/147790 -> ciflow/inductor/147790 2025-03-14T05:31:37.9913990Z * [new tag] ciflow/inductor/147797 -> ciflow/inductor/147797 2025-03-14T05:31:37.9915104Z * [new tag] ciflow/inductor/147800 -> ciflow/inductor/147800 2025-03-14T05:31:37.9916715Z * [new tag] ciflow/inductor/147821 -> ciflow/inductor/147821 2025-03-14T05:31:37.9917700Z * [new tag] ciflow/inductor/147863 -> ciflow/inductor/147863 2025-03-14T05:31:37.9919260Z * [new tag] ciflow/inductor/147870 -> ciflow/inductor/147870 2025-03-14T05:31:37.9920257Z * [new tag] ciflow/inductor/147881 -> ciflow/inductor/147881 2025-03-14T05:31:37.9921876Z * [new tag] ciflow/inductor/147899 -> ciflow/inductor/147899 2025-03-14T05:31:37.9922837Z * [new tag] ciflow/inductor/147902 -> ciflow/inductor/147902 2025-03-14T05:31:37.9924309Z * [new tag] ciflow/inductor/147903 -> ciflow/inductor/147903 2025-03-14T05:31:37.9925608Z * [new tag] ciflow/inductor/147908 -> ciflow/inductor/147908 2025-03-14T05:31:37.9927100Z * [new tag] ciflow/inductor/147910 -> ciflow/inductor/147910 2025-03-14T05:31:37.9928174Z * [new tag] ciflow/inductor/147915 -> ciflow/inductor/147915 2025-03-14T05:31:37.9929685Z * [new tag] ciflow/inductor/147917 -> ciflow/inductor/147917 2025-03-14T05:31:37.9930827Z * [new tag] ciflow/inductor/147927 -> ciflow/inductor/147927 2025-03-14T05:31:37.9932428Z * [new tag] ciflow/inductor/147960 -> ciflow/inductor/147960 2025-03-14T05:31:37.9933414Z * [new tag] ciflow/inductor/147962 -> ciflow/inductor/147962 2025-03-14T05:31:37.9934628Z * [new tag] ciflow/inductor/147990 -> ciflow/inductor/147990 2025-03-14T05:31:37.9936149Z * [new tag] ciflow/inductor/148008 -> ciflow/inductor/148008 2025-03-14T05:31:37.9937191Z * [new tag] ciflow/inductor/148010 -> ciflow/inductor/148010 2025-03-14T05:31:37.9938359Z * [new tag] 
ciflow/inductor/148046 -> ciflow/inductor/148046 2025-03-14T05:31:37.9939847Z * [new tag] ciflow/inductor/148063 -> ciflow/inductor/148063 2025-03-14T05:31:37.9941111Z * [new tag] ciflow/inductor/148091 -> ciflow/inductor/148091 2025-03-14T05:31:37.9942230Z * [new tag] ciflow/inductor/148092 -> ciflow/inductor/148092 2025-03-14T05:31:37.9943915Z * [new tag] ciflow/inductor/148104 -> ciflow/inductor/148104 2025-03-14T05:31:37.9945020Z * [new tag] ciflow/inductor/148130 -> ciflow/inductor/148130 2025-03-14T05:31:37.9946538Z * [new tag] ciflow/inductor/148131 -> ciflow/inductor/148131 2025-03-14T05:31:37.9947609Z * [new tag] ciflow/inductor/148132 -> ciflow/inductor/148132 2025-03-14T05:31:37.9949074Z * [new tag] ciflow/inductor/148160 -> ciflow/inductor/148160 2025-03-14T05:31:37.9950193Z * [new tag] ciflow/inductor/148163 -> ciflow/inductor/148163 2025-03-14T05:31:37.9951686Z * [new tag] ciflow/inductor/148173 -> ciflow/inductor/148173 2025-03-14T05:31:37.9952729Z * [new tag] ciflow/inductor/148174 -> ciflow/inductor/148174 2025-03-14T05:31:37.9954271Z * [new tag] ciflow/inductor/148176 -> ciflow/inductor/148176 2025-03-14T05:31:37.9955358Z * [new tag] ciflow/inductor/148186 -> ciflow/inductor/148186 2025-03-14T05:31:37.9956858Z * [new tag] ciflow/inductor/148202 -> ciflow/inductor/148202 2025-03-14T05:31:37.9957932Z * [new tag] ciflow/inductor/148206 -> ciflow/inductor/148206 2025-03-14T05:31:37.9959435Z * [new tag] ciflow/inductor/148209 -> ciflow/inductor/148209 2025-03-14T05:31:37.9960498Z * [new tag] ciflow/inductor/148210 -> ciflow/inductor/148210 2025-03-14T05:31:37.9961987Z * [new tag] ciflow/inductor/148234 -> ciflow/inductor/148234 2025-03-14T05:31:37.9963027Z * [new tag] ciflow/inductor/148235 -> ciflow/inductor/148235 2025-03-14T05:31:37.9964526Z * [new tag] ciflow/inductor/148236 -> ciflow/inductor/148236 2025-03-14T05:31:37.9966487Z * [new tag] ciflow/inductor/148294 -> ciflow/inductor/148294 2025-03-14T05:31:37.9967511Z * [new tag] ciflow/inductor/148327 -> ciflow/inductor/148327 2025-03-14T05:31:37.9969334Z * [new tag] ciflow/inductor/148328 -> ciflow/inductor/148328 2025-03-14T05:31:37.9970802Z * [new tag] ciflow/inductor/148357 -> ciflow/inductor/148357 2025-03-14T05:31:37.9971848Z * [new tag] ciflow/inductor/148358 -> ciflow/inductor/148358 2025-03-14T05:31:37.9973581Z * [new tag] ciflow/inductor/148380 -> ciflow/inductor/148380 2025-03-14T05:31:37.9974640Z * [new tag] ciflow/inductor/148408 -> ciflow/inductor/148408 2025-03-14T05:31:37.9975835Z * [new tag] ciflow/inductor/148413 -> ciflow/inductor/148413 2025-03-14T05:31:37.9977367Z * [new tag] ciflow/inductor/148414 -> ciflow/inductor/148414 2025-03-14T05:31:37.9978477Z * [new tag] ciflow/inductor/148415 -> ciflow/inductor/148415 2025-03-14T05:31:37.9980099Z * [new tag] ciflow/inductor/148418 -> ciflow/inductor/148418 2025-03-14T05:31:37.9981239Z * [new tag] ciflow/inductor/148424 -> ciflow/inductor/148424 2025-03-14T05:31:37.9982551Z * [new tag] ciflow/inductor/148430 -> ciflow/inductor/148430 2025-03-14T05:31:37.9984186Z * [new tag] ciflow/inductor/148445 -> ciflow/inductor/148445 2025-03-14T05:31:37.9985375Z * [new tag] ciflow/inductor/148452 -> ciflow/inductor/148452 2025-03-14T05:31:37.9986853Z * [new tag] ciflow/inductor/148459 -> ciflow/inductor/148459 2025-03-14T05:31:37.9987921Z * [new tag] ciflow/inductor/148461 -> ciflow/inductor/148461 2025-03-14T05:31:37.9989602Z * [new tag] ciflow/inductor/148484 -> ciflow/inductor/148484 2025-03-14T05:31:37.9990477Z * [new tag] ciflow/inductor/148485 -> 
ciflow/inductor/148485 2025-03-14T05:31:37.9991967Z * [new tag] ciflow/inductor/148488 -> ciflow/inductor/148488 2025-03-14T05:31:37.9992998Z * [new tag] ciflow/inductor/148492 -> ciflow/inductor/148492 2025-03-14T05:31:37.9994893Z * [new tag] ciflow/inductor/148502 -> ciflow/inductor/148502 2025-03-14T05:31:37.9996409Z * [new tag] ciflow/inductor/148503 -> ciflow/inductor/148503 2025-03-14T05:31:37.9997397Z * [new tag] ciflow/inductor/148505 -> ciflow/inductor/148505 2025-03-14T05:31:37.9998894Z * [new tag] ciflow/inductor/148508 -> ciflow/inductor/148508 2025-03-14T05:31:37.9999989Z * [new tag] ciflow/inductor/148516 -> ciflow/inductor/148516 2025-03-14T05:31:38.0001648Z * [new tag] ciflow/inductor/148517 -> ciflow/inductor/148517 2025-03-14T05:31:38.0003040Z * [new tag] ciflow/inductor/148529 -> ciflow/inductor/148529 2025-03-14T05:31:38.0004597Z * [new tag] ciflow/inductor/148554 -> ciflow/inductor/148554 2025-03-14T05:31:38.0005443Z * [new tag] ciflow/inductor/148561 -> ciflow/inductor/148561 2025-03-14T05:31:38.0006905Z * [new tag] ciflow/inductor/148569 -> ciflow/inductor/148569 2025-03-14T05:31:38.0008179Z * [new tag] ciflow/inductor/148580 -> ciflow/inductor/148580 2025-03-14T05:31:38.0009592Z * [new tag] ciflow/inductor/148613 -> ciflow/inductor/148613 2025-03-14T05:31:38.0011043Z * [new tag] ciflow/inductor/148618 -> ciflow/inductor/148618 2025-03-14T05:31:38.0012416Z * [new tag] ciflow/inductor/148622 -> ciflow/inductor/148622 2025-03-14T05:31:38.0013926Z * [new tag] ciflow/inductor/148630 -> ciflow/inductor/148630 2025-03-14T05:31:38.0015342Z * [new tag] ciflow/inductor/148637 -> ciflow/inductor/148637 2025-03-14T05:31:38.0016889Z * [new tag] ciflow/inductor/148638 -> ciflow/inductor/148638 2025-03-14T05:31:38.0018267Z * [new tag] ciflow/inductor/148684 -> ciflow/inductor/148684 2025-03-14T05:31:38.0019631Z * [new tag] ciflow/inductor/148692 -> ciflow/inductor/148692 2025-03-14T05:31:38.0021026Z * [new tag] ciflow/inductor/148694 -> ciflow/inductor/148694 2025-03-14T05:31:38.0022623Z * [new tag] ciflow/inductor/148704 -> ciflow/inductor/148704 2025-03-14T05:31:38.0023993Z * [new tag] ciflow/inductor/148708 -> ciflow/inductor/148708 2025-03-14T05:31:38.0025446Z * [new tag] ciflow/inductor/148710 -> ciflow/inductor/148710 2025-03-14T05:31:38.0026835Z * [new tag] ciflow/inductor/148712 -> ciflow/inductor/148712 2025-03-14T05:31:38.0028278Z * [new tag] ciflow/inductor/148729 -> ciflow/inductor/148729 2025-03-14T05:31:38.0029666Z * [new tag] ciflow/inductor/148731 -> ciflow/inductor/148731 2025-03-14T05:31:38.0031185Z * [new tag] ciflow/inductor/148736 -> ciflow/inductor/148736 2025-03-14T05:31:38.0032591Z * [new tag] ciflow/inductor/148742 -> ciflow/inductor/148742 2025-03-14T05:31:38.0033969Z * [new tag] ciflow/inductor/148765 -> ciflow/inductor/148765 2025-03-14T05:31:38.0035499Z * [new tag] ciflow/inductor/148766 -> ciflow/inductor/148766 2025-03-14T05:31:38.0036926Z * [new tag] ciflow/inductor/148772 -> ciflow/inductor/148772 2025-03-14T05:31:38.0038304Z * [new tag] ciflow/inductor/148773 -> ciflow/inductor/148773 2025-03-14T05:31:38.0039716Z * [new tag] ciflow/inductor/148780 -> ciflow/inductor/148780 2025-03-14T05:31:38.0041384Z * [new tag] ciflow/inductor/148800 -> ciflow/inductor/148800 2025-03-14T05:31:38.0042843Z * [new tag] ciflow/inductor/148804 -> ciflow/inductor/148804 2025-03-14T05:31:38.0044255Z * [new tag] ciflow/inductor/148834 -> ciflow/inductor/148834 2025-03-14T05:31:38.0045746Z * [new tag] ciflow/inductor/148844 -> ciflow/inductor/148844 
2025-03-14T05:31:38.0047233Z * [new tag] ciflow/inductor/148878 -> ciflow/inductor/148878 2025-03-14T05:31:38.0048686Z * [new tag] ciflow/inductor/148890 -> ciflow/inductor/148890 2025-03-14T05:31:38.0050089Z * [new tag] ciflow/inductor/148893 -> ciflow/inductor/148893 2025-03-14T05:31:38.0051486Z * [new tag] ciflow/inductor/148894 -> ciflow/inductor/148894 2025-03-14T05:31:38.0052861Z * [new tag] ciflow/inductor/148896 -> ciflow/inductor/148896 2025-03-14T05:31:38.0054499Z * [new tag] ciflow/inductor/148898 -> ciflow/inductor/148898 2025-03-14T05:31:38.0056300Z * [new tag] ciflow/inductor/148922 -> ciflow/inductor/148922 2025-03-14T05:31:38.0057741Z * [new tag] ciflow/inductor/148932 -> ciflow/inductor/148932 2025-03-14T05:31:38.0059102Z * [new tag] ciflow/inductor/148947 -> ciflow/inductor/148947 2025-03-14T05:31:38.0060601Z * [new tag] ciflow/inductor/148953 -> ciflow/inductor/148953 2025-03-14T05:31:38.0061996Z * [new tag] ciflow/inductor/148962 -> ciflow/inductor/148962 2025-03-14T05:31:38.0063385Z * [new tag] ciflow/inductor/148997 -> ciflow/inductor/148997 2025-03-14T05:31:38.0064856Z * [new tag] ciflow/inductor/149007 -> ciflow/inductor/149007 2025-03-14T05:31:38.0066213Z * [new tag] ciflow/inductor/149014 -> ciflow/inductor/149014 2025-03-14T05:31:38.0068096Z * [new tag] ciflow/inductor/149027 -> ciflow/inductor/149027 2025-03-14T05:31:38.0071584Z * [new tag] ciflow/inductor/149031 -> ciflow/inductor/149031 2025-03-14T05:31:38.0072955Z * [new tag] ciflow/inductor/149039 -> ciflow/inductor/149039 2025-03-14T05:31:38.0074427Z * [new tag] ciflow/inductor/149041 -> ciflow/inductor/149041 2025-03-14T05:31:38.0075951Z * [new tag] ciflow/inductor/149052 -> ciflow/inductor/149052 2025-03-14T05:31:38.0077324Z * [new tag] ciflow/inductor/149054 -> ciflow/inductor/149054 2025-03-14T05:31:38.0078683Z * [new tag] ciflow/inductor/149055 -> ciflow/inductor/149055 2025-03-14T05:31:38.0080082Z * [new tag] ciflow/inductor/149064 -> ciflow/inductor/149064 2025-03-14T05:31:38.0081453Z * [new tag] ciflow/inductor/149066 -> ciflow/inductor/149066 2025-03-14T05:31:38.0082816Z * [new tag] ciflow/inductor/149067 -> ciflow/inductor/149067 2025-03-14T05:31:38.0084210Z * [new tag] ciflow/inductor/149068 -> ciflow/inductor/149068 2025-03-14T05:31:38.0085577Z * [new tag] ciflow/inductor/149072 -> ciflow/inductor/149072 2025-03-14T05:31:38.0087175Z * [new tag] ciflow/inductor/149074 -> ciflow/inductor/149074 2025-03-14T05:31:38.0088527Z * [new tag] ciflow/inductor/149087 -> ciflow/inductor/149087 2025-03-14T05:31:38.0090020Z * [new tag] ciflow/inductor/149103 -> ciflow/inductor/149103 2025-03-14T05:31:38.0091574Z * [new tag] ciflow/inductor/149106 -> ciflow/inductor/149106 2025-03-14T05:31:38.0093183Z * [new tag] ciflow/inductor/149136 -> ciflow/inductor/149136 2025-03-14T05:31:38.0094534Z * [new tag] ciflow/inductor/149140 -> ciflow/inductor/149140 2025-03-14T05:31:38.0095928Z * [new tag] ciflow/inductor/149148 -> ciflow/inductor/149148 2025-03-14T05:31:38.0097453Z * [new tag] ciflow/inductor/149149 -> ciflow/inductor/149149 2025-03-14T05:31:38.0098744Z * [new tag] ciflow/inductor/149154 -> ciflow/inductor/149154 2025-03-14T05:31:38.0100077Z * [new tag] ciflow/inductor/149160 -> ciflow/inductor/149160 2025-03-14T05:31:38.0101432Z * [new tag] ciflow/inductor/149161 -> ciflow/inductor/149161 2025-03-14T05:31:38.0102804Z * [new tag] ciflow/inductor/149168 -> ciflow/inductor/149168 2025-03-14T05:31:38.0104242Z * [new tag] ciflow/inductor/149172 -> ciflow/inductor/149172 2025-03-14T05:31:38.0105739Z * [new tag] 
ciflow/inductor/149173 -> ciflow/inductor/149173 2025-03-14T05:31:38.0107136Z * [new tag] ciflow/inductor/149176 -> ciflow/inductor/149176 2025-03-14T05:31:38.0108852Z * [new tag] ciflow/inductor/149178 -> ciflow/inductor/149178 2025-03-14T05:31:38.0110571Z * [new tag] ciflow/inductor/3b9a386 -> ciflow/inductor/3b9a386 2025-03-14T05:31:38.0112107Z * [new tag] ciflow/inductor/3d4b92b -> ciflow/inductor/3d4b92b 2025-03-14T05:31:38.0113659Z * [new tag] ciflow/inductor/88106 -> ciflow/inductor/88106 2025-03-14T05:31:38.0115364Z * [new tag] ciflow/inductor/88196 -> ciflow/inductor/88196 2025-03-14T05:31:38.0117072Z * [new tag] ciflow/inductor/88998 -> ciflow/inductor/88998 2025-03-14T05:31:38.0118658Z * [new tag] ciflow/inductor/d224ac7 -> ciflow/inductor/d224ac7 2025-03-14T05:31:38.0120136Z * [new tag] ciflow/linux-aarch64/125888 -> ciflow/linux-aarch64/125888 2025-03-14T05:31:38.0121011Z * [new tag] ciflow/linux-aarch64/126050 -> ciflow/linux-aarch64/126050 2025-03-14T05:31:38.0122401Z * [new tag] ciflow/linux-aarch64/126054 -> ciflow/linux-aarch64/126054 2025-03-14T05:31:38.0123263Z * [new tag] ciflow/linux-aarch64/133297 -> ciflow/linux-aarch64/133297 2025-03-14T05:31:38.0124568Z * [new tag] ciflow/linux-aarch64/133315 -> ciflow/linux-aarch64/133315 2025-03-14T05:31:38.0125469Z * [new tag] ciflow/linux-aarch64/133392 -> ciflow/linux-aarch64/133392 2025-03-14T05:31:38.0126849Z * [new tag] ciflow/linux-aarch64/133419 -> ciflow/linux-aarch64/133419 2025-03-14T05:31:38.0127753Z * [new tag] ciflow/linux-aarch64/133423 -> ciflow/linux-aarch64/133423 2025-03-14T05:31:38.0129116Z * [new tag] ciflow/linux-aarch64/133667 -> ciflow/linux-aarch64/133667 2025-03-14T05:31:38.0130042Z * [new tag] ciflow/linux-aarch64/133753 -> ciflow/linux-aarch64/133753 2025-03-14T05:31:38.0131552Z * [new tag] ciflow/linux-aarch64/135058 -> ciflow/linux-aarch64/135058 2025-03-14T05:31:38.0133063Z * [new tag] ciflow/linux-aarch64/135333 -> ciflow/linux-aarch64/135333 2025-03-14T05:31:38.0134682Z * [new tag] ciflow/linux-aarch64/135792 -> ciflow/linux-aarch64/135792 2025-03-14T05:31:38.0136062Z * [new tag] ciflow/linux-aarch64/136355 -> ciflow/linux-aarch64/136355 2025-03-14T05:31:38.0137265Z * [new tag] ciflow/linux-aarch64/137568 -> ciflow/linux-aarch64/137568 2025-03-14T05:31:38.0138517Z * [new tag] ciflow/linux-aarch64/138388 -> ciflow/linux-aarch64/138388 2025-03-14T05:31:38.0139742Z * [new tag] ciflow/linux-aarch64/140159 -> ciflow/linux-aarch64/140159 2025-03-14T05:31:38.0140943Z * [new tag] ciflow/linux-aarch64/146823 -> ciflow/linux-aarch64/146823 2025-03-14T05:31:38.0142155Z * [new tag] ciflow/linux-aarch64/146826 -> ciflow/linux-aarch64/146826 2025-03-14T05:31:38.0143071Z * [new tag] ciflow/linux-aarch64/146895 -> ciflow/linux-aarch64/146895 2025-03-14T05:31:38.0144595Z * [new tag] ciflow/linux-aarch64/147073 -> ciflow/linux-aarch64/147073 2025-03-14T05:31:38.0145528Z * [new tag] ciflow/linux-aarch64/147341 -> ciflow/linux-aarch64/147341 2025-03-14T05:31:38.0146970Z * [new tag] ciflow/linux-aarch64/147359 -> ciflow/linux-aarch64/147359 2025-03-14T05:31:38.0148188Z * [new tag] ciflow/linux-aarch64/147498 -> ciflow/linux-aarch64/147498 2025-03-14T05:31:38.0149600Z * [new tag] ciflow/linux-aarch64/147763 -> ciflow/linux-aarch64/147763 2025-03-14T05:31:38.0151314Z * [new tag] ciflow/linux-aarch64/147855 -> ciflow/linux-aarch64/147855 2025-03-14T05:31:38.0152498Z * [new tag] ciflow/linux-aarch64/147917 -> ciflow/linux-aarch64/147917 2025-03-14T05:31:38.0153708Z * [new tag] ciflow/linux-aarch64/148070 -> 
ciflow/linux-aarch64/148070 2025-03-14T05:31:38.0154685Z * [new tag] ciflow/linux-aarch64/148163 -> ciflow/linux-aarch64/148163 2025-03-14T05:31:38.0156238Z * [new tag] ciflow/linux-aarch64/148173 -> ciflow/linux-aarch64/148173 2025-03-14T05:31:38.0157449Z * [new tag] ciflow/linux-aarch64/148424 -> ciflow/linux-aarch64/148424 2025-03-14T05:31:38.0158382Z * [new tag] ciflow/linux-aarch64/148585 -> ciflow/linux-aarch64/148585 2025-03-14T05:31:38.0159810Z * [new tag] ciflow/linux-aarch64/148653 -> ciflow/linux-aarch64/148653 2025-03-14T05:31:38.0161407Z * [new tag] ciflow/mps/102148 -> ciflow/mps/102148 2025-03-14T05:31:38.0162290Z * [new tag] ciflow/mps/119496 -> ciflow/mps/119496 2025-03-14T05:31:38.0163621Z * [new tag] ciflow/mps/120076 -> ciflow/mps/120076 2025-03-14T05:31:38.0164790Z * [new tag] ciflow/mps/133423 -> ciflow/mps/133423 2025-03-14T05:31:38.0165953Z * [new tag] ciflow/mps/133667 -> ciflow/mps/133667 2025-03-14T05:31:38.0167403Z * [new tag] ciflow/mps/138640 -> ciflow/mps/138640 2025-03-14T05:31:38.0168811Z * [new tag] ciflow/mps/139469 -> ciflow/mps/139469 2025-03-14T05:31:38.0170028Z * [new tag] ciflow/mps/140159 -> ciflow/mps/140159 2025-03-14T05:31:38.0171263Z * [new tag] ciflow/mps/140211 -> ciflow/mps/140211 2025-03-14T05:31:38.0172798Z * [new tag] ciflow/mps/140725 -> ciflow/mps/140725 2025-03-14T05:31:38.0174051Z * [new tag] ciflow/mps/142097 -> ciflow/mps/142097 2025-03-14T05:31:38.0175718Z * [new tag] ciflow/mps/142202 -> ciflow/mps/142202 2025-03-14T05:31:38.0177324Z * [new tag] ciflow/mps/143630 -> ciflow/mps/143630 2025-03-14T05:31:38.0178692Z * [new tag] ciflow/mps/143666 -> ciflow/mps/143666 2025-03-14T05:31:38.0179854Z * [new tag] ciflow/mps/143911 -> ciflow/mps/143911 2025-03-14T05:31:38.0181043Z * [new tag] ciflow/mps/143966 -> ciflow/mps/143966 2025-03-14T05:31:38.0182270Z * [new tag] ciflow/mps/144405 -> ciflow/mps/144405 2025-03-14T05:31:38.0183427Z * [new tag] ciflow/mps/144664 -> ciflow/mps/144664 2025-03-14T05:31:38.0185078Z * [new tag] ciflow/mps/145955 -> ciflow/mps/145955 2025-03-14T05:31:38.0186308Z * [new tag] ciflow/mps/146436 -> ciflow/mps/146436 2025-03-14T05:31:38.0187723Z * [new tag] ciflow/mps/146754 -> ciflow/mps/146754 2025-03-14T05:31:38.0188932Z * [new tag] ciflow/mps/146989 -> ciflow/mps/146989 2025-03-14T05:31:38.0190295Z * [new tag] ciflow/mps/147205 -> ciflow/mps/147205 2025-03-14T05:31:38.0191480Z * [new tag] ciflow/mps/147583 -> ciflow/mps/147583 2025-03-14T05:31:38.0193173Z * [new tag] ciflow/mps/147644 -> ciflow/mps/147644 2025-03-14T05:31:38.0193957Z * [new tag] ciflow/mps/147893 -> ciflow/mps/147893 2025-03-14T05:31:38.0195836Z * [new tag] ciflow/mps/148408 -> ciflow/mps/148408 2025-03-14T05:31:38.0197121Z * [new tag] ciflow/mps/148415 -> ciflow/mps/148415 2025-03-14T05:31:38.0198433Z * [new tag] ciflow/mps/148942 -> ciflow/mps/148942 2025-03-14T05:31:38.0199743Z * [new tag] ciflow/mps/149147 -> ciflow/mps/149147 2025-03-14T05:31:38.0200973Z * [new tag] ciflow/mps/149173 -> ciflow/mps/149173 2025-03-14T05:31:38.0202538Z * [new tag] ciflow/op-benchmark/143733 -> ciflow/op-benchmark/143733 2025-03-14T05:31:38.0204160Z * [new tag] ciflow/periodic/054a2fd -> ciflow/periodic/054a2fd 2025-03-14T05:31:38.0205505Z * [new tag] ciflow/periodic/123020 -> ciflow/periodic/123020 2025-03-14T05:31:38.0206408Z * [new tag] ciflow/periodic/140989 -> ciflow/periodic/140989 2025-03-14T05:31:38.0207771Z * [new tag] ciflow/periodic/141309 -> ciflow/periodic/141309 2025-03-14T05:31:38.0208640Z * [new tag] ciflow/periodic/141730 -> 
ciflow/periodic/141730 2025-03-14T05:31:38.0209966Z * [new tag] ciflow/periodic/142179 -> ciflow/periodic/142179 2025-03-14T05:31:38.0211220Z * [new tag] ciflow/periodic/143959 -> ciflow/periodic/143959 2025-03-14T05:31:38.0212128Z * [new tag] ciflow/periodic/144953 -> ciflow/periodic/144953 2025-03-14T05:31:38.0213495Z * [new tag] ciflow/periodic/145130 -> ciflow/periodic/145130 2025-03-14T05:31:38.0214681Z * [new tag] ciflow/periodic/146264 -> ciflow/periodic/146264 2025-03-14T05:31:38.0215976Z * [new tag] ciflow/periodic/146403 -> ciflow/periodic/146403 2025-03-14T05:31:38.0217469Z * [new tag] ciflow/periodic/146823 -> ciflow/periodic/146823 2025-03-14T05:31:38.0219023Z * [new tag] ciflow/periodic/146903 -> ciflow/periodic/146903 2025-03-14T05:31:38.0220428Z * [new tag] ciflow/periodic/147870 -> ciflow/periodic/147870 2025-03-14T05:31:38.0221778Z * [new tag] ciflow/periodic/148760 -> ciflow/periodic/148760 2025-03-14T05:31:38.0223077Z * [new tag] ciflow/periodic/149091 -> ciflow/periodic/149091 2025-03-14T05:31:38.0224460Z * [new tag] ciflow/periodic/2a6d37d -> ciflow/periodic/2a6d37d 2025-03-14T05:31:38.0225834Z * [new tag] ciflow/periodic/317eeb8 -> ciflow/periodic/317eeb8 2025-03-14T05:31:38.0227162Z * [new tag] ciflow/periodic/3c32 -> ciflow/periodic/3c32 2025-03-14T05:31:38.0228582Z * [new tag] ciflow/periodic/3e98831 -> ciflow/periodic/3e98831 2025-03-14T05:31:38.0230212Z * [new tag] ciflow/periodic/94512-point -> ciflow/periodic/94512-point 2025-03-14T05:31:38.0231890Z * [new tag] ciflow/periodic/csl/test87519 -> ciflow/periodic/csl/test87519 2025-03-14T05:31:38.0233319Z * [new tag] ciflow/periodic/csltest88275 -> ciflow/periodic/csltest88275 2025-03-14T05:31:38.0234776Z * [new tag] ciflow/periodic/csltest88761 -> ciflow/periodic/csltest88761 2025-03-14T05:31:38.0236228Z * [new tag] ciflow/periodic/release_1.12 -> ciflow/periodic/release_1.12 2025-03-14T05:31:38.0238030Z * [new tag] ciflow/periodic/release_1.12.0 -> ciflow/periodic/release_1.12.0 2025-03-14T05:31:38.0239630Z * [new tag] ciflow/periodic/sha-ec5b83 -> ciflow/periodic/sha-ec5b83 2025-03-14T05:31:38.0241050Z * [new tag] ciflow/rocm-mi300/148394 -> ciflow/rocm-mi300/148394 2025-03-14T05:31:38.0241925Z * [new tag] ciflow/rocm-mi300/148492 -> ciflow/rocm-mi300/148492 2025-03-14T05:31:38.0243374Z * [new tag] ciflow/rocm-mi300/148916 -> ciflow/rocm-mi300/148916 2025-03-14T05:31:38.0244682Z * [new tag] ciflow/rocm-mi300/148945 -> ciflow/rocm-mi300/148945 2025-03-14T05:31:38.0246222Z * [new tag] ciflow/rocm/124424 -> ciflow/rocm/124424 2025-03-14T05:31:38.0247042Z * [new tag] ciflow/rocm/139469 -> ciflow/rocm/139469 2025-03-14T05:31:38.0248384Z * [new tag] ciflow/rocm/139975 -> ciflow/rocm/139975 2025-03-14T05:31:38.0249371Z * [new tag] ciflow/rocm/140989 -> ciflow/rocm/140989 2025-03-14T05:31:38.0250670Z * [new tag] ciflow/rocm/141309 -> ciflow/rocm/141309 2025-03-14T05:31:38.0251535Z * [new tag] ciflow/rocm/142097 -> ciflow/rocm/142097 2025-03-14T05:31:38.0252877Z * [new tag] ciflow/rocm/142859 -> ciflow/rocm/142859 2025-03-14T05:31:38.0254026Z * [new tag] ciflow/rocm/143416 -> ciflow/rocm/143416 2025-03-14T05:31:38.0255298Z * [new tag] ciflow/rocm/143971 -> ciflow/rocm/143971 2025-03-14T05:31:38.0256148Z * [new tag] ciflow/rocm/144120 -> ciflow/rocm/144120 2025-03-14T05:31:38.0257596Z * [new tag] ciflow/rocm/144572 -> ciflow/rocm/144572 2025-03-14T05:31:38.0259156Z * [new tag] ciflow/rocm/144664 -> ciflow/rocm/144664 2025-03-14T05:31:38.0260710Z * [new tag] ciflow/rocm/145130 -> ciflow/rocm/145130 
2025-03-14T05:31:38.0262086Z * [new tag] ciflow/rocm/145475 -> ciflow/rocm/145475 2025-03-14T05:31:38.0263403Z * [new tag] ciflow/rocm/145584 -> ciflow/rocm/145584 2025-03-14T05:31:38.0264779Z * [new tag] ciflow/rocm/145685 -> ciflow/rocm/145685 2025-03-14T05:31:38.0265996Z * [new tag] ciflow/rocm/146264 -> ciflow/rocm/146264 2025-03-14T05:31:38.0267274Z * [new tag] ciflow/rocm/146448 -> ciflow/rocm/146448 2025-03-14T05:31:38.0268806Z * [new tag] ciflow/rocm/146903 -> ciflow/rocm/146903 2025-03-14T05:31:38.0270240Z * [new tag] ciflow/rocm/147315 -> ciflow/rocm/147315 2025-03-14T05:31:38.0271645Z * [new tag] ciflow/rocm/147382 -> ciflow/rocm/147382 2025-03-14T05:31:38.0272815Z * [new tag] ciflow/rocm/147452 -> ciflow/rocm/147452 2025-03-14T05:31:38.0274339Z * [new tag] ciflow/rocm/147527 -> ciflow/rocm/147527 2025-03-14T05:31:38.0275413Z * [new tag] ciflow/rocm/147821 -> ciflow/rocm/147821 2025-03-14T05:31:38.0276872Z * [new tag] ciflow/rocm/148327 -> ciflow/rocm/148327 2025-03-14T05:31:38.0278225Z * [new tag] ciflow/rocm/148355 -> ciflow/rocm/148355 2025-03-14T05:31:38.0279423Z * [new tag] ciflow/rocm/148394 -> ciflow/rocm/148394 2025-03-14T05:31:38.0280324Z * [new tag] ciflow/rocm/148492 -> ciflow/rocm/148492 2025-03-14T05:31:38.0281761Z * [new tag] ciflow/rocm/148672 -> ciflow/rocm/148672 2025-03-14T05:31:38.0282934Z * [new tag] ciflow/rocm/148864 -> ciflow/rocm/148864 2025-03-14T05:31:38.0284188Z * [new tag] ciflow/rocm/148880 -> ciflow/rocm/148880 2025-03-14T05:31:38.0285473Z * [new tag] ciflow/rocm/148911 -> ciflow/rocm/148911 2025-03-14T05:31:38.0286723Z * [new tag] ciflow/rocm/148916 -> ciflow/rocm/148916 2025-03-14T05:31:38.0287938Z * [new tag] ciflow/rocm/148945 -> ciflow/rocm/148945 2025-03-14T05:31:38.0289134Z * [new tag] ciflow/rocm/149039 -> ciflow/rocm/149039 2025-03-14T05:31:38.0290466Z * [new tag] ciflow/rocm/149041 -> ciflow/rocm/149041 2025-03-14T05:31:38.0291694Z * [new tag] ciflow/rocm/149145 -> ciflow/rocm/149145 2025-03-14T05:31:38.0293184Z * [new tag] ciflow/s390/142346 -> ciflow/s390/142346 2025-03-14T05:31:38.0294079Z * [new tag] ciflow/s390/143959 -> ciflow/s390/143959 2025-03-14T05:31:38.0295473Z * [new tag] ciflow/s390/148452 -> ciflow/s390/148452 2025-03-14T05:31:38.0297102Z * [new tag] ciflow/slow/01c7106 -> ciflow/slow/01c7106 2025-03-14T05:31:38.0298297Z * [new tag] ciflow/slow/0577043 -> ciflow/slow/0577043 2025-03-14T05:31:38.0299930Z * [new tag] ciflow/slow/0d5b74da0cab798fbfdb9caa53fad816999c8386-sdym -> ciflow/slow/0d5b74da0cab798fbfdb9caa53fad816999c8386-sdym 2025-03-14T05:31:38.0300653Z * [new tag] ciflow/slow/0e81104 -> ciflow/slow/0e81104 2025-03-14T05:31:38.0302064Z * [new tag] ciflow/slow/139975 -> ciflow/slow/139975 2025-03-14T05:31:38.0302942Z * [new tag] ciflow/slow/146903 -> ciflow/slow/146903 2025-03-14T05:31:38.0304526Z * [new tag] ciflow/slow/1732077 -> ciflow/slow/1732077 2025-03-14T05:31:38.0305940Z * [new tag] ciflow/slow/187eb7c -> ciflow/slow/187eb7c 2025-03-14T05:31:38.0307647Z * [new tag] ciflow/slow/1faef89 -> ciflow/slow/1faef89 2025-03-14T05:31:38.0309762Z * [new tag] ciflow/slow/3920ec1 -> ciflow/slow/3920ec1 2025-03-14T05:31:38.0311328Z * [new tag] ciflow/slow/3b7c6b2 -> ciflow/slow/3b7c6b2 2025-03-14T05:31:38.0312754Z * [new tag] ciflow/slow/59a3759 -> ciflow/slow/59a3759 2025-03-14T05:31:38.0314077Z * [new tag] ciflow/slow/70ef0bb -> ciflow/slow/70ef0bb 2025-03-14T05:31:38.0315520Z * [new tag] ciflow/slow/788ff06 -> ciflow/slow/788ff06 2025-03-14T05:31:38.0317299Z * [new tag] 
ciflow/slow/8751002215790a3a88750faa8f4366933e296693-sdym -> ciflow/slow/8751002215790a3a88750faa8f4366933e296693-sdym 2025-03-14T05:31:38.0318126Z * [new tag] ciflow/slow/9d85864 -> ciflow/slow/9d85864 2025-03-14T05:31:38.0319659Z * [new tag] ciflow/slow/9ffad5b -> ciflow/slow/9ffad5b 2025-03-14T05:31:38.0321121Z * [new tag] ciflow/slow/a206e8b -> ciflow/slow/a206e8b 2025-03-14T05:31:38.0322518Z * [new tag] ciflow/slow/a837609 -> ciflow/slow/a837609 2025-03-14T05:31:38.0323905Z * [new tag] ciflow/slow/af841f3 -> ciflow/slow/af841f3 2025-03-14T05:31:38.0325745Z * [new tag] ciflow/slow/da3aba1e46157c4df504b067477cdf2b3c96b194-sdym -> ciflow/slow/da3aba1e46157c4df504b067477cdf2b3c96b194-sdym 2025-03-14T05:31:38.0326993Z * [new tag] ciflow/trunk/101814 -> ciflow/trunk/101814 2025-03-14T05:31:38.0327999Z * [new tag] ciflow/trunk/108303 -> ciflow/trunk/108303 2025-03-14T05:31:38.0329278Z * [new tag] ciflow/trunk/113257 -> ciflow/trunk/113257 2025-03-14T05:31:38.0330440Z * [new tag] ciflow/trunk/113258 -> ciflow/trunk/113258 2025-03-14T05:31:38.0331314Z * [new tag] ciflow/trunk/120076 -> ciflow/trunk/120076 2025-03-14T05:31:38.0332694Z * [new tag] ciflow/trunk/121445 -> ciflow/trunk/121445 2025-03-14T05:31:38.0333839Z * [new tag] ciflow/trunk/123020 -> ciflow/trunk/123020 2025-03-14T05:31:38.0335037Z * [new tag] ciflow/trunk/124424 -> ciflow/trunk/124424 2025-03-14T05:31:38.0335945Z * [new tag] ciflow/trunk/124490 -> ciflow/trunk/124490 2025-03-14T05:31:38.0337290Z * [new tag] ciflow/trunk/125806 -> ciflow/trunk/125806 2025-03-14T05:31:38.0338626Z * [new tag] ciflow/trunk/125888 -> ciflow/trunk/125888 2025-03-14T05:31:38.0340214Z * [new tag] ciflow/trunk/125995 -> ciflow/trunk/125995 2025-03-14T05:31:38.0341841Z * [new tag] ciflow/trunk/126050 -> ciflow/trunk/126050 2025-03-14T05:31:38.0343328Z * [new tag] ciflow/trunk/126054 -> ciflow/trunk/126054 2025-03-14T05:31:38.0344846Z * [new tag] ciflow/trunk/126635 -> ciflow/trunk/126635 2025-03-14T05:31:38.0346148Z * [new tag] ciflow/trunk/127171 -> ciflow/trunk/127171 2025-03-14T05:31:38.0347466Z * [new tag] ciflow/trunk/127919 -> ciflow/trunk/127919 2025-03-14T05:31:38.0348701Z * [new tag] ciflow/trunk/129352 -> ciflow/trunk/129352 2025-03-14T05:31:38.0350025Z * [new tag] ciflow/trunk/129420 -> ciflow/trunk/129420 2025-03-14T05:31:38.0351207Z * [new tag] ciflow/trunk/130141 -> ciflow/trunk/130141 2025-03-14T05:31:38.0352442Z * [new tag] ciflow/trunk/130752 -> ciflow/trunk/130752 2025-03-14T05:31:38.0353656Z * [new tag] ciflow/trunk/131354 -> ciflow/trunk/131354 2025-03-14T05:31:38.0354987Z * [new tag] ciflow/trunk/131507 -> ciflow/trunk/131507 2025-03-14T05:31:38.0356209Z * [new tag] ciflow/trunk/132021 -> ciflow/trunk/132021 2025-03-14T05:31:38.0357406Z * [new tag] ciflow/trunk/133044 -> ciflow/trunk/133044 2025-03-14T05:31:38.0358646Z * [new tag] ciflow/trunk/133289 -> ciflow/trunk/133289 2025-03-14T05:31:38.0359854Z * [new tag] ciflow/trunk/133296 -> ciflow/trunk/133296 2025-03-14T05:31:38.0361034Z * [new tag] ciflow/trunk/133297 -> ciflow/trunk/133297 2025-03-14T05:31:38.0362237Z * [new tag] ciflow/trunk/133315 -> ciflow/trunk/133315 2025-03-14T05:31:38.0363444Z * [new tag] ciflow/trunk/133392 -> ciflow/trunk/133392 2025-03-14T05:31:38.0364746Z * [new tag] ciflow/trunk/133419 -> ciflow/trunk/133419 2025-03-14T05:31:38.0365909Z * [new tag] ciflow/trunk/133423 -> ciflow/trunk/133423 2025-03-14T05:31:38.0367144Z * [new tag] ciflow/trunk/133667 -> ciflow/trunk/133667 2025-03-14T05:31:38.0368603Z * [new tag] ciflow/trunk/133753 -> ciflow/trunk/133753 
2025-03-14T05:31:38.0370162Z * [new tag] ciflow/trunk/134219 -> ciflow/trunk/134219 2025-03-14T05:31:38.0371057Z * [new tag] ciflow/trunk/135058 -> ciflow/trunk/135058 2025-03-14T05:31:38.0372707Z * [new tag] ciflow/trunk/135631 -> ciflow/trunk/135631 2025-03-14T05:31:38.0374630Z * [new tag] ciflow/trunk/136780 -> ciflow/trunk/136780 2025-03-14T05:31:38.0376046Z * [new tag] ciflow/trunk/136824 -> ciflow/trunk/136824 2025-03-14T05:31:38.0377343Z * [new tag] ciflow/trunk/136835 -> ciflow/trunk/136835 2025-03-14T05:31:38.0378559Z * [new tag] ciflow/trunk/137400 -> ciflow/trunk/137400 2025-03-14T05:31:38.0379732Z * [new tag] ciflow/trunk/137580 -> ciflow/trunk/137580 2025-03-14T05:31:38.0381094Z * [new tag] ciflow/trunk/138436 -> ciflow/trunk/138436 2025-03-14T05:31:38.0381992Z * [new tag] ciflow/trunk/138626 -> ciflow/trunk/138626 2025-03-14T05:31:38.0383413Z * [new tag] ciflow/trunk/138834 -> ciflow/trunk/138834 2025-03-14T05:31:38.0384736Z * [new tag] ciflow/trunk/138996 -> ciflow/trunk/138996 2025-03-14T05:31:38.0386071Z * [new tag] ciflow/trunk/139070 -> ciflow/trunk/139070 2025-03-14T05:31:38.0387087Z * [new tag] ciflow/trunk/139094 -> ciflow/trunk/139094 2025-03-14T05:31:38.0388722Z * [new tag] ciflow/trunk/139171 -> ciflow/trunk/139171 2025-03-14T05:31:38.0389665Z * [new tag] ciflow/trunk/139971 -> ciflow/trunk/139971 2025-03-14T05:31:38.0391278Z * [new tag] ciflow/trunk/139975 -> ciflow/trunk/139975 2025-03-14T05:31:38.0392141Z * [new tag] ciflow/trunk/140159 -> ciflow/trunk/140159 2025-03-14T05:31:38.0393681Z * [new tag] ciflow/trunk/140200 -> ciflow/trunk/140200 2025-03-14T05:31:38.0395138Z * [new tag] ciflow/trunk/140211 -> ciflow/trunk/140211 2025-03-14T05:31:38.0396390Z * [new tag] ciflow/trunk/140298 -> ciflow/trunk/140298 2025-03-14T05:31:38.0397291Z * [new tag] ciflow/trunk/140323 -> ciflow/trunk/140323 2025-03-14T05:31:38.0398730Z * [new tag] ciflow/trunk/140365 -> ciflow/trunk/140365 2025-03-14T05:31:38.0400105Z * [new tag] ciflow/trunk/140399 -> ciflow/trunk/140399 2025-03-14T05:31:38.0401362Z * [new tag] ciflow/trunk/140756 -> ciflow/trunk/140756 2025-03-14T05:31:38.0402565Z * [new tag] ciflow/trunk/140979 -> ciflow/trunk/140979 2025-03-14T05:31:38.0403774Z * [new tag] ciflow/trunk/140989 -> ciflow/trunk/140989 2025-03-14T05:31:38.0405199Z * [new tag] ciflow/trunk/141257 -> ciflow/trunk/141257 2025-03-14T05:31:38.0406442Z * [new tag] ciflow/trunk/141309 -> ciflow/trunk/141309 2025-03-14T05:31:38.0407645Z * [new tag] ciflow/trunk/141730 -> ciflow/trunk/141730 2025-03-14T05:31:38.0409003Z * [new tag] ciflow/trunk/141796 -> ciflow/trunk/141796 2025-03-14T05:31:38.0410255Z * [new tag] ciflow/trunk/141842 -> ciflow/trunk/141842 2025-03-14T05:31:38.0411463Z * [new tag] ciflow/trunk/141910 -> ciflow/trunk/141910 2025-03-14T05:31:38.0412687Z * [new tag] ciflow/trunk/141961 -> ciflow/trunk/141961 2025-03-14T05:31:38.0413910Z * [new tag] ciflow/trunk/142097 -> ciflow/trunk/142097 2025-03-14T05:31:38.0415103Z * [new tag] ciflow/trunk/142179 -> ciflow/trunk/142179 2025-03-14T05:31:38.0416415Z * [new tag] ciflow/trunk/142272 -> ciflow/trunk/142272 2025-03-14T05:31:38.0417808Z * [new tag] ciflow/trunk/142326 -> ciflow/trunk/142326 2025-03-14T05:31:38.0418891Z * [new tag] ciflow/trunk/142346 -> ciflow/trunk/142346 2025-03-14T05:31:38.0420343Z * [new tag] ciflow/trunk/142372 -> ciflow/trunk/142372 2025-03-14T05:31:38.0421816Z * [new tag] ciflow/trunk/142821 -> ciflow/trunk/142821 2025-03-14T05:31:38.0429276Z * [new tag] ciflow/trunk/142859 -> ciflow/trunk/142859 
2025-03-14T05:31:38.0429699Z * [new tag] ciflow/trunk/143093 -> ciflow/trunk/143093 2025-03-14T05:31:38.0429892Z * [new tag] ciflow/trunk/143261 -> ciflow/trunk/143261 2025-03-14T05:31:38.0430064Z * [new tag] ciflow/trunk/143303 -> ciflow/trunk/143303 2025-03-14T05:31:38.0430241Z * [new tag] ciflow/trunk/143313 -> ciflow/trunk/143313 2025-03-14T05:31:38.0430420Z * [new tag] ciflow/trunk/143347 -> ciflow/trunk/143347 2025-03-14T05:31:38.0430624Z * [new tag] ciflow/trunk/143402 -> ciflow/trunk/143402 2025-03-14T05:31:38.0432338Z * [new tag] ciflow/trunk/143416 -> ciflow/trunk/143416 2025-03-14T05:31:38.0433688Z * [new tag] ciflow/trunk/143451 -> ciflow/trunk/143451 2025-03-14T05:31:38.0434930Z * [new tag] ciflow/trunk/143475 -> ciflow/trunk/143475 2025-03-14T05:31:38.0436364Z * [new tag] ciflow/trunk/143630 -> ciflow/trunk/143630 2025-03-14T05:31:38.0437229Z * [new tag] ciflow/trunk/143666 -> ciflow/trunk/143666 2025-03-14T05:31:38.0438680Z * [new tag] ciflow/trunk/143671 -> ciflow/trunk/143671 2025-03-14T05:31:38.0440125Z * [new tag] ciflow/trunk/143689 -> ciflow/trunk/143689 2025-03-14T05:31:38.0441356Z * [new tag] ciflow/trunk/143712 -> ciflow/trunk/143712 2025-03-14T05:31:38.0442698Z * [new tag] ciflow/trunk/143733 -> ciflow/trunk/143733 2025-03-14T05:31:38.0443996Z * [new tag] ciflow/trunk/143822 -> ciflow/trunk/143822 2025-03-14T05:31:38.0445339Z * [new tag] ciflow/trunk/143833 -> ciflow/trunk/143833 2025-03-14T05:31:38.0446803Z * [new tag] ciflow/trunk/143894 -> ciflow/trunk/143894 2025-03-14T05:31:38.0448046Z * [new tag] ciflow/trunk/143896 -> ciflow/trunk/143896 2025-03-14T05:31:38.0449283Z * [new tag] ciflow/trunk/143961 -> ciflow/trunk/143961 2025-03-14T05:31:38.0450503Z * [new tag] ciflow/trunk/143966 -> ciflow/trunk/143966 2025-03-14T05:31:38.0451952Z * [new tag] ciflow/trunk/144017 -> ciflow/trunk/144017 2025-03-14T05:31:38.0452860Z * [new tag] ciflow/trunk/144019 -> ciflow/trunk/144019 2025-03-14T05:31:38.0454297Z * [new tag] ciflow/trunk/144120 -> ciflow/trunk/144120 2025-03-14T05:31:38.0455657Z * [new tag] ciflow/trunk/144138 -> ciflow/trunk/144138 2025-03-14T05:31:38.0457443Z * [new tag] ciflow/trunk/144172 -> ciflow/trunk/144172 2025-03-14T05:31:38.0458785Z * [new tag] ciflow/trunk/144177 -> ciflow/trunk/144177 2025-03-14T05:31:38.0460220Z * [new tag] ciflow/trunk/144268 -> ciflow/trunk/144268 2025-03-14T05:31:38.0461456Z * [new tag] ciflow/trunk/144272 -> ciflow/trunk/144272 2025-03-14T05:31:38.0462679Z * [new tag] ciflow/trunk/144293 -> ciflow/trunk/144293 2025-03-14T05:31:38.0463931Z * [new tag] ciflow/trunk/144452 -> ciflow/trunk/144452 2025-03-14T05:31:38.0465328Z * [new tag] ciflow/trunk/144468 -> ciflow/trunk/144468 2025-03-14T05:31:38.0466568Z * [new tag] ciflow/trunk/144557 -> ciflow/trunk/144557 2025-03-14T05:31:38.0467900Z * [new tag] ciflow/trunk/144572 -> ciflow/trunk/144572 2025-03-14T05:31:38.0473241Z * [new tag] ciflow/trunk/144616 -> ciflow/trunk/144616 2025-03-14T05:31:38.0474345Z * [new tag] ciflow/trunk/144621 -> ciflow/trunk/144621 2025-03-14T05:31:38.0475956Z * [new tag] ciflow/trunk/144664 -> ciflow/trunk/144664 2025-03-14T05:31:38.0477166Z * [new tag] ciflow/trunk/144721 -> ciflow/trunk/144721 2025-03-14T05:31:38.0478456Z * [new tag] ciflow/trunk/144733 -> ciflow/trunk/144733 2025-03-14T05:31:38.0479664Z * [new tag] ciflow/trunk/144771 -> ciflow/trunk/144771 2025-03-14T05:31:38.0481201Z * [new tag] ciflow/trunk/144844 -> ciflow/trunk/144844 2025-03-14T05:31:38.0482036Z * [new tag] ciflow/trunk/144880 -> ciflow/trunk/144880 
2025-03-14T05:31:38.0483470Z * [new tag] ciflow/trunk/144925 -> ciflow/trunk/144925 2025-03-14T05:31:38.0484741Z * [new tag] ciflow/trunk/144953 -> ciflow/trunk/144953 2025-03-14T05:31:38.0485963Z * [new tag] ciflow/trunk/144975 -> ciflow/trunk/144975 2025-03-14T05:31:38.0487186Z * [new tag] ciflow/trunk/144992 -> ciflow/trunk/144992 2025-03-14T05:31:38.0488409Z * [new tag] ciflow/trunk/145061 -> ciflow/trunk/145061 2025-03-14T05:31:38.0489793Z * [new tag] ciflow/trunk/145116 -> ciflow/trunk/145116 2025-03-14T05:31:38.0490667Z * [new tag] ciflow/trunk/145119 -> ciflow/trunk/145119 2025-03-14T05:31:38.0492132Z * [new tag] ciflow/trunk/145130 -> ciflow/trunk/145130 2025-03-14T05:31:38.0493592Z * [new tag] ciflow/trunk/145136 -> ciflow/trunk/145136 2025-03-14T05:31:38.0494948Z * [new tag] ciflow/trunk/145153 -> ciflow/trunk/145153 2025-03-14T05:31:38.0496133Z * [new tag] ciflow/trunk/145224 -> ciflow/trunk/145224 2025-03-14T05:31:38.0497363Z * [new tag] ciflow/trunk/145241 -> ciflow/trunk/145241 2025-03-14T05:31:38.0498564Z * [new tag] ciflow/trunk/145254 -> ciflow/trunk/145254 2025-03-14T05:31:38.0499810Z * [new tag] ciflow/trunk/145331 -> ciflow/trunk/145331 2025-03-14T05:31:38.0501191Z * [new tag] ciflow/trunk/145406 -> ciflow/trunk/145406 2025-03-14T05:31:38.0502400Z * [new tag] ciflow/trunk/145523 -> ciflow/trunk/145523 2025-03-14T05:31:38.0503625Z * [new tag] ciflow/trunk/145559 -> ciflow/trunk/145559 2025-03-14T05:31:38.0505124Z * [new tag] ciflow/trunk/145600 -> ciflow/trunk/145600 2025-03-14T05:31:38.0506473Z * [new tag] ciflow/trunk/145674 -> ciflow/trunk/145674 2025-03-14T05:31:38.0507856Z * [new tag] ciflow/trunk/145677 -> ciflow/trunk/145677 2025-03-14T05:31:38.0509269Z * [new tag] ciflow/trunk/145719 -> ciflow/trunk/145719 2025-03-14T05:31:38.0510510Z * [new tag] ciflow/trunk/145936 -> ciflow/trunk/145936 2025-03-14T05:31:38.0511761Z * [new tag] ciflow/trunk/145979 -> ciflow/trunk/145979 2025-03-14T05:31:38.0513009Z * [new tag] ciflow/trunk/146051 -> ciflow/trunk/146051 2025-03-14T05:31:38.0514400Z * [new tag] ciflow/trunk/146090 -> ciflow/trunk/146090 2025-03-14T05:31:38.0515694Z * [new tag] ciflow/trunk/146115 -> ciflow/trunk/146115 2025-03-14T05:31:38.0516906Z * [new tag] ciflow/trunk/146135 -> ciflow/trunk/146135 2025-03-14T05:31:38.0518180Z * [new tag] ciflow/trunk/146176 -> ciflow/trunk/146176 2025-03-14T05:31:38.0519661Z * [new tag] ciflow/trunk/146182 -> ciflow/trunk/146182 2025-03-14T05:31:38.0520895Z * [new tag] ciflow/trunk/146275 -> ciflow/trunk/146275 2025-03-14T05:31:38.0522245Z * [new tag] ciflow/trunk/146289 -> ciflow/trunk/146289 2025-03-14T05:31:38.0523303Z * [new tag] ciflow/trunk/146335 -> ciflow/trunk/146335 2025-03-14T05:31:38.0524822Z * [new tag] ciflow/trunk/146421 -> ciflow/trunk/146421 2025-03-14T05:31:38.0526238Z * [new tag] ciflow/trunk/146489 -> ciflow/trunk/146489 2025-03-14T05:31:38.0527564Z * [new tag] ciflow/trunk/146517 -> ciflow/trunk/146517 2025-03-14T05:31:38.0528848Z * [new tag] ciflow/trunk/146530 -> ciflow/trunk/146530 2025-03-14T05:31:38.0530030Z * [new tag] ciflow/trunk/146561 -> ciflow/trunk/146561 2025-03-14T05:31:38.0531245Z * [new tag] ciflow/trunk/146562 -> ciflow/trunk/146562 2025-03-14T05:31:38.0532478Z * [new tag] ciflow/trunk/146573 -> ciflow/trunk/146573 2025-03-14T05:31:38.0533726Z * [new tag] ciflow/trunk/146661 -> ciflow/trunk/146661 2025-03-14T05:31:38.0535022Z * [new tag] ciflow/trunk/146718 -> ciflow/trunk/146718 2025-03-14T05:31:38.0536219Z * [new tag] ciflow/trunk/146777 -> ciflow/trunk/146777 
2025-03-14T05:31:38.0537638Z * [new tag] ciflow/trunk/146807 -> ciflow/trunk/146807 2025-03-14T05:31:38.0538961Z * [new tag] ciflow/trunk/146823 -> ciflow/trunk/146823 2025-03-14T05:31:38.0539861Z * [new tag] ciflow/trunk/146826 -> ciflow/trunk/146826 2025-03-14T05:31:38.0541284Z * [new tag] ciflow/trunk/146827 -> ciflow/trunk/146827 2025-03-14T05:31:38.0542557Z * [new tag] ciflow/trunk/146845 -> ciflow/trunk/146845 2025-03-14T05:31:38.0544366Z * [new tag] ciflow/trunk/146874 -> ciflow/trunk/146874 2025-03-14T05:31:38.0545628Z * [new tag] ciflow/trunk/146903 -> ciflow/trunk/146903 2025-03-14T05:31:38.0546946Z * [new tag] ciflow/trunk/146911 -> ciflow/trunk/146911 2025-03-14T05:31:38.0548183Z * [new tag] ciflow/trunk/146928 -> ciflow/trunk/146928 2025-03-14T05:31:38.0549459Z * [new tag] ciflow/trunk/147014 -> ciflow/trunk/147014 2025-03-14T05:31:38.0550691Z * [new tag] ciflow/trunk/147072 -> ciflow/trunk/147072 2025-03-14T05:31:38.0551896Z * [new tag] ciflow/trunk/147105 -> ciflow/trunk/147105 2025-03-14T05:31:38.0553150Z * [new tag] ciflow/trunk/147155 -> ciflow/trunk/147155 2025-03-14T05:31:38.0554636Z * [new tag] ciflow/trunk/147260 -> ciflow/trunk/147260 2025-03-14T05:31:38.0555913Z * [new tag] ciflow/trunk/147272 -> ciflow/trunk/147272 2025-03-14T05:31:38.0557156Z * [new tag] ciflow/trunk/147314 -> ciflow/trunk/147314 2025-03-14T05:31:38.0558398Z * [new tag] ciflow/trunk/147349 -> ciflow/trunk/147349 2025-03-14T05:31:38.0559620Z * [new tag] ciflow/trunk/147368 -> ciflow/trunk/147368 2025-03-14T05:31:38.0560946Z * [new tag] ciflow/trunk/147422 -> ciflow/trunk/147422 2025-03-14T05:31:38.0562447Z * [new tag] ciflow/trunk/147433 -> ciflow/trunk/147433 2025-03-14T05:31:38.0563691Z * [new tag] ciflow/trunk/147452 -> ciflow/trunk/147452 2025-03-14T05:31:38.0564991Z * [new tag] ciflow/trunk/147481 -> ciflow/trunk/147481 2025-03-14T05:31:38.0566173Z * [new tag] ciflow/trunk/147498 -> ciflow/trunk/147498 2025-03-14T05:31:38.0567386Z * [new tag] ciflow/trunk/147507 -> ciflow/trunk/147507 2025-03-14T05:31:38.0568828Z * [new tag] ciflow/trunk/147527 -> ciflow/trunk/147527 2025-03-14T05:31:38.0570204Z * [new tag] ciflow/trunk/147583 -> ciflow/trunk/147583 2025-03-14T05:31:38.0571595Z * [new tag] ciflow/trunk/147593 -> ciflow/trunk/147593 2025-03-14T05:31:38.0572935Z * [new tag] ciflow/trunk/147656 -> ciflow/trunk/147656 2025-03-14T05:31:38.0574140Z * [new tag] ciflow/trunk/147664 -> ciflow/trunk/147664 2025-03-14T05:31:38.0575525Z * [new tag] ciflow/trunk/147670 -> ciflow/trunk/147670 2025-03-14T05:31:38.0576846Z * [new tag] ciflow/trunk/147723 -> ciflow/trunk/147723 2025-03-14T05:31:38.0578093Z * [new tag] ciflow/trunk/147752 -> ciflow/trunk/147752 2025-03-14T05:31:38.0579344Z * [new tag] ciflow/trunk/147797 -> ciflow/trunk/147797 2025-03-14T05:31:38.0580753Z * [new tag] ciflow/trunk/147808 -> ciflow/trunk/147808 2025-03-14T05:31:38.0582391Z * [new tag] ciflow/trunk/147820 -> ciflow/trunk/147820 2025-03-14T05:31:38.0583654Z * [new tag] ciflow/trunk/147821 -> ciflow/trunk/147821 2025-03-14T05:31:38.0584945Z * [new tag] ciflow/trunk/147870 -> ciflow/trunk/147870 2025-03-14T05:31:38.0586170Z * [new tag] ciflow/trunk/147881 -> ciflow/trunk/147881 2025-03-14T05:31:38.0587529Z * [new tag] ciflow/trunk/147897 -> ciflow/trunk/147897 2025-03-14T05:31:38.0588447Z * [new tag] ciflow/trunk/147910 -> ciflow/trunk/147910 2025-03-14T05:31:38.0589905Z * [new tag] ciflow/trunk/147917 -> ciflow/trunk/147917 2025-03-14T05:31:38.0591091Z * [new tag] ciflow/trunk/147962 -> ciflow/trunk/147962 
2025-03-14T05:31:38.0592325Z * [new tag] ciflow/trunk/148024 -> ciflow/trunk/148024 2025-03-14T05:31:38.0593629Z * [new tag] ciflow/trunk/148070 -> ciflow/trunk/148070 2025-03-14T05:31:38.0595019Z * [new tag] ciflow/trunk/148130 -> ciflow/trunk/148130 2025-03-14T05:31:38.0596248Z * [new tag] ciflow/trunk/148131 -> ciflow/trunk/148131 2025-03-14T05:31:38.0597710Z * [new tag] ciflow/trunk/148140 -> ciflow/trunk/148140 2025-03-14T05:31:38.0598938Z * [new tag] ciflow/trunk/148163 -> ciflow/trunk/148163 2025-03-14T05:31:38.0600194Z * [new tag] ciflow/trunk/148173 -> ciflow/trunk/148173 2025-03-14T05:31:38.0601402Z * [new tag] ciflow/trunk/148180 -> ciflow/trunk/148180 2025-03-14T05:31:38.0602901Z * [new tag] ciflow/trunk/148258 -> ciflow/trunk/148258 2025-03-14T05:31:38.0604307Z * [new tag] ciflow/trunk/148281 -> ciflow/trunk/148281 2025-03-14T05:31:38.0605572Z * [new tag] ciflow/trunk/148492 -> ciflow/trunk/148492 2025-03-14T05:31:38.0606821Z * [new tag] ciflow/trunk/148502 -> ciflow/trunk/148502 2025-03-14T05:31:38.0608065Z * [new tag] ciflow/trunk/148503 -> ciflow/trunk/148503 2025-03-14T05:31:38.0609293Z * [new tag] ciflow/trunk/148517 -> ciflow/trunk/148517 2025-03-14T05:31:38.0610533Z * [new tag] ciflow/trunk/148554 -> ciflow/trunk/148554 2025-03-14T05:31:38.0612207Z * [new tag] ciflow/trunk/148561 -> ciflow/trunk/148561 2025-03-14T05:31:38.0612956Z * [new tag] ciflow/trunk/148611 -> ciflow/trunk/148611 2025-03-14T05:31:38.0614399Z * [new tag] ciflow/trunk/148622 -> ciflow/trunk/148622 2025-03-14T05:31:38.0615834Z * [new tag] ciflow/trunk/148646 -> ciflow/trunk/148646 2025-03-14T05:31:38.0617105Z * [new tag] ciflow/trunk/148684 -> ciflow/trunk/148684 2025-03-14T05:31:38.0618330Z * [new tag] ciflow/trunk/148704 -> ciflow/trunk/148704 2025-03-14T05:31:38.0619580Z * [new tag] ciflow/trunk/148708 -> ciflow/trunk/148708 2025-03-14T05:31:38.0620574Z * [new tag] ciflow/trunk/148772 -> ciflow/trunk/148772 2025-03-14T05:31:38.0622030Z * [new tag] ciflow/trunk/148773 -> ciflow/trunk/148773 2025-03-14T05:31:38.0623268Z * [new tag] ciflow/trunk/148800 -> ciflow/trunk/148800 2025-03-14T05:31:38.0624483Z * [new tag] ciflow/trunk/148823 -> ciflow/trunk/148823 2025-03-14T05:31:38.0625732Z * [new tag] ciflow/trunk/148834 -> ciflow/trunk/148834 2025-03-14T05:31:38.0627450Z * [new tag] ciflow/trunk/148864 -> ciflow/trunk/148864 2025-03-14T05:31:38.0628812Z * [new tag] ciflow/trunk/148875 -> ciflow/trunk/148875 2025-03-14T05:31:38.0630059Z * [new tag] ciflow/trunk/148878 -> ciflow/trunk/148878 2025-03-14T05:31:38.0631310Z * [new tag] ciflow/trunk/148880 -> ciflow/trunk/148880 2025-03-14T05:31:38.0632546Z * [new tag] ciflow/trunk/148890 -> ciflow/trunk/148890 2025-03-14T05:31:38.0633977Z * [new tag] ciflow/trunk/148900 -> ciflow/trunk/148900 2025-03-14T05:31:38.0635577Z * [new tag] ciflow/trunk/148903 -> ciflow/trunk/148903 2025-03-14T05:31:38.0637058Z * [new tag] ciflow/trunk/148919 -> ciflow/trunk/148919 2025-03-14T05:31:38.0638413Z * [new tag] ciflow/trunk/148936 -> ciflow/trunk/148936 2025-03-14T05:31:38.0639603Z * [new tag] ciflow/trunk/148997 -> ciflow/trunk/148997 2025-03-14T05:31:38.0640844Z * [new tag] ciflow/trunk/149007 -> ciflow/trunk/149007 2025-03-14T05:31:38.0642251Z * [new tag] ciflow/trunk/149018 -> ciflow/trunk/149018 2025-03-14T05:31:38.0643473Z * [new tag] ciflow/trunk/149041 -> ciflow/trunk/149041 2025-03-14T05:31:38.0645025Z * [new tag] ciflow/trunk/149053 -> ciflow/trunk/149053 2025-03-14T05:31:38.0646257Z * [new tag] ciflow/trunk/149054 -> ciflow/trunk/149054 
2025-03-14T05:31:38.0647504Z * [new tag] ciflow/trunk/149064 -> ciflow/trunk/149064 2025-03-14T05:31:38.0648724Z * [new tag] ciflow/trunk/149074 -> ciflow/trunk/149074 2025-03-14T05:31:38.0650185Z * [new tag] ciflow/trunk/149098 -> ciflow/trunk/149098 2025-03-14T05:31:38.0651560Z * [new tag] ciflow/trunk/149113 -> ciflow/trunk/149113 2025-03-14T05:31:38.0652843Z * [new tag] ciflow/trunk/149136 -> ciflow/trunk/149136 2025-03-14T05:31:38.0654171Z * [new tag] ciflow/trunk/149146 -> ciflow/trunk/149146 2025-03-14T05:31:38.0655569Z * [new tag] ciflow/trunk/149159 -> ciflow/trunk/149159 2025-03-14T05:31:38.0656930Z * [new tag] ciflow/trunk/70978 -> ciflow/trunk/70978 2025-03-14T05:31:38.0658346Z * [new tag] ciflow/trunk/70979 -> ciflow/trunk/70979 2025-03-14T05:31:38.0659988Z * [new tag] ciflow/unstable/123 -> ciflow/unstable/123 2025-03-14T05:31:38.0661272Z * [new tag] ciflow/unstable/146104 -> ciflow/unstable/146104 2025-03-14T05:31:38.0662163Z * [new tag] ciflow/unstable/146264 -> ciflow/unstable/146264 2025-03-14T05:31:38.0663942Z * [new tag] ciflow/win-arm64/148753 -> ciflow/win-arm64/148753 2025-03-14T05:31:38.0665319Z * [new tag] ciflow/xpu/137566 -> ciflow/xpu/137566 2025-03-14T05:31:38.0666210Z * [new tag] ciflow/xpu/137580 -> ciflow/xpu/137580 2025-03-14T05:31:38.0667532Z * [new tag] ciflow/xpu/138996 -> ciflow/xpu/138996 2025-03-14T05:31:38.0668986Z * [new tag] ciflow/xpu/139469 -> ciflow/xpu/139469 2025-03-14T05:31:38.0669938Z * [new tag] ciflow/xpu/139971 -> ciflow/xpu/139971 2025-03-14T05:31:38.0671271Z * [new tag] ciflow/xpu/140365 -> ciflow/xpu/140365 2025-03-14T05:31:38.0672439Z * [new tag] ciflow/xpu/140372 -> ciflow/xpu/140372 2025-03-14T05:31:38.0674112Z * [new tag] ciflow/xpu/142097 -> ciflow/xpu/142097 2025-03-14T05:31:38.0675403Z * [new tag] ciflow/xpu/143597 -> ciflow/xpu/143597 2025-03-14T05:31:38.0676298Z * [new tag] ciflow/xpu/143833 -> ciflow/xpu/143833 2025-03-14T05:31:38.0677648Z * [new tag] ciflow/xpu/144452 -> ciflow/xpu/144452 2025-03-14T05:31:38.0679164Z * [new tag] ciflow/xpu/144664 -> ciflow/xpu/144664 2025-03-14T05:31:38.0680621Z * [new tag] ciflow/xpu/147349 -> ciflow/xpu/147349 2025-03-14T05:31:38.0682190Z * [new tag] ciflow/xpu/147355 -> ciflow/xpu/147355 2025-03-14T05:31:38.0683478Z * [new tag] ciflow/xpu/147498 -> ciflow/xpu/147498 2025-03-14T05:31:38.0684708Z * [new tag] ciflow/xpu/147507 -> ciflow/xpu/147507 2025-03-14T05:31:38.0685609Z * [new tag] ciflow/xpu/147583 -> ciflow/xpu/147583 2025-03-14T05:31:38.0687166Z * [new tag] ciflow/xpu/147664 -> ciflow/xpu/147664 2025-03-14T05:31:38.0688023Z * [new tag] ciflow/xpu/147821 -> ciflow/xpu/147821 2025-03-14T05:31:38.0689425Z * [new tag] ciflow/xpu/147962 -> ciflow/xpu/147962 2025-03-14T05:31:38.0690608Z * [new tag] ciflow/xpu/148646 -> ciflow/xpu/148646 2025-03-14T05:31:38.0691798Z * [new tag] ciflow/xpu/148864 -> ciflow/xpu/148864 2025-03-14T05:31:38.0692993Z * [new tag] ciflow/xpu/148880 -> ciflow/xpu/148880 2025-03-14T05:31:38.0694213Z * [new tag] ciflow/xpu/149053 -> ciflow/xpu/149053 2025-03-14T05:31:38.0695433Z * [new tag] ciflow/xpu/149113 -> ciflow/xpu/149113 2025-03-14T05:31:38.0696627Z * [new tag] ciflow/xpu/149175 -> ciflow/xpu/149175 2025-03-14T05:31:38.0697870Z * [new tag] cslpull75 -> cslpull75 2025-03-14T05:31:38.0699154Z * [new tag] cslpull76 -> cslpull76 2025-03-14T05:31:38.0700043Z * [new tag] cslpull77 -> cslpull77 2025-03-14T05:31:38.0701425Z * [new tag] cslpull78 -> cslpull78 2025-03-14T05:31:38.0703069Z * [new tag] cslpull79 -> cslpull79 2025-03-14T05:31:38.0704774Z * [new tag] 
cslpull80 -> cslpull80 2025-03-14T05:31:38.0706051Z * [new tag] cslpull81 -> cslpull81 2025-03-14T05:31:38.0707283Z * [new tag] cslpull82 -> cslpull82 2025-03-14T05:31:38.0708531Z * [new tag] cslpull83 -> cslpull83 2025-03-14T05:31:38.0709822Z * [new tag] cslpull84 -> cslpull84 2025-03-14T05:31:38.0711013Z * [new tag] cslpull85 -> cslpull85 2025-03-14T05:31:38.0712320Z * [new tag] cslpull86 -> cslpull86 2025-03-14T05:31:38.0713641Z * [new tag] cslpull87 -> cslpull87 2025-03-14T05:31:38.0715153Z * [new tag] cslpull88 -> cslpull88 2025-03-14T05:31:38.0716329Z * [new tag] cslpull89 -> cslpull89 2025-03-14T05:31:38.0717466Z * [new tag] cslpull90 -> cslpull90 2025-03-14T05:31:38.0719963Z * [new tag] cslpull91 -> cslpull91 2025-03-14T05:31:38.0721366Z * [new tag] cslpull92 -> cslpull92 2025-03-14T05:31:38.0722376Z * [new tag] flight_5 -> flight_5 2025-03-14T05:31:38.0724042Z * [new tag] flight_5.1 -> flight_5.1 2025-03-14T05:31:38.0725512Z * [new tag] flight_5.2 -> flight_5.2 2025-03-14T05:31:38.0726817Z * [new tag] flight_5.3 -> flight_5.3 2025-03-14T05:31:38.0727952Z * [new tag] forpull1 -> forpull1 2025-03-14T05:31:38.0729490Z * [new tag] malfet/tag-2ef5611 -> malfet/tag-2ef5611 2025-03-14T05:31:38.0730871Z * [new tag] malfet/tag-317b1a0 -> malfet/tag-317b1a0 2025-03-14T05:31:38.0732147Z * [new tag] malfet/tag-ec6f767 -> malfet/tag-ec6f767 2025-03-14T05:31:38.0733479Z * [new tag] nightly-binary -> nightly-binary 2025-03-14T05:31:38.0734387Z * [new tag] sqzhang_flight4_plus -> sqzhang_flight4_plus 2025-03-14T05:31:38.0736220Z * [new tag] sqzhang_flight_3 -> sqzhang_flight_3 2025-03-14T05:31:38.0737354Z * [new tag] v0.1.1 -> v0.1.1 2025-03-14T05:31:38.0738662Z * [new tag] v0.1.10 -> v0.1.10 2025-03-14T05:31:38.0740060Z * [new tag] v0.1.11 -> v0.1.11 2025-03-14T05:31:38.0741244Z * [new tag] v0.1.12 -> v0.1.12 2025-03-14T05:31:38.0742425Z * [new tag] v0.1.2 -> v0.1.2 2025-03-14T05:31:38.0743707Z * [new tag] v0.1.3 -> v0.1.3 2025-03-14T05:31:38.0745339Z * [new tag] v0.1.4 -> v0.1.4 2025-03-14T05:31:38.0745961Z * [new tag] v0.1.5 -> v0.1.5 2025-03-14T05:31:38.0747425Z * [new tag] v0.1.6 -> v0.1.6 2025-03-14T05:31:38.0748658Z * [new tag] v0.1.7 -> v0.1.7 2025-03-14T05:31:38.0749854Z * [new tag] v0.1.8 -> v0.1.8 2025-03-14T05:31:38.0751119Z * [new tag] v0.1.9 -> v0.1.9 2025-03-14T05:31:38.0752413Z * [new tag] v0.2.0 -> v0.2.0 2025-03-14T05:31:38.0753708Z * [new tag] v0.3.0 -> v0.3.0 2025-03-14T05:31:38.0755189Z * [new tag] v0.3.1 -> v0.3.1 2025-03-14T05:31:38.0756497Z * [new tag] v0.4.0 -> v0.4.0 2025-03-14T05:31:38.0757742Z * [new tag] v0.4.1 -> v0.4.1 2025-03-14T05:31:38.0759023Z * [new tag] v1.0.0 -> v1.0.0 2025-03-14T05:31:38.0760326Z * [new tag] v1.0.0a0 -> v1.0.0a0 2025-03-14T05:31:38.0761629Z * [new tag] v1.0.1 -> v1.0.1 2025-03-14T05:31:38.0762943Z * [new tag] v1.0rc0 -> v1.0rc0 2025-03-14T05:31:38.0763822Z * [new tag] v1.0rc1 -> v1.0rc1 2025-03-14T05:31:38.0765448Z * [new tag] v1.1.0 -> v1.1.0 2025-03-14T05:31:38.0766842Z * [new tag] v1.1.0a0 -> v1.1.0a0 2025-03-14T05:31:38.0768581Z * [new tag] v1.10.0 -> v1.10.0 2025-03-14T05:31:38.0770044Z * [new tag] v1.10.0-rc1 -> v1.10.0-rc1 2025-03-14T05:31:38.0771404Z * [new tag] v1.10.0-rc2 -> v1.10.0-rc2 2025-03-14T05:31:38.0772283Z * [new tag] v1.10.0-rc3 -> v1.10.0-rc3 2025-03-14T05:31:38.0773856Z * [new tag] v1.10.1 -> v1.10.1 2025-03-14T05:31:38.0775017Z * [new tag] v1.10.1-rc1 -> v1.10.1-rc1 2025-03-14T05:31:38.0775898Z * [new tag] v1.10.2 -> v1.10.2 2025-03-14T05:31:38.0777199Z * [new tag] v1.10.2-rc1 -> v1.10.2-rc1 
2025-03-14T05:31:38.0778546Z * [new tag] v1.11.0 -> v1.11.0 2025-03-14T05:31:38.0779955Z * [new tag] v1.11.0-rc1 -> v1.11.0-rc1 2025-03-14T05:31:38.0781343Z * [new tag] v1.11.0-rc2 -> v1.11.0-rc2 2025-03-14T05:31:38.0782707Z * [new tag] v1.11.0-rc3 -> v1.11.0-rc3 2025-03-14T05:31:38.0784116Z * [new tag] v1.11.0-rc4 -> v1.11.0-rc4 2025-03-14T05:31:38.0785421Z * [new tag] v1.11.0-rc5 -> v1.11.0-rc5 2025-03-14T05:31:38.0786594Z * [new tag] v1.11.0-rc6 -> v1.11.0-rc6 2025-03-14T05:31:38.0787573Z * [new tag] v1.11.0-rc7 -> v1.11.0-rc7 2025-03-14T05:31:38.0789011Z * [new tag] v1.12.0 -> v1.12.0 2025-03-14T05:31:38.0790318Z * [new tag] v1.12.0-rc1 -> v1.12.0-rc1 2025-03-14T05:31:38.0791634Z * [new tag] v1.12.0-rc2 -> v1.12.0-rc2 2025-03-14T05:31:38.0793207Z * [new tag] v1.12.0-rc3 -> v1.12.0-rc3 2025-03-14T05:31:38.0794465Z * [new tag] v1.12.0-rc4 -> v1.12.0-rc4 2025-03-14T05:31:38.0795836Z * [new tag] v1.12.0-rc5 -> v1.12.0-rc5 2025-03-14T05:31:38.0797213Z * [new tag] v1.12.0-rc6 -> v1.12.0-rc6 2025-03-14T05:31:38.0798092Z * [new tag] v1.12.0-rc7 -> v1.12.0-rc7 2025-03-14T05:31:38.0799417Z * [new tag] v1.12.0-rc8 -> v1.12.0-rc8 2025-03-14T05:31:38.0800296Z * [new tag] v1.12.1 -> v1.12.1 2025-03-14T05:31:38.0801923Z * [new tag] v1.12.1-rc1 -> v1.12.1-rc1 2025-03-14T05:31:38.0803232Z * [new tag] v1.12.1-rc2 -> v1.12.1-rc2 2025-03-14T05:31:38.0804750Z * [new tag] v1.12.1-rc3 -> v1.12.1-rc3 2025-03-14T05:31:38.0805990Z * [new tag] v1.12.1-rc4 -> v1.12.1-rc4 2025-03-14T05:31:38.0807750Z * [new tag] v1.12.1-rc5 -> v1.12.1-rc5 2025-03-14T05:31:38.0809129Z * [new tag] v1.13.0 -> v1.13.0 2025-03-14T05:31:38.0810445Z * [new tag] v1.13.0-rc1 -> v1.13.0-rc1 2025-03-14T05:31:38.0811770Z * [new tag] v1.13.0-rc2 -> v1.13.0-rc2 2025-03-14T05:31:38.0813033Z * [new tag] v1.13.0-rc3 -> v1.13.0-rc3 2025-03-14T05:31:38.0814427Z * [new tag] v1.13.0-rc4 -> v1.13.0-rc4 2025-03-14T05:31:38.0815410Z * [new tag] v1.13.0-rc5 -> v1.13.0-rc5 2025-03-14T05:31:38.0816645Z * [new tag] v1.13.0-rc6 -> v1.13.0-rc6 2025-03-14T05:31:38.0818062Z * [new tag] v1.13.1 -> v1.13.1 2025-03-14T05:31:38.0819201Z * [new tag] v1.13.1-rc1 -> v1.13.1-rc1 2025-03-14T05:31:38.0820502Z * [new tag] v1.2.0 -> v1.2.0 2025-03-14T05:31:38.0821822Z * [new tag] v1.2.0a0 -> v1.2.0a0 2025-03-14T05:31:38.0823139Z * [new tag] v1.3.0 -> v1.3.0 2025-03-14T05:31:38.0824431Z * [new tag] v1.3.0a0 -> v1.3.0a0 2025-03-14T05:31:38.0825329Z * [new tag] v1.3.1 -> v1.3.1 2025-03-14T05:31:38.0826829Z * [new tag] v1.4.0 -> v1.4.0 2025-03-14T05:31:38.0828204Z * [new tag] v1.4.0a0 -> v1.4.0a0 2025-03-14T05:31:38.0829087Z * [new tag] v1.4.1 -> v1.4.1 2025-03-14T05:31:38.0830690Z * [new tag] v1.5.0 -> v1.5.0 2025-03-14T05:31:38.0832116Z * [new tag] v1.5.0-rc1 -> v1.5.0-rc1 2025-03-14T05:31:38.0833461Z * [new tag] v1.5.0-rc2 -> v1.5.0-rc2 2025-03-14T05:31:38.0834940Z * [new tag] v1.5.0-rc3 -> v1.5.0-rc3 2025-03-14T05:31:38.0836212Z * [new tag] v1.5.0-rc4 -> v1.5.0-rc4 2025-03-14T05:31:38.0837104Z * [new tag] v1.5.0-rc5 -> v1.5.0-rc5 2025-03-14T05:31:38.0838695Z * [new tag] v1.5.1 -> v1.5.1 2025-03-14T05:31:38.0839676Z * [new tag] v1.5.1-rc1 -> v1.5.1-rc1 2025-03-14T05:31:38.0840905Z * [new tag] v1.6.0 -> v1.6.0 2025-03-14T05:31:38.0842298Z * [new tag] v1.6.0-rc1 -> v1.6.0-rc1 2025-03-14T05:31:38.0843691Z * [new tag] v1.6.0-rc2 -> v1.6.0-rc2 2025-03-14T05:31:38.0845162Z * [new tag] v1.6.0-rc3 -> v1.6.0-rc3 2025-03-14T05:31:38.0846545Z * [new tag] v1.6.0-rc4 -> v1.6.0-rc4 2025-03-14T05:31:38.0848506Z * [new tag] v1.6.0-rc5 -> v1.6.0-rc5 2025-03-14T05:31:38.0849146Z * [new tag] 
v1.6.0-rc6 -> v1.6.0-rc6 2025-03-14T05:31:38.0850481Z * [new tag] v1.6.0-rc7 -> v1.6.0-rc7 2025-03-14T05:31:38.0851811Z * [new tag] v1.7.0 -> v1.7.0 2025-03-14T05:31:38.0853234Z * [new tag] v1.7.0-rc1 -> v1.7.0-rc1 2025-03-14T05:31:38.0854678Z * [new tag] v1.7.0-rc2 -> v1.7.0-rc2 2025-03-14T05:31:38.0856016Z * [new tag] v1.7.0-rc3 -> v1.7.0-rc3 2025-03-14T05:31:38.0857007Z * [new tag] v1.7.0-rc4 -> v1.7.0-rc4 2025-03-14T05:31:38.0858447Z * [new tag] v1.7.1 -> v1.7.1 2025-03-14T05:31:38.0859931Z * [new tag] v1.7.1-rc1 -> v1.7.1-rc1 2025-03-14T05:31:38.0861275Z * [new tag] v1.7.1-rc2 -> v1.7.1-rc2 2025-03-14T05:31:38.0862462Z * [new tag] v1.7.1-rc3 -> v1.7.1-rc3 2025-03-14T05:31:38.0863813Z * [new tag] v1.8.0 -> v1.8.0 2025-03-14T05:31:38.0865005Z * [new tag] v1.8.0-rc1 -> v1.8.0-rc1 2025-03-14T05:31:38.0866326Z * [new tag] v1.8.0-rc2 -> v1.8.0-rc2 2025-03-14T05:31:38.0867668Z * [new tag] v1.8.0-rc3 -> v1.8.0-rc3 2025-03-14T05:31:38.0872274Z * [new tag] v1.8.0-rc4 -> v1.8.0-rc4 2025-03-14T05:31:38.0873427Z * [new tag] v1.8.0-rc5 -> v1.8.0-rc5 2025-03-14T05:31:38.0874350Z * [new tag] v1.8.1 -> v1.8.1 2025-03-14T05:31:38.0876031Z * [new tag] v1.8.1-rc1 -> v1.8.1-rc1 2025-03-14T05:31:38.0876926Z * [new tag] v1.8.1-rc2 -> v1.8.1-rc2 2025-03-14T05:31:38.0878217Z * [new tag] v1.8.1-rc3 -> v1.8.1-rc3 2025-03-14T05:31:38.0880123Z * [new tag] v1.8.2 -> v1.8.2 2025-03-14T05:31:38.0880968Z * [new tag] v1.8.2-rc1 -> v1.8.2-rc1 2025-03-14T05:31:38.0882521Z * [new tag] v1.9.0 -> v1.9.0 2025-03-14T05:31:38.0883842Z * [new tag] v1.9.0-rc1 -> v1.9.0-rc1 2025-03-14T05:31:38.0885223Z * [new tag] v1.9.0-rc2 -> v1.9.0-rc2 2025-03-14T05:31:38.0886630Z * [new tag] v1.9.0-rc3 -> v1.9.0-rc3 2025-03-14T05:31:38.0887545Z * [new tag] v1.9.0-rc4 -> v1.9.0-rc4 2025-03-14T05:31:38.0889094Z * [new tag] v1.9.1 -> v1.9.1 2025-03-14T05:31:38.0890628Z * [new tag] v1.9.1-rc1 -> v1.9.1-rc1 2025-03-14T05:31:38.0891766Z * [new tag] v1.9.1-rc2 -> v1.9.1-rc2 2025-03-14T05:31:38.0893105Z * [new tag] v2.0.0 -> v2.0.0 2025-03-14T05:31:38.0894995Z * [new tag] v2.0.0-rc1 -> v2.0.0-rc1 2025-03-14T05:31:38.0896340Z * [new tag] v2.0.0-rc2 -> v2.0.0-rc2 2025-03-14T05:31:38.0897769Z * [new tag] v2.0.0-rc3 -> v2.0.0-rc3 2025-03-14T05:31:38.0899066Z * [new tag] v2.0.0-rc4 -> v2.0.0-rc4 2025-03-14T05:31:38.0900347Z * [new tag] v2.0.0-rc5 -> v2.0.0-rc5 2025-03-14T05:31:38.0901225Z * [new tag] v2.0.0-rc6 -> v2.0.0-rc6 2025-03-14T05:31:38.0902825Z * [new tag] v2.0.1 -> v2.0.1 2025-03-14T05:31:38.0904288Z * [new tag] v2.0.1-rc1 -> v2.0.1-rc1 2025-03-14T05:31:38.0905255Z * [new tag] v2.0.1-rc2 -> v2.0.1-rc2 2025-03-14T05:31:38.0906627Z * [new tag] v2.0.1-rc3 -> v2.0.1-rc3 2025-03-14T05:31:38.0907505Z * [new tag] v2.0.1-rc4 -> v2.0.1-rc4 2025-03-14T05:31:38.0909504Z * [new tag] v2.1.0 -> v2.1.0 2025-03-14T05:31:38.0910812Z * [new tag] v2.1.0-rc1 -> v2.1.0-rc1 2025-03-14T05:31:38.0912127Z * [new tag] v2.1.0-rc2 -> v2.1.0-rc2 2025-03-14T05:31:38.0913518Z * [new tag] v2.1.0-rc3 -> v2.1.0-rc3 2025-03-14T05:31:38.0914975Z * [new tag] v2.1.0-rc4 -> v2.1.0-rc4 2025-03-14T05:31:38.0916290Z * [new tag] v2.1.0-rc5 -> v2.1.0-rc5 2025-03-14T05:31:38.0917140Z * [new tag] v2.1.0-rc6 -> v2.1.0-rc6 2025-03-14T05:31:38.0918680Z * [new tag] v2.1.1 -> v2.1.1 2025-03-14T05:31:38.0921497Z * [new tag] v2.1.1-rc1 -> v2.1.1-rc1 2025-03-14T05:31:38.0921668Z * [new tag] v2.1.1-rc2 -> v2.1.1-rc2 2025-03-14T05:31:38.0922939Z * [new tag] v2.1.1-rc3 -> v2.1.1-rc3 2025-03-14T05:31:38.0923563Z * [new tag] v2.1.1-rc4 -> v2.1.1-rc4 2025-03-14T05:31:38.0925140Z * [new tag] v2.1.1-rc5 
-> v2.1.1-rc5 2025-03-14T05:31:38.0925951Z * [new tag] v2.1.1-rc6 -> v2.1.1-rc6 2025-03-14T05:31:38.0927375Z * [new tag] v2.1.2 -> v2.1.2 2025-03-14T05:31:38.0928722Z * [new tag] v2.1.2-rc1 -> v2.1.2-rc1 2025-03-14T05:31:38.0930414Z * [new tag] v2.1.2-rc2 -> v2.1.2-rc2 2025-03-14T05:31:38.0930790Z * [new tag] v2.1.2-rc3 -> v2.1.2-rc3 2025-03-14T05:31:38.0932454Z * [new tag] v2.2.0 -> v2.2.0 2025-03-14T05:31:38.0933750Z * [new tag] v2.2.0-rc1 -> v2.2.0-rc1 2025-03-14T05:31:38.0935049Z * [new tag] v2.2.0-rc2 -> v2.2.0-rc2 2025-03-14T05:31:38.0936334Z * [new tag] v2.2.0-rc3 -> v2.2.0-rc3 2025-03-14T05:31:38.0937593Z * [new tag] v2.2.0-rc4 -> v2.2.0-rc4 2025-03-14T05:31:38.0939131Z * [new tag] v2.2.0-rc5 -> v2.2.0-rc5 2025-03-14T05:31:38.0939970Z * [new tag] v2.2.0-rc6 -> v2.2.0-rc6 2025-03-14T05:31:38.0940958Z * [new tag] v2.2.0-rc7 -> v2.2.0-rc7 2025-03-14T05:31:38.0942300Z * [new tag] v2.2.0-rc8 -> v2.2.0-rc8 2025-03-14T05:31:38.0943610Z * [new tag] v2.2.1 -> v2.2.1 2025-03-14T05:31:38.0945049Z * [new tag] v2.2.1-rc1 -> v2.2.1-rc1 2025-03-14T05:31:38.0945863Z * [new tag] v2.2.1-rc2 -> v2.2.1-rc2 2025-03-14T05:31:38.0947108Z * [new tag] v2.2.1-rc3 -> v2.2.1-rc3 2025-03-14T05:31:38.0947945Z * [new tag] v2.2.2 -> v2.2.2 2025-03-14T05:31:38.0949581Z * [new tag] v2.2.2-rc1 -> v2.2.2-rc1 2025-03-14T05:31:38.0950418Z * [new tag] v2.2.2-rc2 -> v2.2.2-rc2 2025-03-14T05:31:38.0951710Z * [new tag] v2.2.2-rc3 -> v2.2.2-rc3 2025-03-14T05:31:38.0953075Z * [new tag] v2.3.0 -> v2.3.0 2025-03-14T05:31:38.0954460Z * [new tag] v2.3.0-rc1 -> v2.3.0-rc1 2025-03-14T05:31:38.0955760Z * [new tag] v2.3.0-rc10 -> v2.3.0-rc10 2025-03-14T05:31:38.0957242Z * [new tag] v2.3.0-rc11 -> v2.3.0-rc11 2025-03-14T05:31:38.0957982Z * [new tag] v2.3.0-rc12 -> v2.3.0-rc12 2025-03-14T05:31:38.0959529Z * [new tag] v2.3.0-rc2 -> v2.3.0-rc2 2025-03-14T05:31:38.0960779Z * [new tag] v2.3.0-rc3 -> v2.3.0-rc3 2025-03-14T05:31:38.0962111Z * [new tag] v2.3.0-rc4 -> v2.3.0-rc4 2025-03-14T05:31:38.0963415Z * [new tag] v2.3.0-rc5 -> v2.3.0-rc5 2025-03-14T05:31:38.0964590Z * [new tag] v2.3.0-rc6 -> v2.3.0-rc6 2025-03-14T05:31:38.0965891Z * [new tag] v2.3.0-rc7 -> v2.3.0-rc7 2025-03-14T05:31:38.0967200Z * [new tag] v2.3.0-rc8 -> v2.3.0-rc8 2025-03-14T05:31:38.0968301Z * [new tag] v2.3.0-rc9 -> v2.3.0-rc9 2025-03-14T05:31:38.0969797Z * [new tag] v2.3.1 -> v2.3.1 2025-03-14T05:31:38.0971249Z * [new tag] v2.3.1-rc1 -> v2.3.1-rc1 2025-03-14T05:31:38.0972393Z * [new tag] v2.3.1-rc2 -> v2.3.1-rc2 2025-03-14T05:31:38.0973711Z * [new tag] v2.3.1-rc3 -> v2.3.1-rc3 2025-03-14T05:31:38.0975121Z * [new tag] v2.4.0 -> v2.4.0 2025-03-14T05:31:38.0976968Z * [new tag] v2.4.0-rc1 -> v2.4.0-rc1 2025-03-14T05:31:38.0978281Z * [new tag] v2.4.0-rc2 -> v2.4.0-rc2 2025-03-14T05:31:38.0979563Z * [new tag] v2.4.0-rc3 -> v2.4.0-rc3 2025-03-14T05:31:38.0980908Z * [new tag] v2.4.0-rc4 -> v2.4.0-rc4 2025-03-14T05:31:38.0982325Z * [new tag] v2.4.0-rc5 -> v2.4.0-rc5 2025-03-14T05:31:38.0983623Z * [new tag] v2.4.0-rc6 -> v2.4.0-rc6 2025-03-14T05:31:38.0985037Z * [new tag] v2.4.0-rc7 -> v2.4.0-rc7 2025-03-14T05:31:38.0986394Z * [new tag] v2.4.0-rc8 -> v2.4.0-rc8 2025-03-14T05:31:38.0988116Z * [new tag] v2.4.0-rc9 -> v2.4.0-rc9 2025-03-14T05:31:38.0988487Z * [new tag] v2.4.1 -> v2.4.1 2025-03-14T05:31:38.0990164Z * [new tag] v2.4.1-rc1 -> v2.4.1-rc1 2025-03-14T05:31:38.0991510Z * [new tag] v2.4.1-rc2 -> v2.4.1-rc2 2025-03-14T05:31:38.0992828Z * [new tag] v2.4.1-rc3 -> v2.4.1-rc3 2025-03-14T05:31:38.0994139Z * [new tag] v2.5.0 -> v2.5.0 2025-03-14T05:31:38.0995577Z * [new tag] 
v2.5.0-rc1 -> v2.5.0-rc1 2025-03-14T05:31:38.0996471Z * [new tag] v2.5.0-rc10 -> v2.5.0-rc10 2025-03-14T05:31:38.0997937Z * [new tag] v2.5.0-rc2 -> v2.5.0-rc2 2025-03-14T05:31:38.0999250Z * [new tag] v2.5.0-rc3 -> v2.5.0-rc3 2025-03-14T05:31:38.1000518Z * [new tag] v2.5.0-rc4 -> v2.5.0-rc4 2025-03-14T05:31:38.1001878Z * [new tag] v2.5.0-rc5 -> v2.5.0-rc5 2025-03-14T05:31:38.1003246Z * [new tag] v2.5.0-rc6 -> v2.5.0-rc6 2025-03-14T05:31:38.1004609Z * [new tag] v2.5.0-rc7 -> v2.5.0-rc7 2025-03-14T05:31:38.1005959Z * [new tag] v2.5.0-rc8 -> v2.5.0-rc8 2025-03-14T05:31:38.1007318Z * [new tag] v2.5.0-rc9 -> v2.5.0-rc9 2025-03-14T05:31:38.1008149Z * [new tag] v2.5.1 -> v2.5.1 2025-03-14T05:31:38.1009457Z * [new tag] v2.5.1-rc1 -> v2.5.1-rc1 2025-03-14T05:31:38.1010499Z * [new tag] v2.6.0 -> v2.6.0 2025-03-14T05:31:38.1012001Z * [new tag] v2.6.0-rc1 -> v2.6.0-rc1 2025-03-14T05:31:38.1013336Z * [new tag] v2.6.0-rc2 -> v2.6.0-rc2 2025-03-14T05:31:38.1014769Z * [new tag] v2.6.0-rc3 -> v2.6.0-rc3 2025-03-14T05:31:38.1016054Z * [new tag] v2.6.0-rc4 -> v2.6.0-rc4 2025-03-14T05:31:38.1017632Z * [new tag] v2.6.0-rc5 -> v2.6.0-rc5 2025-03-14T05:31:38.1019164Z * [new tag] v2.6.0-rc6 -> v2.6.0-rc6 2025-03-14T05:31:38.1020526Z * [new tag] v2.6.0-rc7 -> v2.6.0-rc7 2025-03-14T05:31:38.1021913Z * [new tag] v2.6.0-rc8 -> v2.6.0-rc8 2025-03-14T05:31:38.1023367Z * [new tag] v2.6.0-rc9 -> v2.6.0-rc9 2025-03-14T05:31:38.1024667Z * [new tag] v2.7.0-rc1 -> v2.7.0-rc1 2025-03-14T05:31:38.1025521Z * [new tag] v2.7.0-rc2 -> v2.7.0-rc2 2025-03-14T05:31:38.1027061Z * [new tag] whc_flight_1 -> whc_flight_1 2025-03-14T05:31:38.1028212Z * [new tag] whc_flight_2 -> whc_flight_2 2025-03-14T05:31:38.1029372Z * [new tag] whc_flight_4 -> whc_flight_4 2025-03-14T05:31:38.1835798Z [command]/usr/bin/git rev-parse --verify --quiet aed0b7a742a2d7b7901790622829cbd2135049a4^{object} 2025-03-14T05:31:38.1864777Z aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:31:38.1871179Z ##[endgroup] 2025-03-14T05:31:38.1871792Z ##[group]Determining the checkout info 2025-03-14T05:31:38.1872388Z ##[endgroup] 2025-03-14T05:31:38.1876872Z [command]/usr/bin/git sparse-checkout disable 2025-03-14T05:31:38.1917948Z [command]/usr/bin/git config --local --unset-all extensions.worktreeConfig 2025-03-14T05:31:38.1948730Z ##[group]Checking out the ref 2025-03-14T05:31:38.1951894Z [command]/usr/bin/git checkout --progress --force aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:31:39.2304055Z Updating files: 80% (13299/16577) 2025-03-14T05:31:39.2429851Z Updating files: 81% (13428/16577) 2025-03-14T05:31:39.2557679Z Updating files: 82% (13594/16577) 2025-03-14T05:31:39.2688304Z Updating files: 83% (13759/16577) 2025-03-14T05:31:39.2848283Z Updating files: 84% (13925/16577) 2025-03-14T05:31:39.2970962Z Updating files: 85% (14091/16577) 2025-03-14T05:31:39.3103021Z Updating files: 86% (14257/16577) 2025-03-14T05:31:39.3213548Z Updating files: 87% (14422/16577) 2025-03-14T05:31:39.3353547Z Updating files: 88% (14588/16577) 2025-03-14T05:31:39.3528695Z Updating files: 89% (14754/16577) 2025-03-14T05:31:39.3645531Z Updating files: 90% (14920/16577) 2025-03-14T05:31:39.3772476Z Updating files: 91% (15086/16577) 2025-03-14T05:31:39.3951432Z Updating files: 92% (15251/16577) 2025-03-14T05:31:39.4137049Z Updating files: 93% (15417/16577) 2025-03-14T05:31:39.4326852Z Updating files: 94% (15583/16577) 2025-03-14T05:31:39.4490604Z Updating files: 95% (15749/16577) 2025-03-14T05:31:39.4628885Z Updating files: 96% (15914/16577) 2025-03-14T05:31:39.4911519Z 
Updating files: 97% (16080/16577) 2025-03-14T05:31:39.5064966Z Updating files: 98% (16246/16577) 2025-03-14T05:31:39.5222412Z Updating files: 99% (16412/16577) 2025-03-14T05:31:39.5222737Z Updating files: 100% (16577/16577) 2025-03-14T05:31:39.5223055Z Updating files: 100% (16577/16577), done. 2025-03-14T05:31:39.5467672Z Note: switching to 'aed0b7a742a2d7b7901790622829cbd2135049a4'. 2025-03-14T05:31:39.5468227Z 2025-03-14T05:31:39.5468460Z You are in 'detached HEAD' state. You can look around, make experimental 2025-03-14T05:31:39.5469016Z changes and commit them, and you can discard any commits you make in this 2025-03-14T05:31:39.5469558Z state without impacting any branches by switching back to a branch. 2025-03-14T05:31:39.5469869Z 2025-03-14T05:31:39.5470094Z If you want to create a new branch to retain commits you create, you may 2025-03-14T05:31:39.5470789Z do so (now or later) by using -c with the switch command. Example: 2025-03-14T05:31:39.5471077Z 2025-03-14T05:31:39.5471211Z git switch -c <new-branch-name> 2025-03-14T05:31:39.5471418Z 2025-03-14T05:31:39.5471547Z Or undo this operation with: 2025-03-14T05:31:39.5471739Z 2025-03-14T05:31:39.5471852Z git switch - 2025-03-14T05:31:39.5471999Z 2025-03-14T05:31:39.5472254Z Turn off this advice by setting config variable advice.detachedHead to false 2025-03-14T05:31:39.5472590Z 2025-03-14T05:31:39.5472917Z HEAD is now at aed0b7a742a [c10d] Add param recording for uniqueID broadcasting and allgather (#149166) 2025-03-14T05:31:39.5595849Z ##[endgroup] 2025-03-14T05:31:39.5596323Z ##[group]Setting up auth for fetching submodules 2025-03-14T05:31:39.5601362Z [command]/usr/bin/git config --global http.https://github.com/.extraheader AUTHORIZATION: basic *** 2025-03-14T05:31:39.5650770Z [command]/usr/bin/git config --global --unset-all url.https://github.com/.insteadOf 2025-03-14T05:31:39.5684253Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf git@github.com: 2025-03-14T05:31:39.5722848Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf org-21003710@github.com: 2025-03-14T05:31:39.5752260Z ##[endgroup] 2025-03-14T05:31:39.5752695Z ##[group]Fetching submodules 2025-03-14T05:31:39.5755918Z [command]/usr/bin/git submodule sync --recursive 2025-03-14T05:31:39.6136635Z [command]/usr/bin/git -c protocol.version=2 submodule update --init --force --recursive 2025-03-14T05:31:39.6500370Z Submodule 'android/libs/fbjni' (https://github.com/facebookincubator/fbjni.git) registered for path 'android/libs/fbjni' 2025-03-14T05:31:39.6503694Z Submodule 'third_party/NNPACK_deps/FP16' (https://github.com/Maratyszcza/FP16.git) registered for path 'third_party/FP16' 2025-03-14T05:31:39.6508352Z Submodule 'third_party/NNPACK_deps/FXdiv' (https://github.com/Maratyszcza/FXdiv.git) registered for path 'third_party/FXdiv' 2025-03-14T05:31:39.6512627Z Submodule 'third_party/NNPACK' (https://github.com/Maratyszcza/NNPACK.git) registered for path 'third_party/NNPACK' 2025-03-14T05:31:39.6517442Z Submodule 'third_party/NVTX' (https://github.com/NVIDIA/NVTX.git) registered for path 'third_party/NVTX' 2025-03-14T05:31:39.6522520Z Submodule 'third_party/VulkanMemoryAllocator' (https://github.com/GPUOpen-LibrariesAndSDKs/VulkanMemoryAllocator.git) registered for path 'third_party/VulkanMemoryAllocator' 2025-03-14T05:31:39.6526969Z Submodule 'third_party/XNNPACK' (https://github.com/google/XNNPACK.git) registered for path 'third_party/XNNPACK' 2025-03-14T05:31:39.6531859Z Submodule 'third_party/benchmark' 
(https://github.com/google/benchmark.git) registered for path 'third_party/benchmark' 2025-03-14T05:31:39.6537300Z Submodule 'third_party/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/composable_kernel' 2025-03-14T05:31:39.6542315Z Submodule 'third_party/cpp-httplib' (https://github.com/yhirose/cpp-httplib.git) registered for path 'third_party/cpp-httplib' 2025-03-14T05:31:39.6547359Z Submodule 'third_party/cpuinfo' (https://github.com/pytorch/cpuinfo.git) registered for path 'third_party/cpuinfo' 2025-03-14T05:31:39.6552533Z Submodule 'third_party/cudnn_frontend' (https://github.com/NVIDIA/cudnn-frontend.git) registered for path 'third_party/cudnn_frontend' 2025-03-14T05:31:39.6557893Z Submodule 'third_party/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/cutlass' 2025-03-14T05:31:39.6563089Z Submodule 'third_party/eigen' (https://gitlab.com/libeigen/eigen.git) registered for path 'third_party/eigen' 2025-03-14T05:31:39.6568892Z Submodule 'third_party/fbgemm' (https://github.com/pytorch/fbgemm) registered for path 'third_party/fbgemm' 2025-03-14T05:31:39.6574796Z Submodule 'third_party/flash-attention' (https://github.com/Dao-AILab/flash-attention.git) registered for path 'third_party/flash-attention' 2025-03-14T05:31:39.6583871Z Submodule 'third_party/flatbuffers' (https://github.com/google/flatbuffers.git) registered for path 'third_party/flatbuffers' 2025-03-14T05:31:39.6589212Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/fmt' 2025-03-14T05:31:39.6594652Z Submodule 'third_party/gemmlowp/gemmlowp' (https://github.com/google/gemmlowp.git) registered for path 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:31:39.6600346Z Submodule 'third_party/gloo' (https://github.com/facebookincubator/gloo) registered for path 'third_party/gloo' 2025-03-14T05:31:39.6606164Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/googletest' 2025-03-14T05:31:39.6611851Z Submodule 'third_party/ideep' (https://github.com/intel/ideep) registered for path 'third_party/ideep' 2025-03-14T05:31:39.6617799Z Submodule 'third_party/ittapi' (https://github.com/intel/ittapi.git) registered for path 'third_party/ittapi' 2025-03-14T05:31:39.6623774Z Submodule 'third_party/kineto' (https://github.com/pytorch/kineto) registered for path 'third_party/kineto' 2025-03-14T05:31:39.6629941Z Submodule 'third_party/kleidiai' (https://github.com/ARM-software/kleidiai.git) registered for path 'third_party/kleidiai' 2025-03-14T05:31:39.6636202Z Submodule 'third_party/mimalloc' (https://github.com/microsoft/mimalloc.git) registered for path 'third_party/mimalloc' 2025-03-14T05:31:39.6642326Z Submodule 'third_party/nlohmann' (https://github.com/nlohmann/json.git) registered for path 'third_party/nlohmann' 2025-03-14T05:31:39.6648663Z Submodule 'third_party/onnx' (https://github.com/onnx/onnx.git) registered for path 'third_party/onnx' 2025-03-14T05:31:39.6655382Z Submodule 'third_party/opentelemetry-cpp' (https://github.com/open-telemetry/opentelemetry-cpp.git) registered for path 'third_party/opentelemetry-cpp' 2025-03-14T05:31:39.6661599Z Submodule 'third_party/pocketfft' (https://github.com/mreineck/pocketfft) registered for path 'third_party/pocketfft' 2025-03-14T05:31:39.6668306Z Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/protobuf' 2025-03-14T05:31:39.6675653Z 
Submodule 'third_party/NNPACK_deps/psimd' (https://github.com/Maratyszcza/psimd.git) registered for path 'third_party/psimd' 2025-03-14T05:31:39.6682309Z Submodule 'third_party/NNPACK_deps/pthreadpool' (https://github.com/Maratyszcza/pthreadpool.git) registered for path 'third_party/pthreadpool' 2025-03-14T05:31:39.6692103Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/pybind11' 2025-03-14T05:31:39.6698909Z Submodule 'third_party/python-peachpy' (https://github.com/malfet/PeachPy.git) registered for path 'third_party/python-peachpy' 2025-03-14T05:31:39.6705963Z Submodule 'third_party/sleef' (https://github.com/shibatch/sleef) registered for path 'third_party/sleef' 2025-03-14T05:31:39.6712722Z Submodule 'third_party/tensorpipe' (https://github.com/pytorch/tensorpipe.git) registered for path 'third_party/tensorpipe' 2025-03-14T05:31:39.6748258Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/android/libs/fbjni'... 2025-03-14T05:31:39.9545383Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FXdiv'... 2025-03-14T05:31:39.9546504Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FP16'... 2025-03-14T05:31:39.9547599Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NNPACK'... 2025-03-14T05:31:39.9575906Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fmt'... 2025-03-14T05:31:42.7561889Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NVTX'... 2025-03-14T05:31:42.7563218Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/benchmark'... 2025-03-14T05:31:42.7564462Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpp-httplib'... 2025-03-14T05:31:42.7565999Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gloo'... 2025-03-14T05:31:42.7567293Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention'... 2025-03-14T05:31:42.7568903Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gemmlowp/gemmlowp'... 2025-03-14T05:31:42.7570164Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpuinfo'... 2025-03-14T05:31:42.7571378Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ittapi'... 2025-03-14T05:31:42.7572536Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep'... 2025-03-14T05:31:42.7573750Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kleidiai'... 2025-03-14T05:31:42.7575000Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pocketfft'... 2025-03-14T05:31:42.7576235Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cudnn_frontend'... 2025-03-14T05:31:42.7577540Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/mimalloc'... 2025-03-14T05:31:42.7578720Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/psimd'... 2025-03-14T05:31:42.7579958Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/googletest'... 2025-03-14T05:31:42.7581226Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pthreadpool'... 2025-03-14T05:31:42.7832119Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flatbuffers'... 
2025-03-14T05:31:43.3082268Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/python-peachpy'... 2025-03-14T05:31:43.3670129Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/sleef'... 2025-03-14T05:31:43.8155375Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto'... 2025-03-14T05:31:43.8156761Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe'... 2025-03-14T05:31:43.8158098Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/VulkanMemoryAllocator'... 2025-03-14T05:31:43.8817302Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/XNNPACK'... 2025-03-14T05:31:56.7977230Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm'... 2025-03-14T05:31:56.7977964Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pybind11'... 2025-03-14T05:31:56.7978690Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/composable_kernel'... 2025-03-14T05:31:56.7979666Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cutlass'... 2025-03-14T05:31:56.7980340Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx'... 2025-03-14T05:31:56.7981043Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp'... 2025-03-14T05:31:56.7981761Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/eigen'... 2025-03-14T05:31:56.7982425Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/nlohmann'... 2025-03-14T05:31:56.7983102Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf'... 
2025-03-14T05:31:56.8177089Z Submodule path 'android/libs/fbjni': checked out '7e1e1fe3858c63c251c637ae41a20de425dde96f' 2025-03-14T05:31:56.8327444Z Submodule path 'third_party/FP16': checked out '4dfe081cf6bcd15db339cf2680b9281b8451eeb3' 2025-03-14T05:31:56.8447165Z Submodule path 'third_party/FXdiv': checked out 'b408327ac2a15ec3e43352421954f5b1967701d1' 2025-03-14T05:31:56.8767394Z Submodule path 'third_party/NNPACK': checked out 'c07e3a0400713d546e0dea2d5466dd22ea389c73' 2025-03-14T05:31:56.9206219Z Submodule path 'third_party/NVTX': checked out 'e170594ac7cf1dac584da473d4ca9301087090c1' 2025-03-14T05:31:56.9663950Z Submodule path 'third_party/VulkanMemoryAllocator': checked out 'a6bfc237255a6bac1513f7c1ebde6d8aed6b5191' 2025-03-14T05:31:57.8507359Z Submodule path 'third_party/XNNPACK': checked out '51a0103656eff6fc9bfd39a4597923c4b542c883' 2025-03-14T05:31:57.8800918Z Submodule path 'third_party/benchmark': checked out '0d98dba29d66e93259db7daa53a9327df767a415' 2025-03-14T05:31:58.1973715Z Submodule path 'third_party/composable_kernel': checked out '8086bbe3a78d931eb96fe12fdc014082e18d18d3' 2025-03-14T05:31:58.2542383Z Submodule path 'third_party/cpp-httplib': checked out '3b6597bba913d51161383657829b7e644e59c006' 2025-03-14T05:31:58.3661497Z Submodule path 'third_party/cpuinfo': checked out '1e83a2fdd3102f65c6f1fb602c1b320486218a99' 2025-03-14T05:31:58.4078154Z Submodule path 'third_party/cudnn_frontend': checked out '91b7532f3386768bba4f444ee7672b497f34da8a' 2025-03-14T05:31:59.1235897Z Submodule path 'third_party/cutlass': checked out 'afa1772203677c5118fcd82537a9c8fefbcc7008' 2025-03-14T05:31:59.4137711Z Submodule path 'third_party/eigen': checked out '3147391d946bb4b6c68edd901f2add6ac1f31f8c' 2025-03-14T05:31:59.5263018Z Submodule path 'third_party/fbgemm': checked out 'dbc3157bf256f1339b3fa1fef2be89ac4078be0e' 2025-03-14T05:31:59.5287262Z Submodule 'third_party/asmjit' (https://github.com/asmjit/asmjit.git) registered for path 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:31:59.5289051Z Submodule 'third_party/cpuinfo' (https://github.com/pytorch/cpuinfo) registered for path 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:31:59.5292847Z Submodule 'third_party/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:31:59.5296266Z Submodule 'third_party/googletest' (https://github.com/google/googletest) registered for path 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:31:59.5299777Z Submodule 'third_party/hipify_torch' (https://github.com/ROCmSoftwarePlatform/hipify_torch.git) registered for path 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:31:59.5330888Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/third_party/asmjit'... 2025-03-14T05:32:00.5192556Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/third_party/hipify_torch'... 2025-03-14T05:32:00.5194159Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/third_party/cpuinfo'... 2025-03-14T05:32:00.6194492Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/third_party/cutlass'... 2025-03-14T05:32:01.9141235Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/third_party/googletest'... 
2025-03-14T05:32:01.9658751Z Submodule path 'third_party/fbgemm/third_party/asmjit': checked out 'd3fbf7c9bc7c1d1365a94a45614b91c5a3706b81' 2025-03-14T05:32:02.0780901Z Submodule path 'third_party/fbgemm/third_party/cpuinfo': checked out 'ed8b86a253800bafdb7b25c5c399f91bff9cb1f3' 2025-03-14T05:32:02.5883660Z Submodule path 'third_party/fbgemm/third_party/cutlass': checked out 'fc9ebc645b63f3a6bc80aaefde5c063fb72110d6' 2025-03-14T05:32:02.6607763Z Submodule path 'third_party/fbgemm/third_party/googletest': checked out 'cbf019de22c8dd37b2108da35b2748fd702d1796' 2025-03-14T05:32:02.6763233Z Submodule path 'third_party/fbgemm/third_party/hipify_torch': checked out '23f53b025b466d8ec3c45d52290d3442f7fbe6b1' 2025-03-14T05:32:02.7650182Z Submodule path 'third_party/flash-attention': checked out '979702c87a8713a8e0a5e9fee122b90d2ef13be5' 2025-03-14T05:32:02.7675427Z Submodule 'csrc/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:02.7676758Z Submodule 'csrc/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:02.7709928Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/composable_kernel'... 2025-03-14T05:32:05.1691217Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/cutlass'... 2025-03-14T05:32:05.4821274Z Submodule path 'third_party/flash-attention/csrc/composable_kernel': checked out '888317e698e9803c62bd38568abc9e05d7709f33' 2025-03-14T05:32:06.1757242Z Submodule path 'third_party/flash-attention/csrc/cutlass': checked out 'c506e16788cb08416a4a57e11a9067beeee29420' 2025-03-14T05:32:06.3456552Z Submodule path 'third_party/flatbuffers': checked out '01834de25e4bf3975a9a00e816292b1ad0fe184b' 2025-03-14T05:32:06.3845058Z Submodule path 'third_party/fmt': checked out '123913715afeb8a437e6388b4473fcc4753e1c9a' 2025-03-14T05:32:06.4307997Z Submodule path 'third_party/gemmlowp/gemmlowp': checked out '3fb5c176c17c765a3492cd2f0321b0dab712f350' 2025-03-14T05:32:06.4648329Z Submodule path 'third_party/gloo': checked out '5354032ea08eadd7fc4456477f7f7c6308818509' 2025-03-14T05:32:06.5168744Z Submodule path 'third_party/googletest': checked out 'b514bdc898e2951020cbdca1304b75f5950d1f59' 2025-03-14T05:32:06.5330461Z Submodule path 'third_party/ideep': checked out '719d8e6cd7f7a0e01b155657526d693acf97c2b3' 2025-03-14T05:32:06.5350995Z Submodule 'mkl-dnn' (https://github.com/intel/mkl-dnn.git) registered for path 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:06.5382067Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep/mkl-dnn'... 
2025-03-14T05:32:21.7661429Z Submodule path 'third_party/ideep/mkl-dnn': checked out '8d263e693366ef8db40acc569cc7d8edf644556d' 2025-03-14T05:32:21.7883675Z Submodule path 'third_party/ittapi': checked out '5b8a7d7422611c3a0d799fb5fc5dd4abfae35b42' 2025-03-14T05:32:21.8859435Z Submodule path 'third_party/kineto': checked out '2859721fd9e73d3ca1c56f827dbc64e6d68f78a2' 2025-03-14T05:32:21.8884346Z Submodule 'libkineto/third_party/dynolog' (https://github.com/facebookincubator/dynolog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:21.8886121Z Submodule 'libkineto/third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:21.8889913Z Submodule 'libkineto/third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:21.8921508Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog'... 2025-03-14T05:32:22.5991943Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/fmt'... 2025-03-14T05:32:23.3354601Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/googletest'... 2025-03-14T05:32:23.4285178Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog': checked out '7d04a0053a845370ae06ce317a22a48e9edcc74e' 2025-03-14T05:32:23.4307500Z Submodule 'third_party/DCGM' (https://github.com/NVIDIA/DCGM.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:23.4310199Z Submodule 'third_party/cpr' (https://github.com/libcpr/cpr.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:23.4313732Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:23.4317406Z Submodule 'third_party/gflags' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:23.4320919Z Submodule 'third_party/glog' (https://github.com/google/glog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:23.4324729Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:23.4328797Z Submodule 'third_party/json' (https://github.com/nlohmann/json.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:23.4332287Z Submodule 'third_party/pfs' (https://github.com/dtrugman/pfs.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:23.4364345Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM'... 2025-03-14T05:32:24.6540512Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/pfs'... 2025-03-14T05:32:24.6542002Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags'... 
2025-03-14T05:32:24.6543616Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/cpr'... 2025-03-14T05:32:24.6545210Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/glog'... 2025-03-14T05:32:24.7542670Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/fmt'... 2025-03-14T05:32:24.9899020Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/googletest'... 2025-03-14T05:32:25.0899311Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/json'... 2025-03-14T05:32:30.8957611Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM': checked out 'ffde4e54bc7249a6039a5e6b45b395141e1217f9' 2025-03-14T05:32:30.9198323Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr': checked out '871ed52d350214a034f6ef8a3b8f51c5ce1bd400' 2025-03-14T05:32:30.9633221Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt': checked out 'cd4af11efc9c622896a3e4cb599fa28668ca3d05' 2025-03-14T05:32:30.9806440Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags': checked out 'e171aa2d15ed9eb17054558e0b3a6a413bb01067' 2025-03-14T05:32:30.9826521Z Submodule 'doc' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:30.9857393Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc'... 
2025-03-14T05:32:31.2774534Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc': checked out '8411df715cf522606e3b1aca386ddfc0b63d34b4' 2025-03-14T05:32:31.3001775Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog': checked out 'b33e3bad4c46c8a6345525fd822af355e5ef9446' 2025-03-14T05:32:31.3479094Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest': checked out '58d77fa8070e8cec2dc1ed015d66b454c8d78850' 2025-03-14T05:32:31.4695572Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json': checked out '4f8fba14066156b73f1189a2b8bd568bde5284c5' 2025-03-14T05:32:31.4902253Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs': checked out 'f68a2fa8ea36c783bdd760371411fcb495aa3150' 2025-03-14T05:32:31.5361515Z Submodule path 'third_party/kineto/libkineto/third_party/fmt': checked out '0041a40c1350ba702d475b9c4ad62da77caea164' 2025-03-14T05:32:31.6041327Z Submodule path 'third_party/kineto/libkineto/third_party/googletest': checked out '7aca84427f224eeed3144123d5230d5871e93347' 2025-03-14T05:32:31.6500537Z Submodule path 'third_party/kleidiai': checked out 'ef685a13cfbe8d418aa2ed34350e21e4938358b6' 2025-03-14T05:32:31.6960326Z Submodule path 'third_party/mimalloc': checked out 'b66e3214d8a104669c2ec05ae91ebc26a8f5ab78' 2025-03-14T05:32:31.8289332Z Submodule path 'third_party/nlohmann': checked out '87cda1d6646592ac5866dc703c8e1839046a6806' 2025-03-14T05:32:32.3517814Z Submodule path 'third_party/onnx': checked out 'b8baa8446686496da4cc8fda09f2b6fe65c2a02c' 2025-03-14T05:32:32.3556661Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:32.3587166Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx/third_party/pybind11'... 
2025-03-14T05:32:33.5169634Z Submodule path 'third_party/onnx/third_party/pybind11': checked out '3e9dfa2866941655c56877882565e7577de6fc7b' 2025-03-14T05:32:33.6098558Z Submodule path 'third_party/opentelemetry-cpp': checked out 'a799f4aed9c94b765dcdaabaeab7d5e7e2310878' 2025-03-14T05:32:33.6123675Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark) registered for path 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:33.6126546Z Submodule 'third_party/googletest' (https://github.com/google/googletest) registered for path 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:33.6130173Z Submodule 'third_party/ms-gsl' (https://github.com/microsoft/GSL) registered for path 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:33.6133987Z Submodule 'third_party/nlohmann-json' (https://github.com/nlohmann/json) registered for path 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:33.6138030Z Submodule 'third_party/opentelemetry-proto' (https://github.com/open-telemetry/opentelemetry-proto) registered for path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T05:32:33.6141975Z Submodule 'third_party/opentracing-cpp' (https://github.com/opentracing/opentracing-cpp.git) registered for path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:33.6146078Z Submodule 'third_party/prometheus-cpp' (https://github.com/jupp0r/prometheus-cpp) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:33.6150527Z Submodule 'tools/vcpkg' (https://github.com/Microsoft/vcpkg) registered for path 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:33.6184910Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/benchmark'... 2025-03-14T05:32:34.2438631Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentracing-cpp'... 2025-03-14T05:32:34.2439693Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentelemetry-proto'... 2025-03-14T05:32:34.2440717Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp'... 2025-03-14T05:32:34.2441993Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/ms-gsl'... 2025-03-14T05:32:34.3440047Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/googletest'... 2025-03-14T05:32:35.1839168Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/nlohmann-json'... 2025-03-14T05:32:41.4444919Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/tools/vcpkg'... 
2025-03-14T05:32:42.0334994Z Submodule path 'third_party/opentelemetry-cpp/third_party/benchmark': checked out 'd572f4777349d43653b21d6c2fc63020ab326db2' 2025-03-14T05:32:42.0803810Z Submodule path 'third_party/opentelemetry-cpp/third_party/googletest': checked out 'b796f7d44681514f58a683a3a71ff17c94edb0c1' 2025-03-14T05:32:42.0998847Z Submodule path 'third_party/opentelemetry-cpp/third_party/ms-gsl': checked out '6f4529395c5b7c2d661812257cd6780c67e54afa' 2025-03-14T05:32:42.2264748Z Submodule path 'third_party/opentelemetry-cpp/third_party/nlohmann-json': checked out 'bc889afb4c5bf1c0d8ee29ef35eaaf4c8bef8a5d' 2025-03-14T05:32:42.2436448Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto': checked out '4ca4f0335c63cda7ab31ea7ed70d6553aee14dce' 2025-03-14T05:32:42.2620618Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp': checked out '06b57f48ded1fa3bdd3d4346f6ef29e40e08eaf5' 2025-03-14T05:32:42.2825726Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp': checked out 'c9ffcdda9086ffd9e1283ea7a0276d831f3c8a8d' 2025-03-14T05:32:42.2846798Z Submodule 'civetweb' (https://github.com/civetweb/civetweb.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:42.2849079Z Submodule 'googletest' (https://github.com/google/googletest.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:42.2879879Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'... 2025-03-14T05:32:44.1500449Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'... 2025-03-14T05:32:44.4328294Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb': checked out 'eefb26f82b233268fc98577d265352720d477ba4' 2025-03-14T05:32:44.4874938Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest': checked out 'e2239ee6043f73722e7aa812a459f54a28552929' 2025-03-14T05:32:45.1213562Z Submodule path 'third_party/opentelemetry-cpp/tools/vcpkg': checked out '8eb57355a4ffb410a2e94c07b4dca2dffbee8e50' 2025-03-14T05:32:45.1366472Z Submodule path 'third_party/pocketfft': checked out '9d3ab05a7fffbc71a492bc6a17be034e83e8f0fe' 2025-03-14T05:32:45.4570289Z Submodule path 'third_party/protobuf': checked out 'd1eca4e4b421cd2997495c4b4e65cea6be4e9b8a' 2025-03-14T05:32:45.4594696Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:45.4597931Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:45.4627911Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/benchmark'... 2025-03-14T05:32:46.0433793Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/googletest'... 
2025-03-14T05:32:46.5619801Z Submodule path 'third_party/protobuf/third_party/benchmark': checked out '5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8' 2025-03-14T05:32:46.6435865Z Submodule path 'third_party/protobuf/third_party/googletest': checked out '5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081' 2025-03-14T05:32:46.6561913Z Submodule path 'third_party/psimd': checked out '072586a71b55b7f8c584153d223e95687148a900' 2025-03-14T05:32:46.6716884Z Submodule path 'third_party/pthreadpool': checked out '4fe0e1e183925bf8cfa6aae24237e724a96479b8' 2025-03-14T05:32:46.7167537Z Submodule path 'third_party/pybind11': checked out 'a2e59f0e7065404b44dfe92a28aca47ba1378dc4' 2025-03-14T05:32:46.7512031Z Submodule path 'third_party/python-peachpy': checked out 'f45429b087dd7d5bc78bb40dc7cf06425c252d67' 2025-03-14T05:32:46.8030165Z Submodule path 'third_party/sleef': checked out '56e1f79cb140fb9326d612d0be06b5250565cade' 2025-03-14T05:32:46.8380364Z Submodule path 'third_party/tensorpipe': checked out '52791a2fd214b2a9dc5759d36725909c1daa7f2e' 2025-03-14T05:32:46.8401082Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/tensorpipe/third_party/googletest' 2025-03-14T05:32:46.8404201Z Submodule 'third_party/libnop' (https://github.com/google/libnop.git) registered for path 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:46.8407656Z Submodule 'third_party/libuv' (https://github.com/libuv/libuv.git) registered for path 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:46.8411399Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:46.8444530Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/googletest'... 2025-03-14T05:32:48.0726112Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libnop'... 2025-03-14T05:32:48.0727134Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11'... 2025-03-14T05:32:48.1726721Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libuv'... 2025-03-14T05:32:48.9133515Z Submodule path 'third_party/tensorpipe/third_party/googletest': checked out 'aee0f9d9b5b87796ee8a0ab26b7587ec30e8858e' 2025-03-14T05:32:48.9333893Z Submodule path 'third_party/tensorpipe/third_party/libnop': checked out '910b55815be16109f04f4180e9adee14fb4ce281' 2025-03-14T05:32:49.0049113Z Submodule path 'third_party/tensorpipe/third_party/libuv': checked out '1dff88e5161cba5c59276d2070d2e304e4dcb242' 2025-03-14T05:32:49.0401068Z Submodule path 'third_party/tensorpipe/third_party/pybind11': checked out 'a23996fce38ff6ccfbcdc09f1e63f2c4be5ea2ef' 2025-03-14T05:32:49.0422257Z Submodule 'tools/clang' (https://github.com/wjakob/clang-cindex-python3) registered for path 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:49.0452514Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11/tools/clang'... 
2025-03-14T05:32:49.2541706Z Submodule path 'third_party/tensorpipe/third_party/pybind11/tools/clang': checked out '6a00cbc4a9b8e68b71caf7f774b3f9c753ae84d5' 2025-03-14T05:32:49.2589726Z [command]/usr/bin/git submodule foreach --recursive git config --local gc.auto 0 2025-03-14T05:32:49.2946585Z Entering 'android/libs/fbjni' 2025-03-14T05:32:49.2998393Z Entering 'third_party/FP16' 2025-03-14T05:32:49.3049170Z Entering 'third_party/FXdiv' 2025-03-14T05:32:49.3102450Z Entering 'third_party/NNPACK' 2025-03-14T05:32:49.3154549Z Entering 'third_party/NVTX' 2025-03-14T05:32:49.3207590Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T05:32:49.3260080Z Entering 'third_party/XNNPACK' 2025-03-14T05:32:49.3329111Z Entering 'third_party/benchmark' 2025-03-14T05:32:49.3382245Z Entering 'third_party/composable_kernel' 2025-03-14T05:32:49.3440656Z Entering 'third_party/cpp-httplib' 2025-03-14T05:32:49.3492948Z Entering 'third_party/cpuinfo' 2025-03-14T05:32:49.3544944Z Entering 'third_party/cudnn_frontend' 2025-03-14T05:32:49.3598382Z Entering 'third_party/cutlass' 2025-03-14T05:32:49.3658245Z Entering 'third_party/eigen' 2025-03-14T05:32:49.3712500Z Entering 'third_party/fbgemm' 2025-03-14T05:32:49.3763790Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:32:49.3815009Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:32:49.3868493Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:32:49.3926615Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:32:49.3977802Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:32:49.4031341Z Entering 'third_party/flash-attention' 2025-03-14T05:32:49.4083404Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:49.4141332Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:49.4204546Z Entering 'third_party/flatbuffers' 2025-03-14T05:32:49.4259268Z Entering 'third_party/fmt' 2025-03-14T05:32:49.4313135Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:32:49.4364881Z Entering 'third_party/gloo' 2025-03-14T05:32:49.4417347Z Entering 'third_party/googletest' 2025-03-14T05:32:49.4472864Z Entering 'third_party/ideep' 2025-03-14T05:32:49.4523856Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:49.4582333Z Entering 'third_party/ittapi' 2025-03-14T05:32:49.4633588Z Entering 'third_party/kineto' 2025-03-14T05:32:49.4685002Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:49.4734875Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:49.4787638Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:49.4840089Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:49.4891258Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:49.4940300Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:49.4996877Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:49.5051265Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:49.5104600Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:49.5156324Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:49.5212943Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:49.5263044Z 
Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:49.5318306Z Entering 'third_party/kleidiai' 2025-03-14T05:32:49.5369980Z Entering 'third_party/mimalloc' 2025-03-14T05:32:49.5423385Z Entering 'third_party/nlohmann' 2025-03-14T05:32:49.5476599Z Entering 'third_party/onnx' 2025-03-14T05:32:49.5544219Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:49.5605637Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T05:32:49.5658286Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:49.5709012Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:49.5759078Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:49.5809656Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:49.5860606Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T05:32:49.5910721Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:49.5959982Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:49.6008809Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:49.6061110Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:49.6114857Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:49.6188564Z Entering 'third_party/pocketfft' 2025-03-14T05:32:49.6241147Z Entering 'third_party/protobuf' 2025-03-14T05:32:49.6297789Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:49.6347168Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:49.6403329Z Entering 'third_party/psimd' 2025-03-14T05:32:49.6455949Z Entering 'third_party/pthreadpool' 2025-03-14T05:32:49.6508751Z Entering 'third_party/pybind11' 2025-03-14T05:32:49.6561820Z Entering 'third_party/python-peachpy' 2025-03-14T05:32:49.6614510Z Entering 'third_party/sleef' 2025-03-14T05:32:49.6666931Z Entering 'third_party/tensorpipe' 2025-03-14T05:32:49.6723141Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T05:32:49.6773231Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:49.6822800Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:49.6874496Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:49.6922763Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:49.6995318Z ##[endgroup] 2025-03-14T05:32:49.6995820Z ##[group]Persisting credentials for submodules 2025-03-14T05:32:49.7001606Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'url\.https\:\/\/github\.com\/\.insteadOf' && git config --local --unset-all 'url.https://github.com/.insteadOf' || :" 2025-03-14T05:32:49.7360300Z Entering 'android/libs/fbjni' 2025-03-14T05:32:49.7429991Z Entering 'third_party/FP16' 2025-03-14T05:32:49.7499258Z Entering 'third_party/FXdiv' 2025-03-14T05:32:49.7567417Z Entering 'third_party/NNPACK' 2025-03-14T05:32:49.7636498Z Entering 'third_party/NVTX' 2025-03-14T05:32:49.7707531Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T05:32:49.7776215Z Entering 'third_party/XNNPACK' 2025-03-14T05:32:49.7859564Z Entering 'third_party/benchmark' 2025-03-14T05:32:49.7930712Z Entering 'third_party/composable_kernel' 2025-03-14T05:32:49.8006074Z Entering 'third_party/cpp-httplib' 2025-03-14T05:32:49.8075145Z Entering 
'third_party/cpuinfo' 2025-03-14T05:32:49.8144196Z Entering 'third_party/cudnn_frontend' 2025-03-14T05:32:49.8213993Z Entering 'third_party/cutlass' 2025-03-14T05:32:49.8292299Z Entering 'third_party/eigen' 2025-03-14T05:32:49.8362655Z Entering 'third_party/fbgemm' 2025-03-14T05:32:49.8437442Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:32:49.8504425Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:32:49.8571581Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:32:49.8645399Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:32:49.8711375Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:32:49.8782255Z Entering 'third_party/flash-attention' 2025-03-14T05:32:49.8849604Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:49.8922679Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:49.9001978Z Entering 'third_party/flatbuffers' 2025-03-14T05:32:49.9073735Z Entering 'third_party/fmt' 2025-03-14T05:32:49.9143346Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:32:49.9211828Z Entering 'third_party/gloo' 2025-03-14T05:32:49.9280166Z Entering 'third_party/googletest' 2025-03-14T05:32:49.9347707Z Entering 'third_party/ideep' 2025-03-14T05:32:49.9413209Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:49.9493893Z Entering 'third_party/ittapi' 2025-03-14T05:32:49.9562011Z Entering 'third_party/kineto' 2025-03-14T05:32:49.9630058Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:49.9695109Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:49.9764902Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:49.9834009Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:49.9902896Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:49.9969157Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:50.0040622Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:50.0109468Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:50.0178529Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:50.0247807Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:50.0319294Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:50.0386405Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:50.0455774Z Entering 'third_party/kleidiai' 2025-03-14T05:32:50.0527861Z Entering 'third_party/mimalloc' 2025-03-14T05:32:50.0596822Z Entering 'third_party/nlohmann' 2025-03-14T05:32:50.0666252Z Entering 'third_party/onnx' 2025-03-14T05:32:50.0748033Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:50.0821935Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T05:32:50.0892224Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:50.0959824Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:50.1026819Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:50.1095654Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:50.1164446Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 
2025-03-14T05:32:50.1233106Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:50.1300673Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:50.1365775Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:50.1435298Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:50.1506930Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:50.1597215Z Entering 'third_party/pocketfft' 2025-03-14T05:32:50.1666222Z Entering 'third_party/protobuf' 2025-03-14T05:32:50.1735034Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:50.1801261Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:50.1872439Z Entering 'third_party/psimd' 2025-03-14T05:32:50.1941040Z Entering 'third_party/pthreadpool' 2025-03-14T05:32:50.2010288Z Entering 'third_party/pybind11' 2025-03-14T05:32:50.2079154Z Entering 'third_party/python-peachpy' 2025-03-14T05:32:50.2147297Z Entering 'third_party/sleef' 2025-03-14T05:32:50.2214847Z Entering 'third_party/tensorpipe' 2025-03-14T05:32:50.2283228Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T05:32:50.2351050Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:50.2420708Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:50.2487371Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:50.2551111Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:50.2645476Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local 'http.https://github.com/.extraheader' 'AUTHORIZATION: basic ***' && git config --local --show-origin --name-only --get-regexp remote.origin.url" 2025-03-14T05:32:50.3001670Z Entering 'android/libs/fbjni' 2025-03-14T05:32:50.3064012Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/android/libs/fbjni/config remote.origin.url 2025-03-14T05:32:50.3087082Z Entering 'third_party/FP16' 2025-03-14T05:32:50.3149856Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FP16/config remote.origin.url 2025-03-14T05:32:50.3172291Z Entering 'third_party/FXdiv' 2025-03-14T05:32:50.3234101Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FXdiv/config remote.origin.url 2025-03-14T05:32:50.3255543Z Entering 'third_party/NNPACK' 2025-03-14T05:32:50.3318924Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK/config remote.origin.url 2025-03-14T05:32:50.3340830Z Entering 'third_party/NVTX' 2025-03-14T05:32:50.3402804Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NVTX/config remote.origin.url 2025-03-14T05:32:50.3427463Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T05:32:50.3490637Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/VulkanMemoryAllocator/config remote.origin.url 2025-03-14T05:32:50.3512375Z Entering 'third_party/XNNPACK' 2025-03-14T05:32:50.3573806Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/XNNPACK/config remote.origin.url 2025-03-14T05:32:50.3609791Z Entering 'third_party/benchmark' 2025-03-14T05:32:50.3670988Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/benchmark/config remote.origin.url 2025-03-14T05:32:50.3693261Z 
Entering 'third_party/composable_kernel' 2025-03-14T05:32:50.3754126Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/composable_kernel/config remote.origin.url 2025-03-14T05:32:50.3783886Z Entering 'third_party/cpp-httplib' 2025-03-14T05:32:50.3849999Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpp-httplib/config remote.origin.url 2025-03-14T05:32:50.3871581Z Entering 'third_party/cpuinfo' 2025-03-14T05:32:50.3938143Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpuinfo/config remote.origin.url 2025-03-14T05:32:50.3959838Z Entering 'third_party/cudnn_frontend' 2025-03-14T05:32:50.4022101Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cudnn_frontend/config remote.origin.url 2025-03-14T05:32:50.4044151Z Entering 'third_party/cutlass' 2025-03-14T05:32:50.4107928Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cutlass/config remote.origin.url 2025-03-14T05:32:50.4141372Z Entering 'third_party/eigen' 2025-03-14T05:32:50.4205644Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/eigen/config remote.origin.url 2025-03-14T05:32:50.4229276Z Entering 'third_party/fbgemm' 2025-03-14T05:32:50.4292278Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/config remote.origin.url 2025-03-14T05:32:50.4312979Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:32:50.4373747Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/third_party/asmjit/config remote.origin.url 2025-03-14T05:32:50.4394369Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:32:50.4454827Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/third_party/cpuinfo/config remote.origin.url 2025-03-14T05:32:50.4476280Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:32:50.4537959Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/third_party/cutlass/config remote.origin.url 2025-03-14T05:32:50.4565407Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:32:50.4627669Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.4647442Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:32:50.4708759Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/third_party/hipify_torch/config remote.origin.url 2025-03-14T05:32:50.4731717Z Entering 'third_party/flash-attention' 2025-03-14T05:32:50.4797789Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/config remote.origin.url 2025-03-14T05:32:50.4819270Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:50.4882343Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/composable_kernel/config remote.origin.url 2025-03-14T05:32:50.4910681Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:50.4972853Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/cutlass/config remote.origin.url 2025-03-14T05:32:50.5004320Z Entering 'third_party/flatbuffers' 2025-03-14T05:32:50.5067118Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flatbuffers/config remote.origin.url 2025-03-14T05:32:50.5092007Z Entering 'third_party/fmt' 2025-03-14T05:32:50.5153157Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fmt/config remote.origin.url 2025-03-14T05:32:50.5175955Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:32:50.5238889Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gemmlowp/gemmlowp/config remote.origin.url 2025-03-14T05:32:50.5261049Z Entering 'third_party/gloo' 2025-03-14T05:32:50.5323811Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gloo/config remote.origin.url 2025-03-14T05:32:50.5345604Z Entering 'third_party/googletest' 2025-03-14T05:32:50.5407796Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.5429552Z Entering 'third_party/ideep' 2025-03-14T05:32:50.5492713Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/config remote.origin.url 2025-03-14T05:32:50.5512211Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:50.5573781Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/modules/mkl-dnn/config remote.origin.url 2025-03-14T05:32:50.5604756Z Entering 'third_party/ittapi' 2025-03-14T05:32:50.5666475Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ittapi/config remote.origin.url 2025-03-14T05:32:50.5689084Z Entering 'third_party/kineto' 2025-03-14T05:32:50.5751489Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/config remote.origin.url 2025-03-14T05:32:50.5773925Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:50.5838434Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/config remote.origin.url 2025-03-14T05:32:50.5858015Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:50.5925172Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/DCGM/config remote.origin.url 2025-03-14T05:32:50.5948568Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:50.6012805Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/cpr/config remote.origin.url 2025-03-14T05:32:50.6034572Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:50.6097696Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/fmt/config remote.origin.url 2025-03-14T05:32:50.6121407Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:50.6185652Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/config remote.origin.url 2025-03-14T05:32:50.6205510Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:50.6269165Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/modules/doc/config remote.origin.url 2025-03-14T05:32:50.6294006Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:50.6356716Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/glog/config remote.origin.url 2025-03-14T05:32:50.6379005Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:50.6441583Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.6463466Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:50.6526028Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/json/config remote.origin.url 2025-03-14T05:32:50.6548893Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:50.6611598Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/pfs/config remote.origin.url 2025-03-14T05:32:50.6636163Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:50.6703301Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/fmt/config remote.origin.url 2025-03-14T05:32:50.6724331Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:50.6785608Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.6810557Z Entering 'third_party/kleidiai' 2025-03-14T05:32:50.6872319Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kleidiai/config remote.origin.url 2025-03-14T05:32:50.6894254Z Entering 'third_party/mimalloc' 2025-03-14T05:32:50.6960935Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/mimalloc/config remote.origin.url 2025-03-14T05:32:50.6982936Z Entering 'third_party/nlohmann' 2025-03-14T05:32:50.7044019Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/nlohmann/config remote.origin.url 2025-03-14T05:32:50.7066360Z Entering 'third_party/onnx' 2025-03-14T05:32:50.7130177Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/config remote.origin.url 2025-03-14T05:32:50.7164455Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:50.7229015Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/modules/third_party/pybind11/config remote.origin.url 2025-03-14T05:32:50.7255152Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T05:32:50.7321902Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/config remote.origin.url 2025-03-14T05:32:50.7343583Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:50.7408709Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/benchmark/config remote.origin.url 
2025-03-14T05:32:50.7429842Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:50.7492822Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.7513529Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:50.7576229Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/ms-gsl/config remote.origin.url 2025-03-14T05:32:50.7596958Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:50.7659699Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/nlohmann-json/config remote.origin.url 2025-03-14T05:32:50.7680949Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T05:32:50.7742869Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentelemetry-proto/config remote.origin.url 2025-03-14T05:32:50.7763392Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:50.7824283Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentracing-cpp/config remote.origin.url 2025-03-14T05:32:50.7844439Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:50.7908888Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/config remote.origin.url 2025-03-14T05:32:50.7927504Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:50.7990420Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/civetweb/config remote.origin.url 2025-03-14T05:32:50.8013883Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:50.8076925Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/googletest/config remote.origin.url 2025-03-14T05:32:50.8101479Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:50.8164201Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/tools/vcpkg/config remote.origin.url 2025-03-14T05:32:50.8207074Z Entering 'third_party/pocketfft' 2025-03-14T05:32:50.8270144Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pocketfft/config remote.origin.url 2025-03-14T05:32:50.8296049Z Entering 'third_party/protobuf' 2025-03-14T05:32:50.8358802Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/config remote.origin.url 2025-03-14T05:32:50.8383215Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:50.8445170Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/benchmark/config remote.origin.url 2025-03-14T05:32:50.8465641Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:50.8526743Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/googletest/config remote.origin.url 
2025-03-14T05:32:50.8551361Z Entering 'third_party/psimd' 2025-03-14T05:32:50.8615004Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/psimd/config remote.origin.url 2025-03-14T05:32:50.8637191Z Entering 'third_party/pthreadpool' 2025-03-14T05:32:50.8703878Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/pthreadpool/config remote.origin.url 2025-03-14T05:32:50.8726285Z Entering 'third_party/pybind11' 2025-03-14T05:32:50.8792315Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pybind11/config remote.origin.url 2025-03-14T05:32:50.8814635Z Entering 'third_party/python-peachpy' 2025-03-14T05:32:50.8877755Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/python-peachpy/config remote.origin.url 2025-03-14T05:32:50.8900103Z Entering 'third_party/sleef' 2025-03-14T05:32:50.8961429Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/sleef/config remote.origin.url 2025-03-14T05:32:50.8984909Z Entering 'third_party/tensorpipe' 2025-03-14T05:32:50.9047207Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/config remote.origin.url 2025-03-14T05:32:50.9067671Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T05:32:50.9129491Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/googletest/config remote.origin.url 2025-03-14T05:32:50.9150332Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:50.9212430Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libnop/config remote.origin.url 2025-03-14T05:32:50.9233009Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:50.9295674Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libuv/config remote.origin.url 2025-03-14T05:32:50.9317298Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:50.9379953Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/config remote.origin.url 2025-03-14T05:32:50.9398637Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:50.9463184Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/modules/tools/clang/config remote.origin.url 2025-03-14T05:32:51.0161092Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'git@github.com:' 2025-03-14T05:32:51.0524501Z Entering 'android/libs/fbjni' 2025-03-14T05:32:51.0576342Z Entering 'third_party/FP16' 2025-03-14T05:32:51.0628438Z Entering 'third_party/FXdiv' 2025-03-14T05:32:51.0680355Z Entering 'third_party/NNPACK' 2025-03-14T05:32:51.0732487Z Entering 'third_party/NVTX' 2025-03-14T05:32:51.0784771Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T05:32:51.0838023Z Entering 'third_party/XNNPACK' 2025-03-14T05:32:51.0906869Z Entering 'third_party/benchmark' 2025-03-14T05:32:51.0958753Z Entering 'third_party/composable_kernel' 2025-03-14T05:32:51.1016712Z Entering 'third_party/cpp-httplib' 2025-03-14T05:32:51.1069619Z Entering 'third_party/cpuinfo' 2025-03-14T05:32:51.1121778Z Entering 'third_party/cudnn_frontend' 2025-03-14T05:32:51.1173805Z Entering 
'third_party/cutlass' 2025-03-14T05:32:51.1232887Z Entering 'third_party/eigen' 2025-03-14T05:32:51.1286223Z Entering 'third_party/fbgemm' 2025-03-14T05:32:51.1337796Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:32:51.1388927Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:32:51.1447686Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:32:51.1498273Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:32:51.1548505Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:32:51.1602736Z Entering 'third_party/flash-attention' 2025-03-14T05:32:51.1655339Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:51.1712506Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:51.1775180Z Entering 'third_party/flatbuffers' 2025-03-14T05:32:51.1830036Z Entering 'third_party/fmt' 2025-03-14T05:32:51.1883241Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:32:51.1936764Z Entering 'third_party/gloo' 2025-03-14T05:32:51.1989069Z Entering 'third_party/googletest' 2025-03-14T05:32:51.2040622Z Entering 'third_party/ideep' 2025-03-14T05:32:51.2090238Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:51.2152701Z Entering 'third_party/ittapi' 2025-03-14T05:32:51.2206218Z Entering 'third_party/kineto' 2025-03-14T05:32:51.2256434Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:51.2307805Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:51.2360358Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:51.2410726Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:51.2461751Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:51.2509366Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:51.2563833Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:51.2614228Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:51.2664052Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:51.2717279Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:51.2770664Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:51.2821115Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:51.2875238Z Entering 'third_party/kleidiai' 2025-03-14T05:32:51.2927465Z Entering 'third_party/mimalloc' 2025-03-14T05:32:51.2979479Z Entering 'third_party/nlohmann' 2025-03-14T05:32:51.3032346Z Entering 'third_party/onnx' 2025-03-14T05:32:51.3098025Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:51.3152400Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T05:32:51.3207605Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:51.3260245Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:51.3319313Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:51.3373482Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:51.3423458Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T05:32:51.3473568Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:51.3523705Z 
Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:51.3573560Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:51.3625950Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:51.3680116Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:51.3751258Z Entering 'third_party/pocketfft' 2025-03-14T05:32:51.3804157Z Entering 'third_party/protobuf' 2025-03-14T05:32:51.3858350Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:51.3912565Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:51.3964777Z Entering 'third_party/psimd' 2025-03-14T05:32:51.4019356Z Entering 'third_party/pthreadpool' 2025-03-14T05:32:51.4072176Z Entering 'third_party/pybind11' 2025-03-14T05:32:51.4124127Z Entering 'third_party/python-peachpy' 2025-03-14T05:32:51.4177071Z Entering 'third_party/sleef' 2025-03-14T05:32:51.4228876Z Entering 'third_party/tensorpipe' 2025-03-14T05:32:51.4279717Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T05:32:51.4331403Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:51.4381633Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:51.4431163Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:51.4482607Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:51.4557870Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'org-21003710@github.com:' 2025-03-14T05:32:51.4920751Z Entering 'android/libs/fbjni' 2025-03-14T05:32:51.4974005Z Entering 'third_party/FP16' 2025-03-14T05:32:51.5026351Z Entering 'third_party/FXdiv' 2025-03-14T05:32:51.5078669Z Entering 'third_party/NNPACK' 2025-03-14T05:32:51.5130883Z Entering 'third_party/NVTX' 2025-03-14T05:32:51.5183827Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T05:32:51.5236020Z Entering 'third_party/XNNPACK' 2025-03-14T05:32:51.5303863Z Entering 'third_party/benchmark' 2025-03-14T05:32:51.5355498Z Entering 'third_party/composable_kernel' 2025-03-14T05:32:51.5415087Z Entering 'third_party/cpp-httplib' 2025-03-14T05:32:51.5469696Z Entering 'third_party/cpuinfo' 2025-03-14T05:32:51.5522296Z Entering 'third_party/cudnn_frontend' 2025-03-14T05:32:51.5575524Z Entering 'third_party/cutlass' 2025-03-14T05:32:51.5636003Z Entering 'third_party/eigen' 2025-03-14T05:32:51.5690913Z Entering 'third_party/fbgemm' 2025-03-14T05:32:51.5743662Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T05:32:51.5795076Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T05:32:51.5850973Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T05:32:51.5909961Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T05:32:51.5960855Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T05:32:51.6014424Z Entering 'third_party/flash-attention' 2025-03-14T05:32:51.6066067Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T05:32:51.6125279Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T05:32:51.6186740Z Entering 'third_party/flatbuffers' 2025-03-14T05:32:51.6240337Z Entering 'third_party/fmt' 2025-03-14T05:32:51.6292157Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T05:32:51.6343262Z Entering 'third_party/gloo' 2025-03-14T05:32:51.6395014Z Entering 'third_party/googletest' 2025-03-14T05:32:51.6445815Z Entering 
'third_party/ideep' 2025-03-14T05:32:51.6495539Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T05:32:51.6555706Z Entering 'third_party/ittapi' 2025-03-14T05:32:51.6610914Z Entering 'third_party/kineto' 2025-03-14T05:32:51.6662669Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T05:32:51.6713614Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T05:32:51.6767043Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T05:32:51.6820733Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T05:32:51.6872135Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T05:32:51.6920800Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T05:32:51.6976302Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T05:32:51.7026594Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T05:32:51.7078302Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T05:32:51.7130380Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T05:32:51.7184366Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T05:32:51.7240021Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T05:32:51.7293683Z Entering 'third_party/kleidiai' 2025-03-14T05:32:51.7346550Z Entering 'third_party/mimalloc' 2025-03-14T05:32:51.7400869Z Entering 'third_party/nlohmann' 2025-03-14T05:32:51.7454515Z Entering 'third_party/onnx' 2025-03-14T05:32:51.7523839Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T05:32:51.7584162Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T05:32:51.7637844Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T05:32:51.7688468Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T05:32:51.7738064Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T05:32:51.7788218Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T05:32:51.7839978Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T05:32:51.7890567Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T05:32:51.7941280Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T05:32:51.7991149Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T05:32:51.8045233Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T05:32:51.8099612Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T05:32:51.8171998Z Entering 'third_party/pocketfft' 2025-03-14T05:32:51.8224526Z Entering 'third_party/protobuf' 2025-03-14T05:32:51.8279162Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T05:32:51.8329250Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T05:32:51.8384692Z Entering 'third_party/psimd' 2025-03-14T05:32:51.8437942Z Entering 'third_party/pthreadpool' 2025-03-14T05:32:51.8490822Z Entering 'third_party/pybind11' 2025-03-14T05:32:51.8543942Z Entering 'third_party/python-peachpy' 2025-03-14T05:32:51.8596866Z Entering 'third_party/sleef' 2025-03-14T05:32:51.8649669Z Entering 'third_party/tensorpipe' 2025-03-14T05:32:51.8701756Z Entering 'third_party/tensorpipe/third_party/googletest' 
2025-03-14T05:32:51.8751434Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T05:32:51.8804164Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T05:32:51.8854563Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T05:32:51.8904545Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T05:32:51.8981786Z ##[endgroup] 2025-03-14T05:32:51.9027062Z [command]/usr/bin/git log -1 --format=%H 2025-03-14T05:32:51.9055538Z aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:32:51.9280747Z Prepare all required actions 2025-03-14T05:32:51.9281322Z Getting action download info 2025-03-14T05:32:52.0636549Z ##[group]Run ./.github/actions/setup-linux 2025-03-14T05:32:52.0636881Z env: 2025-03-14T05:32:52.0637114Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:52.0637387Z ##[endgroup] 2025-03-14T05:32:52.0679680Z ##[group]Run set -euo pipefail 2025-03-14T05:32:52.0680046Z set -euo pipefail 2025-03-14T05:32:52.0680358Z function get_ec2_metadata() { 2025-03-14T05:32:52.0680754Z  # Pulled from instance metadata endpoint for EC2 2025-03-14T05:32:52.0681448Z  # see https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-retrieval.html 2025-03-14T05:32:52.0682012Z  category=$1 2025-03-14T05:32:52.0682392Z  # If it is GCP runner (runner name contains gcp), do not run this 2025-03-14T05:32:52.0682830Z  runner_name_str=i-0995e781c94ad14d3 2025-03-14T05:32:52.0683225Z  if [[ -f /.inarc ]]; then 2025-03-14T05:32:52.0683590Z  echo "ARC Runner, no info on ec2 metadata" 2025-03-14T05:32:52.0684004Z  elif [[ $runner_name_str == *"gcp"* ]]; then 2025-03-14T05:32:52.0684473Z  echo "Runner is from Google Cloud Platform, No info on ec2 metadata" 2025-03-14T05:32:52.0684902Z  else 2025-03-14T05:32:52.0685738Z  curl -H "X-aws-ec2-metadata-token: $(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")" -fsSL "http://169.254.169.254/latest/meta-data/${category}" 2025-03-14T05:32:52.0686618Z  fi 2025-03-14T05:32:52.0686856Z } 2025-03-14T05:32:52.0687138Z echo "ami-id: $(get_ec2_metadata ami-id)" 2025-03-14T05:32:52.0687564Z echo "instance-id: $(get_ec2_metadata instance-id)" 2025-03-14T05:32:52.0688037Z echo "instance-type: $(get_ec2_metadata instance-type)" 2025-03-14T05:32:52.0688455Z echo "system info $(uname -a)" 2025-03-14T05:32:52.0699748Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:52.0700133Z env: 2025-03-14T05:32:52.0700371Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:52.0700646Z ##[endgroup] 2025-03-14T05:32:52.0865796Z ami-id: ami-08b5b3a93ed654d19 2025-03-14T05:32:52.0975890Z instance-id: i-0995e781c94ad14d3 2025-03-14T05:32:52.1087281Z instance-type: g5.4xlarge 2025-03-14T05:32:52.1099908Z system info Linux ip-10-0-78-218.ec2.internal 6.1.129-138.220.amzn2023.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Feb 25 22:18:43 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux 2025-03-14T05:32:52.1126575Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-03-14T05:32:52.1127458Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-03-14T05:32:52.1136454Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:52.1136847Z env: 2025-03-14T05:32:52.1137092Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:52.1137539Z ##[endgroup] 2025-03-14T05:32:52.1201128Z ##[group]Run if systemctl is-active --quiet 
docker; then 2025-03-14T05:32:52.1201581Z if systemctl is-active --quiet docker; then 2025-03-14T05:32:52.1201978Z  echo "Docker daemon is running..."; 2025-03-14T05:32:52.1202310Z else 2025-03-14T05:32:52.1202674Z  echo "Starting docker daemon..." && sudo systemctl start docker; 2025-03-14T05:32:52.1203093Z fi 2025-03-14T05:32:52.1211823Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:52.1212200Z env: 2025-03-14T05:32:52.1212439Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:52.1212723Z ##[endgroup] 2025-03-14T05:32:52.1305381Z Docker daemon is running... 2025-03-14T05:32:52.1348472Z ##[group]Run nick-fields/retry@v3.0.0 2025-03-14T05:32:52.1348794Z with: 2025-03-14T05:32:52.1349021Z shell: bash 2025-03-14T05:32:52.1349477Z timeout_minutes: 5 2025-03-14T05:32:52.1349744Z max_attempts: 3 2025-03-14T05:32:52.1350017Z retry_wait_seconds: 30 2025-03-14T05:32:52.1352100Z command: AWS_ACCOUNT_ID=$(aws sts get-caller-identity|grep Account|cut -f4 -d\") aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" # For LF Runners we need to make sure we also login to Meta's ECR docker registry too. META_AWS_ACCOUNT_ID=308535385114 if [ "$AWS_ACCOUNT_ID" != "$META_AWS_ACCOUNT_ID" ] ; then aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$META_AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" fi 2025-03-14T05:32:52.1354173Z polling_interval_seconds: 1 2025-03-14T05:32:52.1354608Z warning_on_retry: true 2025-03-14T05:32:52.1354894Z continue_on_error: false 2025-03-14T05:32:52.1355160Z env: 2025-03-14T05:32:52.1355401Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:52.1355685Z AWS_RETRY_MODE: standard 2025-03-14T05:32:52.1355972Z AWS_MAX_ATTEMPTS: 5 2025-03-14T05:32:52.1356248Z AWS_DEFAULT_REGION: us-east-1 2025-03-14T05:32:52.1356559Z ##[endgroup] 2025-03-14T05:32:53.2782330Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-03-14T05:32:53.2783263Z Configure a credential helper to remove this warning. See 2025-03-14T05:32:53.2784072Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-03-14T05:32:53.2784458Z 2025-03-14T05:32:53.2784569Z Login Succeeded 2025-03-14T05:32:54.2211958Z Command completed after 1 attempt(s).
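Note: the retried command above amounts to the standard AWS-CLI/Docker ECR login pattern. A minimal, hedged sketch of that pattern follows; the region (us-east-1) and Meta's account ID (308535385114) are taken from the step's own command, while the helper name ecr_login and the use of --query instead of grep/cut are illustrative choices, not the exact workflow step.

#!/usr/bin/env bash
# Sketch of the ECR login performed by the retry step above (illustrative, not the exact step).
set -euo pipefail

AWS_DEFAULT_REGION=us-east-1
META_AWS_ACCOUNT_ID=308535385114   # Meta's ECR account, as used by the step above

ecr_login() {
  # Standard ECR login: fetch a short-lived password and pipe it to docker login via stdin.
  aws ecr get-login-password --region "$AWS_DEFAULT_REGION" \
    | docker login --username AWS --password-stdin "$1.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com"
}

# Log in to the registry of the account the runner belongs to...
AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
ecr_login "$AWS_ACCOUNT_ID"

# ...and, on LF runners (a different account), also to Meta's registry so the CI image can be pulled.
if [ "$AWS_ACCOUNT_ID" != "$META_AWS_ACCOUNT_ID" ]; then
  ecr_login "$META_AWS_ACCOUNT_ID"
fi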
2025-03-14T05:32:54.2294067Z ##[group]Run env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-03-14T05:32:54.2294879Z env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-03-14T05:32:54.2295618Z env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-03-14T05:32:54.2307327Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:54.2307916Z env: 2025-03-14T05:32:54.2308287Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.2308712Z ##[endgroup] 2025-03-14T05:32:54.2416231Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-03-14T05:32:54.2416807Z # ignore expansion of "docker ps -q" since it could be empty 2025-03-14T05:32:54.2417245Z # shellcheck disable=SC2046 2025-03-14T05:32:54.2417605Z docker stop $(docker ps -q) || true 2025-03-14T05:32:54.2417965Z # Prune all of the docker images 2025-03-14T05:32:54.2418316Z docker system prune -af 2025-03-14T05:32:54.2427214Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:54.2427608Z env: 2025-03-14T05:32:54.2427845Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.2428131Z ##[endgroup] 2025-03-14T05:32:54.2722239Z "docker stop" requires at least 1 argument. 2025-03-14T05:32:54.2722650Z See 'docker stop --help'. 2025-03-14T05:32:54.2722908Z 2025-03-14T05:32:54.2723126Z Usage: docker stop [OPTIONS] CONTAINER [CONTAINER...] 2025-03-14T05:32:54.2723752Z 2025-03-14T05:32:54.2723882Z Stop one or more running containers 2025-03-14T05:32:54.2965649Z Total reclaimed space: 0B 2025-03-14T05:32:54.3014340Z ##[group]Run set +e 2025-03-14T05:32:54.3014639Z set +e 2025-03-14T05:32:54.3014881Z set -x 2025-03-14T05:32:54.3015119Z  2025-03-14T05:32:54.3015381Z PT_DOMAIN=download.pytorch.org 2025-03-14T05:32:54.3015947Z # TODO: Flaky access to download.pytorch.org https://github.com/pytorch/pytorch/issues/100400, 2025-03-14T05:32:54.3016729Z # cleaning this up once the issue is fixed. There are more than one resolved IP here, the last 2025-03-14T05:32:54.3017243Z # one is returned at random 2025-03-14T05:32:54.3017643Z RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" | tail -n1) 2025-03-14T05:32:54.3018023Z  2025-03-14T05:32:54.3018475Z if [ -z "${RESOLVED_IP}" ]; then 2025-03-14T05:32:54.3018916Z  echo "Couldn't resolve ${PT_DOMAIN}, retrying with Google DNS..." 2025-03-14T05:32:54.3019442Z  RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" @8.8.8.8 | tail -n1) 2025-03-14T05:32:54.3019836Z  2025-03-14T05:32:54.3020093Z  if [ -z "${RESOLVED_IP}" ]; then 2025-03-14T05:32:54.3020485Z  echo "Couldn't resolve ${PT_DOMAIN}, exiting..." 
2025-03-14T05:32:54.3020851Z  exit 1 2025-03-14T05:32:54.3021099Z  fi 2025-03-14T05:32:54.3021325Z fi 2025-03-14T05:32:54.3021550Z  2025-03-14T05:32:54.3021826Z if grep -r "${PT_DOMAIN}" /etc/hosts; then 2025-03-14T05:32:54.3022206Z  # Clean up any old records first 2025-03-14T05:32:54.3022577Z  sudo sed -i "/${PT_DOMAIN}/d" /etc/hosts 2025-03-14T05:32:54.3022908Z fi 2025-03-14T05:32:54.3023138Z  2025-03-14T05:32:54.3023465Z echo "${RESOLVED_IP} ${PT_DOMAIN}" | sudo tee -a /etc/hosts 2025-03-14T05:32:54.3023863Z cat /etc/hosts 2025-03-14T05:32:54.3032695Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:54.3033072Z env: 2025-03-14T05:32:54.3033312Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.3033592Z ##[endgroup] 2025-03-14T05:32:54.3060955Z + PT_DOMAIN=download.pytorch.org 2025-03-14T05:32:54.3067618Z ++ dig -4 +short download.pytorch.org 2025-03-14T05:32:54.3068818Z ++ tail -n1 2025-03-14T05:32:54.3808783Z + RESOLVED_IP=18.160.10.22 2025-03-14T05:32:54.3809091Z + '[' -z 18.160.10.22 ']' 2025-03-14T05:32:54.3809399Z + grep -r download.pytorch.org /etc/hosts 2025-03-14T05:32:54.3827593Z + echo '18.160.10.22 download.pytorch.org' 2025-03-14T05:32:54.3828100Z + sudo tee -a /etc/hosts 2025-03-14T05:32:54.7641425Z 18.160.10.22 download.pytorch.org 2025-03-14T05:32:54.7663857Z + cat /etc/hosts 2025-03-14T05:32:54.7673727Z 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4 2025-03-14T05:32:54.7679801Z ::1 localhost6 localhost6.localdomain6 2025-03-14T05:32:54.7680178Z 18.160.10.22 download.pytorch.org 2025-03-14T05:32:54.7834892Z ##[group]Run pytorch/test-infra/.github/actions/calculate-docker-image@main 2025-03-14T05:32:54.7835360Z with: 2025-03-14T05:32:54.7836173Z docker-image-name: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7837068Z docker-build-dir: .ci/docker 2025-03-14T05:32:54.7837372Z working-directory: . 2025-03-14T05:32:54.7837738Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:54.7838147Z force-push: false 2025-03-14T05:32:54.7838399Z env: 2025-03-14T05:32:54.7838639Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.7838922Z ##[endgroup] 2025-03-14T05:32:54.7862052Z ##[group]Run set -ex 2025-03-14T05:32:54.7862358Z set -ex 2025-03-14T05:32:54.7862598Z  2025-03-14T05:32:54.7863006Z # If the docker build directory or the build script doesn't exist, the action will 2025-03-14T05:32:54.7863862Z # gracefully return the docker image name as it is. Pulling docker image in Linux 2025-03-14T05:32:54.7864430Z # job could then download the pre-built image as usual 2025-03-14T05:32:54.7864945Z if [[ ! -d "${DOCKER_BUILD_DIR}" ]] || [[ ! -f "${DOCKER_BUILD_DIR}/build.sh" ]]; then 2025-03-14T05:32:54.7865424Z  echo "skip=true" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7865880Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7866291Z  2025-03-14T05:32:54.7866666Z  echo "There is no Docker build script in ${REPO_NAME} repo, skipping..." 
2025-03-14T05:32:54.7867107Z  exit 0 2025-03-14T05:32:54.7867352Z else 2025-03-14T05:32:54.7867633Z  echo "skip=false" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7868286Z fi 2025-03-14T05:32:54.7868524Z  2025-03-14T05:32:54.7868880Z if [[ "${DOCKER_IMAGE_NAME}" == *"${DOCKER_REGISTRY}/${REPO_NAME}"* ]]; then 2025-03-14T05:32:54.7869468Z  # The docker image name already includes the ECR prefix and tag, so we can just 2025-03-14T05:32:54.7869995Z  # use it as it is, but first let's extract the tag 2025-03-14T05:32:54.7870474Z  DOCKER_TAG=$(echo "${DOCKER_IMAGE_NAME}" | awk -F '[:,]' '{print $2}') 2025-03-14T05:32:54.7870978Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7871464Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7871874Z else 2025-03-14T05:32:54.7872204Z  DOCKER_TAG=$(git rev-parse HEAD:"${DOCKER_BUILD_DIR}") 2025-03-14T05:32:54.7872668Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7873293Z  echo "docker-image=${DOCKER_REGISTRY}/${REPO_NAME}/${DOCKER_IMAGE_NAME}:${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.7873834Z fi 2025-03-14T05:32:54.7887190Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:54.7887574Z env: 2025-03-14T05:32:54.7887808Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.7888089Z REPO_NAME: pytorch 2025-03-14T05:32:54.7888920Z DOCKER_IMAGE_NAME: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7889797Z DOCKER_BUILD_DIR: .ci/docker 2025-03-14T05:32:54.7890178Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:54.7890568Z ##[endgroup] 2025-03-14T05:32:54.7923762Z + [[ ! -d .ci/docker ]] 2025-03-14T05:32:54.7924100Z + [[ ! 
-f .ci/docker/build.sh ]] 2025-03-14T05:32:54.7924416Z + echo skip=false 2025-03-14T05:32:54.7925890Z + [[ 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 == *\3\0\8\5\3\5\3\8\5\1\1\4\.\d\k\r\.\e\c\r\.\u\s\-\e\a\s\t\-\1\.\a\m\a\z\o\n\a\w\s\.\c\o\m\/\p\y\t\o\r\c\h* ]] 2025-03-14T05:32:54.7932292Z ++ echo 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7933123Z ++ awk -F '[:,]' '{print $2}' 2025-03-14T05:32:54.7961486Z + DOCKER_TAG=aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7961923Z + echo docker-tag=aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7962861Z + echo docker-image=308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.7992893Z ##[group]Run set +e 2025-03-14T05:32:54.7993213Z set +e 2025-03-14T05:32:54.7993474Z set -x 2025-03-14T05:32:54.7993725Z  2025-03-14T05:32:54.7993964Z login() { 2025-03-14T05:32:54.7994564Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-03-14T05:32:54.7995246Z } 2025-03-14T05:32:54.7995477Z  2025-03-14T05:32:54.7995711Z retry () { 2025-03-14T05:32:54.7996013Z  $* || (sleep 1 && $*) || (sleep 2 && $*) 2025-03-14T05:32:54.7996338Z } 2025-03-14T05:32:54.7996566Z  2025-03-14T05:32:54.7996817Z retry login "${DOCKER_REGISTRY}" 2025-03-14T05:32:54.7997131Z  2025-03-14T05:32:54.7997382Z START_TIME=$(date +%s) 2025-03-14T05:32:54.7997692Z # Wait up to 120 minutes 2025-03-14T05:32:54.7998067Z while [[ $(( $(date +%s) - 7200 )) -lt $START_TIME ]]; do 2025-03-14T05:32:54.7998557Z  # Check if image already exists, if it does then skip building it 2025-03-14T05:32:54.7999047Z  if docker manifest inspect "${DOCKER_IMAGE}"; then 2025-03-14T05:32:54.7999417Z  exit 0 2025-03-14T05:32:54.7999671Z  fi 2025-03-14T05:32:54.7999910Z  2025-03-14T05:32:54.8000315Z  # NB: This flag is used by Docker build workflow to push the image to ECR, so we can 2025-03-14T05:32:54.8000959Z  # use this to differentiate between the Docker build and regular build jobs. For the 2025-03-14T05:32:54.8001603Z  # latter, it will wait for the Docker images to become available before continuing 2025-03-14T05:32:54.8002119Z  if [ "${DOCKER_PUSH:-false}" == "true" ]; then 2025-03-14T05:32:54.8002528Z  # It's a Docker build job, let's build the image 2025-03-14T05:32:54.8002893Z  break 2025-03-14T05:32:54.8003150Z  else 2025-03-14T05:32:54.8003515Z  # It's a regular build job, wait for the image to become available 2025-03-14T05:32:54.8003931Z  sleep 300 2025-03-14T05:32:54.8004199Z  fi 2025-03-14T05:32:54.8004438Z done 2025-03-14T05:32:54.8004673Z  2025-03-14T05:32:54.8005035Z # NB: This part requires a full checkout. Otherwise, the merge base will 2025-03-14T05:32:54.8005612Z # be empty. 
The default action would be to continue rebuild the image 2025-03-14T05:32:54.8006121Z if [[ "$BASE_REVISION" = "$(git rev-parse HEAD)" ]]; then 2025-03-14T05:32:54.8006623Z  # if we're on the base branch then use the parent commit 2025-03-14T05:32:54.8007068Z  MERGE_BASE=$(git rev-parse HEAD~) 2025-03-14T05:32:54.8007399Z else 2025-03-14T05:32:54.8007745Z  # otherwise we're on a PR, so use the most recent base commit 2025-03-14T05:32:54.8008223Z  MERGE_BASE=$(git merge-base HEAD "$BASE_REVISION") 2025-03-14T05:32:54.8008601Z fi 2025-03-14T05:32:54.8008837Z  2025-03-14T05:32:54.8009095Z if [[ -z "${MERGE_BASE}" ]]; then 2025-03-14T05:32:54.8009474Z  echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.8009826Z  2025-03-14T05:32:54.8010455Z  echo "Finding merge base only works with full checkout, please set fetch-depth to 0, continuing ..." 2025-03-14T05:32:54.8010997Z  exit 0 2025-03-14T05:32:54.8011242Z fi 2025-03-14T05:32:54.8011472Z  2025-03-14T05:32:54.8011794Z if ! git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}"; then 2025-03-14T05:32:54.8012465Z  echo "Directory '${DOCKER_BUILD_DIR}' not found in commit $MERGE_BASE, you should rebase onto a more recent commit" 2025-03-14T05:32:54.8013041Z  exit 1 2025-03-14T05:32:54.8013284Z fi 2025-03-14T05:32:54.8013513Z  2025-03-14T05:32:54.8013888Z PREVIOUS_DOCKER_TAG=$(git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}") 2025-03-14T05:32:54.8014528Z # If no image exists but the hash is the same as the previous hash then we should error out here 2025-03-14T05:32:54.8015096Z if [[ "${PREVIOUS_DOCKER_TAG}" == "${DOCKER_TAG}" ]]; then 2025-03-14T05:32:54.8015757Z  echo "WARNING: Something has gone wrong and the previous image isn't available for the merge-base of your branch" 2025-03-14T05:32:54.8016575Z  echo " Will re-build docker image to store in local cache, TTS may be longer" 2025-03-14T05:32:54.8017024Z fi 2025-03-14T05:32:54.8017256Z  2025-03-14T05:32:54.8017539Z echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-03-14T05:32:54.8025807Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:54.8026186Z env: 2025-03-14T05:32:54.8026427Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:54.8026756Z DOCKER_BUILD_DIR: .ci/docker 2025-03-14T05:32:54.8027127Z BASE_REVISION: aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:32:54.8028037Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.8028936Z DOCKER_TAG: aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:54.8029387Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:54.8029792Z DOCKER_PUSH: 2025-03-14T05:32:54.8030041Z ##[endgroup] 2025-03-14T05:32:54.8057831Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:54.8058480Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:54.8061229Z + aws ecr get-login-password --region us-east-1 2025-03-14T05:32:54.8062288Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:55.3230005Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-03-14T05:32:55.3230622Z Configure a credential helper to remove this warning. 
See 2025-03-14T05:32:55.3231313Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-03-14T05:32:55.3231863Z 2025-03-14T05:32:55.3231982Z Login Succeeded 2025-03-14T05:32:55.3257471Z ++ date +%s 2025-03-14T05:32:55.3269278Z + START_TIME=1741930375 2025-03-14T05:32:55.3281195Z ++ date +%s 2025-03-14T05:32:55.3283528Z + [[ 1741923175 -lt 1741930375 ]] 2025-03-14T05:32:55.3284449Z + docker manifest inspect 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:55.5357601Z { 2025-03-14T05:32:55.5357953Z "schemaVersion": 2, 2025-03-14T05:32:55.5358504Z "mediaType": "application/vnd.docker.distribution.manifest.v2+json", 2025-03-14T05:32:55.5358946Z "config": { 2025-03-14T05:32:55.5359330Z "mediaType": "application/vnd.docker.container.image.v1+json", 2025-03-14T05:32:55.5359791Z "size": 52935, 2025-03-14T05:32:55.5360292Z "digest": "sha256:9f77b6c3483857c0bff989bce733b5bd5d6fc70a10591ed0f8d1de80d0e77bfd" 2025-03-14T05:32:55.5360857Z }, 2025-03-14T05:32:55.5361085Z "layers": [ 2025-03-14T05:32:55.5361321Z { 2025-03-14T05:32:55.5361693Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5362177Z "size": 28583948, 2025-03-14T05:32:55.5362878Z "digest": "sha256:86e5016c269355b382c9cabab4f6646d56d75914f20d545289970436dae431b1" 2025-03-14T05:32:55.5363373Z }, 2025-03-14T05:32:55.5363602Z { 2025-03-14T05:32:55.5363952Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5364372Z "size": 7964619, 2025-03-14T05:32:55.5364786Z "digest": "sha256:49e139a3d6c2f1801aa0cea1eb34e57c5314065b679325df205026fb175383b8" 2025-03-14T05:32:55.5365245Z }, 2025-03-14T05:32:55.5365459Z { 2025-03-14T05:32:55.5365799Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5366219Z "size": 57379226, 2025-03-14T05:32:55.5366645Z "digest": "sha256:a14844c8c51f98ebee2b5eda8bff8742dd804385e1dbbeb56928946113000293" 2025-03-14T05:32:55.5367162Z }, 2025-03-14T05:32:55.5367371Z { 2025-03-14T05:32:55.5367715Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5368508Z "size": 187, 2025-03-14T05:32:55.5368937Z "digest": "sha256:18fb524087fbed3307d23afde734ef1df452f55e7a22e28b56ce1e6eb9c6b3d9" 2025-03-14T05:32:55.5369566Z }, 2025-03-14T05:32:55.5369777Z { 2025-03-14T05:32:55.5370119Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5370534Z "size": 6885, 2025-03-14T05:32:55.5371060Z "digest": "sha256:efd686a7b2c8bb052a3bb919973c4bee33c2bbcff08fd431362b90c9469fbc4d" 2025-03-14T05:32:55.5371532Z }, 2025-03-14T05:32:55.5371740Z { 2025-03-14T05:32:55.5372079Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5372501Z "size": 1438282733, 2025-03-14T05:32:55.5372923Z "digest": "sha256:52c648e213347a29001a45b2c4f834cc60621a3d6ab68fa06ebd2075f6487ec5" 2025-03-14T05:32:55.5373378Z }, 2025-03-14T05:32:55.5373591Z { 2025-03-14T05:32:55.5373931Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5374350Z "size": 62694, 2025-03-14T05:32:55.5374776Z "digest": "sha256:56e384e4e5aa7ebddf31686fc6ae62ef19fdc5311bbea8186563fc2aeca2fb52" 2025-03-14T05:32:55.5375263Z }, 2025-03-14T05:32:55.5375485Z { 2025-03-14T05:32:55.5375862Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5376284Z "size": 1684, 2025-03-14T05:32:55.5376712Z 
"digest": "sha256:fb71b792ec6c458c338cace5ce42f653e4ae4b24442a49b7c47af3a38342b5bc" 2025-03-14T05:32:55.5377184Z }, 2025-03-14T05:32:55.5377396Z { 2025-03-14T05:32:55.5377737Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5378156Z "size": 1523, 2025-03-14T05:32:55.5378571Z "digest": "sha256:5509576f2693abf94a50a70a59bbf8b519f20705f147d495ddde85981c8189fa" 2025-03-14T05:32:55.5379030Z }, 2025-03-14T05:32:55.5379240Z { 2025-03-14T05:32:55.5379582Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5380001Z "size": 2515216700, 2025-03-14T05:32:55.5380429Z "digest": "sha256:1e6c6f2d245956001435a5df27fd4defea08affa7f6c22252464f758447db401" 2025-03-14T05:32:55.5380896Z }, 2025-03-14T05:32:55.5381121Z { 2025-03-14T05:32:55.5381456Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5381880Z "size": 86619, 2025-03-14T05:32:55.5382290Z "digest": "sha256:a1fe8922734d371e0c0fd5b04183758331e08f45553720ecff4b062824e0c8e1" 2025-03-14T05:32:55.5382746Z }, 2025-03-14T05:32:55.5382956Z { 2025-03-14T05:32:55.5383294Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5383708Z "size": 1895, 2025-03-14T05:32:55.5384130Z "digest": "sha256:b6dbf6a349bb138f3c043a63744a5726a8cabbdb33a101bcb2974bd2443576b5" 2025-03-14T05:32:55.5384597Z }, 2025-03-14T05:32:55.5384812Z { 2025-03-14T05:32:55.5385151Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5385569Z "size": 245809802, 2025-03-14T05:32:55.5385996Z "digest": "sha256:73d0341633364e49fcc64aa471fe942fbe8d160db68f9a62d8efdf396ba61586" 2025-03-14T05:32:55.5386458Z }, 2025-03-14T05:32:55.5386674Z { 2025-03-14T05:32:55.5387160Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5387621Z "size": 703, 2025-03-14T05:32:55.5388047Z "digest": "sha256:f2a6fb332e8e2f85d989521b6e0ebf5cfb8fb18d0c3f53686ea7a989f3e2db28" 2025-03-14T05:32:55.5388601Z }, 2025-03-14T05:32:55.5388817Z { 2025-03-14T05:32:55.5389176Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5389602Z "size": 1255, 2025-03-14T05:32:55.5390017Z "digest": "sha256:991f90b489b149166bbc0e7ac9d8bef13697c5d4934652e4898847e9e3399277" 2025-03-14T05:32:55.5390482Z }, 2025-03-14T05:32:55.5390696Z { 2025-03-14T05:32:55.5391040Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5391460Z "size": 485, 2025-03-14T05:32:55.5391873Z "digest": "sha256:68199204396f4d8cd760d149b2bec7f6a58069074b1ea88ca6fc9bb55a6309bd" 2025-03-14T05:32:55.5392338Z }, 2025-03-14T05:32:55.5392554Z { 2025-03-14T05:32:55.5392919Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5393420Z "size": 91727383, 2025-03-14T05:32:55.5393855Z "digest": "sha256:8d533b261ba2a83360001b0d5a4fa23e9fb5ef1d45b5f570841c7e88f42fa0c4" 2025-03-14T05:32:55.5394374Z }, 2025-03-14T05:32:55.5394594Z { 2025-03-14T05:32:55.5394944Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5395367Z "size": 3720, 2025-03-14T05:32:55.5395787Z "digest": "sha256:9552b75545b1f7ea035b2c6a9bc5c9110aac689a5a82402714a1e908ab7b87fb" 2025-03-14T05:32:55.5396254Z }, 2025-03-14T05:32:55.5396468Z { 2025-03-14T05:32:55.5396813Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5397231Z "size": 1860, 2025-03-14T05:32:55.5397682Z "digest": "sha256:330a44046cfdafc7f173b0af2a21c36240c683608cba1fadae7ea4a893c50f13" 
2025-03-14T05:32:55.5398181Z }, 2025-03-14T05:32:55.5398396Z { 2025-03-14T05:32:55.5398741Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5399176Z "size": 701, 2025-03-14T05:32:55.5399605Z "digest": "sha256:fb9c93639b4043da5bee2c9144ca9cfdc4d78f4ea398226d0543eb383872ac82" 2025-03-14T05:32:55.5400076Z }, 2025-03-14T05:32:55.5400282Z { 2025-03-14T05:32:55.5400629Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5401045Z "size": 477, 2025-03-14T05:32:55.5401467Z "digest": "sha256:a5d1ad88496e5092d20986159ded209dc66ca7ecf26904a3be763c50eb5f3b37" 2025-03-14T05:32:55.5401931Z }, 2025-03-14T05:32:55.5402141Z { 2025-03-14T05:32:55.5402485Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5402902Z "size": 2923135652, 2025-03-14T05:32:55.5403348Z "digest": "sha256:4e2ec52a1b4f3b459a27ea68f5fe337f0d74845bc9553c5be5ce181dbf84498f" 2025-03-14T05:32:55.5403816Z }, 2025-03-14T05:32:55.5404031Z { 2025-03-14T05:32:55.5404379Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5404801Z "size": 380, 2025-03-14T05:32:55.5405240Z "digest": "sha256:1576583c5ff83f09a227a083342d256062f047fe6fc466f2553f08aba32e028a" 2025-03-14T05:32:55.5405707Z }, 2025-03-14T05:32:55.5405926Z { 2025-03-14T05:32:55.5406262Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5406699Z "size": 12922, 2025-03-14T05:32:55.5407118Z "digest": "sha256:ae76494b3f0faf177f07b531d05493b8f6e1ba6a9b24b84acce1289bb4251693" 2025-03-14T05:32:55.5407588Z }, 2025-03-14T05:32:55.5407802Z { 2025-03-14T05:32:55.5408145Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5408556Z "size": 863, 2025-03-14T05:32:55.5408971Z "digest": "sha256:96b47e94ca2f2cb3a745f0807c540c56a831420099db6b9f1f132b6999f8f41e" 2025-03-14T05:32:55.5409433Z }, 2025-03-14T05:32:55.5409647Z { 2025-03-14T05:32:55.5409992Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5410413Z "size": 106, 2025-03-14T05:32:55.5410932Z "digest": "sha256:e26a64ee2eebd4c0bf3f79062035d4158a72d0c7875e0e26366ef1c6b92d8f53" 2025-03-14T05:32:55.5411402Z }, 2025-03-14T05:32:55.5411610Z { 2025-03-14T05:32:55.5411953Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5412375Z "size": 503, 2025-03-14T05:32:55.5412806Z "digest": "sha256:97f84a5ed9bbc76fc24d305ea898bc6fdb0681d4bb55f51e6c0c9015f93d0374" 2025-03-14T05:32:55.5413283Z }, 2025-03-14T05:32:55.5413502Z { 2025-03-14T05:32:55.5413858Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5414278Z "size": 121477401, 2025-03-14T05:32:55.5414706Z "digest": "sha256:1cc35cfdfb65b588b5ef88828f4b4ec601c610c9624b791578d1419558757700" 2025-03-14T05:32:55.5415175Z }, 2025-03-14T05:32:55.5415389Z { 2025-03-14T05:32:55.5415744Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5416166Z "size": 109, 2025-03-14T05:32:55.5416703Z "digest": "sha256:cfe527e10a79566dea8ccbb08dea12bd3a63e6d00eaabec0bfaf591b411e621e" 2025-03-14T05:32:55.5417404Z }, 2025-03-14T05:32:55.5417672Z { 2025-03-14T05:32:55.5418098Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5418619Z "size": 490, 2025-03-14T05:32:55.5419122Z "digest": "sha256:528c12012cc2e036e2fc7b09e0dd5a1a5662efbaaff9ecf923e58df95061310c" 2025-03-14T05:32:55.5419597Z }, 2025-03-14T05:32:55.5419825Z { 2025-03-14T05:32:55.5420176Z 
"mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5420602Z "size": 295, 2025-03-14T05:32:55.5421016Z "digest": "sha256:3a0e7232565e118d798825dae4723c348544900139cc3b12c462bfd6c772c5e3" 2025-03-14T05:32:55.5421476Z }, 2025-03-14T05:32:55.5421696Z { 2025-03-14T05:32:55.5422040Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5422459Z "size": 103, 2025-03-14T05:32:55.5422906Z "digest": "sha256:ac1e95ec684d244600badaef01b48b1aec2fc11855bc018195f75b941706cf01" 2025-03-14T05:32:55.5423393Z }, 2025-03-14T05:32:55.5423604Z { 2025-03-14T05:32:55.5423953Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5424369Z "size": 1473, 2025-03-14T05:32:55.5424799Z "digest": "sha256:f46926cb3922c5be4b7a255be3cfaff257cb564f9c18145f7e085bd94d3a0b88" 2025-03-14T05:32:55.5425272Z }, 2025-03-14T05:32:55.5425494Z { 2025-03-14T05:32:55.5425844Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5426265Z "size": 427757051, 2025-03-14T05:32:55.5426786Z "digest": "sha256:dd5d197efeb1ca410b03078ebb81189d2f54158d061f744b06567c35bb3682b6" 2025-03-14T05:32:55.5427360Z }, 2025-03-14T05:32:55.5427614Z { 2025-03-14T05:32:55.5428041Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5428548Z "size": 164, 2025-03-14T05:32:55.5428967Z "digest": "sha256:6ec4020a8fae159a0e15b174c108a3581e7723a2eeeb5ec2459cb0ca071de284" 2025-03-14T05:32:55.5429433Z }, 2025-03-14T05:32:55.5429656Z { 2025-03-14T05:32:55.5430007Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5430429Z "size": 802, 2025-03-14T05:32:55.5430848Z "digest": "sha256:39e1750b6399b91d360c3e1fe08ae1084d9f42e99d531e79e32903ff6a9b9dd3" 2025-03-14T05:32:55.5431309Z }, 2025-03-14T05:32:55.5431514Z { 2025-03-14T05:32:55.5431864Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5432279Z "size": 35861010, 2025-03-14T05:32:55.5432709Z "digest": "sha256:d17cf48b138eea7b80eac09fc6eb839a1ca5c79708e6494438018dd7ee05f234" 2025-03-14T05:32:55.5433183Z }, 2025-03-14T05:32:55.5433393Z { 2025-03-14T05:32:55.5433735Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5434150Z "size": 104, 2025-03-14T05:32:55.5434620Z "digest": "sha256:86c6b7977b8bd142d8392a15232d816649002d76609a2671ce99c1483f9bed98" 2025-03-14T05:32:55.5435069Z }, 2025-03-14T05:32:55.5435279Z { 2025-03-14T05:32:55.5435713Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5436130Z "size": 425, 2025-03-14T05:32:55.5436547Z "digest": "sha256:b1829fd86eff785e1cd2a1bc5b43a14eea96b280ab21959d8a655a96551e46f2" 2025-03-14T05:32:55.5437010Z }, 2025-03-14T05:32:55.5437218Z { 2025-03-14T05:32:55.5437551Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5437974Z "size": 20262151, 2025-03-14T05:32:55.5438410Z "digest": "sha256:4c8da014a48e84abbbb0c25d4ceeb509d0d15afa37fd69cf5da6491ee935933d" 2025-03-14T05:32:55.5438884Z }, 2025-03-14T05:32:55.5439098Z { 2025-03-14T05:32:55.5439436Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5439849Z "size": 641, 2025-03-14T05:32:55.5440260Z "digest": "sha256:3e933db6894bc779ff9fe5413678715e3ce6f673c7501c6fc46811e68db0e4b1" 2025-03-14T05:32:55.5440721Z }, 2025-03-14T05:32:55.5440927Z { 2025-03-14T05:32:55.5441272Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5441801Z 
"size": 701, 2025-03-14T05:32:55.5442215Z "digest": "sha256:fb9c93639b4043da5bee2c9144ca9cfdc4d78f4ea398226d0543eb383872ac82" 2025-03-14T05:32:55.5442677Z }, 2025-03-14T05:32:55.5442883Z { 2025-03-14T05:32:55.5443224Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5443642Z "size": 143, 2025-03-14T05:32:55.5444067Z "digest": "sha256:1bef7a4bcd0a0fc516099cc94ffc09fe3464bc5dafbd7cecf1e4e128e7e7d620" 2025-03-14T05:32:55.5444547Z }, 2025-03-14T05:32:55.5444761Z { 2025-03-14T05:32:55.5445102Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5445515Z "size": 135, 2025-03-14T05:32:55.5445932Z "digest": "sha256:771ad9f2789ae02d673ac942ce9f28a1dc1d5caad67d96c6d328bf71faa65f43" 2025-03-14T05:32:55.5446405Z }, 2025-03-14T05:32:55.5446616Z { 2025-03-14T05:32:55.5446959Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5447443Z "size": 5220115140, 2025-03-14T05:32:55.5447885Z "digest": "sha256:907000cb43f1dc9f5a146723107b2a624def679ac73b1cab2aa5725ab46cf6ab" 2025-03-14T05:32:55.5448351Z }, 2025-03-14T05:32:55.5448565Z { 2025-03-14T05:32:55.5448913Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5449334Z "size": 195, 2025-03-14T05:32:55.5449752Z "digest": "sha256:540fab0ce5376c101d27d4fb240509b69d13325a4064c74d2ca83c982d9ce2d2" 2025-03-14T05:32:55.5450202Z }, 2025-03-14T05:32:55.5450413Z { 2025-03-14T05:32:55.5450754Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5451174Z "size": 565, 2025-03-14T05:32:55.5451609Z "digest": "sha256:ca5aa8a8bb00feae0eddc40332de18eeb052387042fdeed9f92a9bf94de7b492" 2025-03-14T05:32:55.5452085Z }, 2025-03-14T05:32:55.5452295Z { 2025-03-14T05:32:55.5452638Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5453053Z "size": 43163409, 2025-03-14T05:32:55.5453498Z "digest": "sha256:19197f3f8a266e9f14f9ad0bd506be2e3f72eadcb52e5acfc23d594b3ad6e7e7" 2025-03-14T05:32:55.5453969Z }, 2025-03-14T05:32:55.5454176Z { 2025-03-14T05:32:55.5454512Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5454922Z "size": 106, 2025-03-14T05:32:55.5455335Z "digest": "sha256:50b63f41fa174e3b4faec14dc8090ac0a368fd9800231525eff41448616d21ce" 2025-03-14T05:32:55.5455792Z }, 2025-03-14T05:32:55.5455994Z { 2025-03-14T05:32:55.5456337Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5456750Z "size": 1401, 2025-03-14T05:32:55.5457175Z "digest": "sha256:26c4a7b9d650a4992ce4065cf18f3ca13a0a8a5a1fc86c7cce58d1bad230a6bf" 2025-03-14T05:32:55.5457673Z }, 2025-03-14T05:32:55.5457897Z { 2025-03-14T05:32:55.5458234Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5458642Z "size": 701, 2025-03-14T05:32:55.5459149Z "digest": "sha256:fb9c93639b4043da5bee2c9144ca9cfdc4d78f4ea398226d0543eb383872ac82" 2025-03-14T05:32:55.5459624Z }, 2025-03-14T05:32:55.5459834Z { 2025-03-14T05:32:55.5460169Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5460587Z "size": 140, 2025-03-14T05:32:55.5461000Z "digest": "sha256:712e4e3f44e2c62bb49c2de099d220f2482285fbf9b8ff9887375ce35d21dd59" 2025-03-14T05:32:55.5461457Z }, 2025-03-14T05:32:55.5461666Z { 2025-03-14T05:32:55.5462006Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5462418Z "size": 120, 2025-03-14T05:32:55.5462835Z "digest": 
"sha256:a03d53d11afa8e326d5f7d12154c7e6896b667318b30e0974f98d40fc1126f7a" 2025-03-14T05:32:55.5463300Z }, 2025-03-14T05:32:55.5463509Z { 2025-03-14T05:32:55.5463854Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5464271Z "size": 6570494848, 2025-03-14T05:32:55.5464693Z "digest": "sha256:298cf77443f2d1e60b5692100904240767762489ae61dc57f65645a460b8e445" 2025-03-14T05:32:55.5465219Z }, 2025-03-14T05:32:55.5465429Z { 2025-03-14T05:32:55.5465774Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5466199Z "size": 174, 2025-03-14T05:32:55.5466628Z "digest": "sha256:6debb4801aa588df1f4980615765bbdd4ed2fae729eb7b43fc08c0f01a0c4373" 2025-03-14T05:32:55.5467147Z }, 2025-03-14T05:32:55.5467367Z { 2025-03-14T05:32:55.5467719Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5468566Z "size": 908, 2025-03-14T05:32:55.5469027Z "digest": "sha256:c127e62d25664923c9ee243cc44eb7b8a2c0c7b38f35b4686597e22dea7aa463" 2025-03-14T05:32:55.5469574Z }, 2025-03-14T05:32:55.5469791Z { 2025-03-14T05:32:55.5470155Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5470626Z "size": 701, 2025-03-14T05:32:55.5471098Z "digest": "sha256:fb9c93639b4043da5bee2c9144ca9cfdc4d78f4ea398226d0543eb383872ac82" 2025-03-14T05:32:55.5471657Z }, 2025-03-14T05:32:55.5471873Z { 2025-03-14T05:32:55.5472244Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5472718Z "size": 135, 2025-03-14T05:32:55.5473169Z "digest": "sha256:7d113dac572033116182c6d382508a8d2c4c618b000861a12be543a170b6b971" 2025-03-14T05:32:55.5473686Z }, 2025-03-14T05:32:55.5473897Z { 2025-03-14T05:32:55.5474310Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5474732Z "size": 32, 2025-03-14T05:32:55.5475155Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-03-14T05:32:55.5475621Z }, 2025-03-14T05:32:55.5475829Z { 2025-03-14T05:32:55.5476167Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5476590Z "size": 156, 2025-03-14T05:32:55.5477006Z "digest": "sha256:41636a0cf6cfcfc036f6c93f2dba4935deaa8c58ed41775265efd8845de05349" 2025-03-14T05:32:55.5477501Z }, 2025-03-14T05:32:55.5477741Z { 2025-03-14T05:32:55.5478091Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5478509Z "size": 1899, 2025-03-14T05:32:55.5478926Z "digest": "sha256:6b28eaf423002704e1555e6880afbdaef8bf2d205b1e88d4ae19a1076c3e695b" 2025-03-14T05:32:55.5479386Z }, 2025-03-14T05:32:55.5479596Z { 2025-03-14T05:32:55.5479936Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5480359Z "size": 196365711, 2025-03-14T05:32:55.5480785Z "digest": "sha256:8836075b6cd4d8b767419a7c40c32d3951e1179982c397b52498e2befb99fb3d" 2025-03-14T05:32:55.5481236Z }, 2025-03-14T05:32:55.5481451Z { 2025-03-14T05:32:55.5481791Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5482209Z "size": 163, 2025-03-14T05:32:55.5482625Z "digest": "sha256:794e80013542eeb7620e5177ada4324d00948f1bf001a3bcbd5c9a332e167747" 2025-03-14T05:32:55.5483088Z }, 2025-03-14T05:32:55.5483296Z { 2025-03-14T05:32:55.5483786Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5484217Z "size": 7943, 2025-03-14T05:32:55.5484625Z "digest": "sha256:50f65d9ffc32655e9576df0c73d072777457306525900e61715903c98210f8d7" 
2025-03-14T05:32:55.5485072Z }, 2025-03-14T05:32:55.5485289Z { 2025-03-14T05:32:55.5485630Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5486059Z "size": 8071, 2025-03-14T05:32:55.5486501Z "digest": "sha256:eebbd71eb5f766bc8ba4b9d1a76afbab56e97aca1b474a27e7763bca7b5e8e40" 2025-03-14T05:32:55.5486986Z }, 2025-03-14T05:32:55.5487197Z { 2025-03-14T05:32:55.5498529Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5498947Z "size": 302, 2025-03-14T05:32:55.5499363Z "digest": "sha256:6646f0ede462a8d23e5b8f2b84778c2ec51e3ead5d90b9370741823fccbf4192" 2025-03-14T05:32:55.5499816Z }, 2025-03-14T05:32:55.5500020Z { 2025-03-14T05:32:55.5500360Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5500929Z "size": 7630989, 2025-03-14T05:32:55.5501341Z "digest": "sha256:8552cd082ff68ef2fae8cd8af143e3df9e48d2f7a1b343bc18f035006073d6e0" 2025-03-14T05:32:55.5501805Z }, 2025-03-14T05:32:55.5502005Z { 2025-03-14T05:32:55.5502333Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5502738Z "size": 108, 2025-03-14T05:32:55.5503144Z "digest": "sha256:cfc32c95496a06d94e6b67393056faf9a957ed50b4debe47d6a91d4aae102e2a" 2025-03-14T05:32:55.5503601Z }, 2025-03-14T05:32:55.5503798Z { 2025-03-14T05:32:55.5504125Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5504525Z "size": 54145659, 2025-03-14T05:32:55.5504951Z "digest": "sha256:14d81158e9c4ef27e0dec6dc05b8cd3be7101f4be3212ddbec3c2bf397e5c1b3" 2025-03-14T05:32:55.5505406Z }, 2025-03-14T05:32:55.5505597Z { 2025-03-14T05:32:55.5505930Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5506343Z "size": 495, 2025-03-14T05:32:55.5506749Z "digest": "sha256:1f7ab46ead9545fb8e2b5cd879a5e4a177415a84139749e9af2dfa907adb42c0" 2025-03-14T05:32:55.5507199Z }, 2025-03-14T05:32:55.5507423Z { 2025-03-14T05:32:55.5507781Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5508190Z "size": 1179582145, 2025-03-14T05:32:55.5508619Z "digest": "sha256:49ef4891eff9d3fb31752716a248dae0d7e39d9fb76afe324557bcc49a0a68d8" 2025-03-14T05:32:55.5509078Z }, 2025-03-14T05:32:55.5509273Z { 2025-03-14T05:32:55.5509603Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5510007Z "size": 106, 2025-03-14T05:32:55.5510401Z "digest": "sha256:b4c737e437c344480302d7355bc9aa5ba77e267b92ee294bac9e4a28e83a9926" 2025-03-14T05:32:55.5510850Z }, 2025-03-14T05:32:55.5511043Z { 2025-03-14T05:32:55.5511369Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5511771Z "size": 613, 2025-03-14T05:32:55.5512173Z "digest": "sha256:c6d1e46b29684c2e7b8983020377de47a4cb34e3c338c2811a882c5471992f75" 2025-03-14T05:32:55.5512616Z }, 2025-03-14T05:32:55.5512816Z { 2025-03-14T05:32:55.5513144Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5513546Z "size": 317359197, 2025-03-14T05:32:55.5513955Z "digest": "sha256:c754fb44357294111b5d35978291ccb879ba71a9c2f00b19b829fe5e5c203b8a" 2025-03-14T05:32:55.5514454Z }, 2025-03-14T05:32:55.5514648Z { 2025-03-14T05:32:55.5514978Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5515380Z "size": 111, 2025-03-14T05:32:55.5515787Z "digest": "sha256:cfb5071726ef9c3c889d1aaea6a043eaff5227ef2e6c24bf2fd1961828c0cbef" 2025-03-14T05:32:55.5516242Z }, 2025-03-14T05:32:55.5516439Z { 
2025-03-14T05:32:55.5516767Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5517169Z "size": 529, 2025-03-14T05:32:55.5517656Z "digest": "sha256:8be47e434613a3148316410ad09dfc7f78b8c38817b3430a011322c61bdf4b91" 2025-03-14T05:32:55.5518153Z }, 2025-03-14T05:32:55.5518348Z { 2025-03-14T05:32:55.5518672Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5519075Z "size": 26099, 2025-03-14T05:32:55.5519484Z "digest": "sha256:a68c1967b44da96765aa1dde9b11dc1586ccf4b4282347a52739d85bdefce7b1" 2025-03-14T05:32:55.5519930Z }, 2025-03-14T05:32:55.5520127Z { 2025-03-14T05:32:55.5520454Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5520854Z "size": 106, 2025-03-14T05:32:55.5521256Z "digest": "sha256:985724505cd9e9e2e0cf8c2b38deb74cf3c395aad4e1f4c4c7e0ba8a8a072620" 2025-03-14T05:32:55.5521707Z }, 2025-03-14T05:32:55.5521901Z { 2025-03-14T05:32:55.5522234Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5522633Z "size": 32, 2025-03-14T05:32:55.5523047Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-03-14T05:32:55.5523578Z }, 2025-03-14T05:32:55.5523777Z { 2025-03-14T05:32:55.5524109Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5524515Z "size": 32, 2025-03-14T05:32:55.5524920Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-03-14T05:32:55.5525369Z }, 2025-03-14T05:32:55.5525561Z { 2025-03-14T05:32:55.5525887Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5526291Z "size": 32, 2025-03-14T05:32:55.5526688Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-03-14T05:32:55.5527187Z }, 2025-03-14T05:32:55.5527382Z { 2025-03-14T05:32:55.5527706Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-03-14T05:32:55.5528109Z "size": 32, 2025-03-14T05:32:55.5528507Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-03-14T05:32:55.5528967Z } 2025-03-14T05:32:55.5529163Z ] 2025-03-14T05:32:55.5529356Z } 2025-03-14T05:32:55.5529572Z + exit 0 2025-03-14T05:32:55.5560614Z ##[group]Run set -eux 2025-03-14T05:32:55.5560899Z set -eux 2025-03-14T05:32:55.5561737Z aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token | jq --raw-output '.SecretString' | jq -r .docker_hub_readonly_token | docker login --username pytorchbot --password-stdin 2025-03-14T05:32:55.5573042Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:55.5573420Z env: 2025-03-14T05:32:55.5573657Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:55.5573931Z ##[endgroup] 2025-03-14T05:32:55.5605300Z + aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token 2025-03-14T05:32:55.5606392Z + jq --raw-output .SecretString 2025-03-14T05:32:55.5607240Z + jq -r .docker_hub_readonly_token 2025-03-14T05:32:55.5609192Z + docker login --username pytorchbot --password-stdin 2025-03-14T05:32:56.1244365Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-03-14T05:32:56.1245207Z Configure a credential helper to remove this warning. 
See 2025-03-14T05:32:56.1245787Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-03-14T05:32:56.1246160Z 2025-03-14T05:32:56.1246308Z Login Succeeded 2025-03-14T05:32:56.1339038Z ##[group]Run tag=${ECR_DOCKER_IMAGE##*/} 2025-03-14T05:32:56.1339416Z tag=${ECR_DOCKER_IMAGE##*/} 2025-03-14T05:32:56.1339812Z echo "docker pull ghcr.io/pytorch/ci-image:${tag/:/-}" 2025-03-14T05:32:56.1348640Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:56.1349026Z env: 2025-03-14T05:32:56.1349265Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:56.1350108Z ECR_DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.1350971Z ##[endgroup] 2025-03-14T05:32:56.1381890Z docker pull ghcr.io/pytorch/ci-image:pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks-aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.1439675Z ##[group]Run pytorch/test-infra/.github/actions/pull-docker-image@main 2025-03-14T05:32:56.1440121Z with: 2025-03-14T05:32:56.1440909Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.1441860Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:56.1442249Z env: 2025-03-14T05:32:56.1442487Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:56.1442774Z ##[endgroup] 2025-03-14T05:32:56.1461198Z ##[group]Run set -x 2025-03-14T05:32:56.1461475Z set -x 2025-03-14T05:32:56.1461721Z set +e 2025-03-14T05:32:56.1461956Z  2025-03-14T05:32:56.1462183Z login() { 2025-03-14T05:32:56.1462668Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-03-14T05:32:56.1463337Z } 2025-03-14T05:32:56.1463566Z  2025-03-14T05:32:56.1463830Z retry () { 2025-03-14T05:32:56.1464122Z  $* || (sleep 1 && $*) || (sleep 2 && $*) 2025-03-14T05:32:56.1464442Z } 2025-03-14T05:32:56.1464671Z  2025-03-14T05:32:56.1464926Z retry login "${DOCKER_REGISTRY}" 2025-03-14T05:32:56.1465236Z  2025-03-14T05:32:56.1465466Z set -e 2025-03-14T05:32:56.1465819Z # ignore output since only exit code is used for conditional 2025-03-14T05:32:56.1466308Z # only pull docker image if it's not available locally 2025-03-14T05:32:56.1466847Z if ! docker inspect --type=image "${DOCKER_IMAGE}" >/dev/null 2>/dev/null; then 2025-03-14T05:32:56.1467386Z  retry docker pull "${DOCKER_IMAGE}" 2025-03-14T05:32:56.1467720Z fi 2025-03-14T05:32:56.1476149Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:32:56.1476538Z env: 2025-03-14T05:32:56.1476784Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:32:56.1477623Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.1478562Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:56.1478957Z ##[endgroup] 2025-03-14T05:32:56.1504604Z + set +e 2025-03-14T05:32:56.1504933Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:56.1505363Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:56.1508382Z + aws ecr get-login-password --region us-east-1 2025-03-14T05:32:56.1509644Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-03-14T05:32:56.6613636Z WARNING! 
Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-03-14T05:32:56.6614319Z Configure a credential helper to remove this warning. See 2025-03-14T05:32:56.6614989Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-03-14T05:32:56.6615442Z 2025-03-14T05:32:56.6615583Z Login Succeeded 2025-03-14T05:32:56.6637670Z + set -e 2025-03-14T05:32:56.6638844Z + docker inspect --type=image 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.6787291Z + retry docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.6788749Z + docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:32:56.8875253Z aa89d6e739080d90fa18625d57297c6734465849: Pulling from pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks 2025-03-14T05:32:56.8877981Z 86e5016c2693: Pulling fs layer 2025-03-14T05:32:56.8878729Z 49e139a3d6c2: Pulling fs layer 2025-03-14T05:32:56.8879406Z a14844c8c51f: Pulling fs layer 2025-03-14T05:32:56.8879790Z 18fb524087fb: Pulling fs layer 2025-03-14T05:32:56.8880100Z efd686a7b2c8: Pulling fs layer 2025-03-14T05:32:56.8880393Z 52c648e21334: Pulling fs layer 2025-03-14T05:32:56.8880705Z 56e384e4e5aa: Pulling fs layer 2025-03-14T05:32:56.8881131Z fb71b792ec6c: Pulling fs layer 2025-03-14T05:32:56.8881422Z 5509576f2693: Pulling fs layer 2025-03-14T05:32:56.8881738Z 1e6c6f2d2459: Pulling fs layer 2025-03-14T05:32:56.8882037Z a1fe8922734d: Pulling fs layer 2025-03-14T05:32:56.8882347Z b6dbf6a349bb: Pulling fs layer 2025-03-14T05:32:56.8882645Z 73d034163336: Pulling fs layer 2025-03-14T05:32:56.8882943Z f2a6fb332e8e: Pulling fs layer 2025-03-14T05:32:56.8883243Z 991f90b489b1: Pulling fs layer 2025-03-14T05:32:56.8883534Z 68199204396f: Pulling fs layer 2025-03-14T05:32:56.8883820Z 8d533b261ba2: Pulling fs layer 2025-03-14T05:32:56.8884279Z 9552b75545b1: Pulling fs layer 2025-03-14T05:32:56.8884572Z 330a44046cfd: Pulling fs layer 2025-03-14T05:32:56.8884876Z fb9c93639b40: Pulling fs layer 2025-03-14T05:32:56.8885171Z a5d1ad88496e: Pulling fs layer 2025-03-14T05:32:56.8885475Z 4e2ec52a1b4f: Pulling fs layer 2025-03-14T05:32:56.8885771Z 1576583c5ff8: Pulling fs layer 2025-03-14T05:32:56.8886065Z ae76494b3f0f: Pulling fs layer 2025-03-14T05:32:56.8886370Z 96b47e94ca2f: Pulling fs layer 2025-03-14T05:32:56.8886670Z e26a64ee2eeb: Pulling fs layer 2025-03-14T05:32:56.8886967Z 97f84a5ed9bb: Pulling fs layer 2025-03-14T05:32:56.8887250Z 52c648e21334: Waiting 2025-03-14T05:32:56.8887506Z f2a6fb332e8e: Waiting 2025-03-14T05:32:56.8887779Z 1cc35cfdfb65: Pulling fs layer 2025-03-14T05:32:56.8888078Z cfe527e10a79: Pulling fs layer 2025-03-14T05:32:56.8888373Z 528c12012cc2: Pulling fs layer 2025-03-14T05:32:56.8888648Z 991f90b489b1: Waiting 2025-03-14T05:32:56.8888907Z b6dbf6a349bb: Waiting 2025-03-14T05:32:56.8889181Z 3a0e7232565e: Pulling fs layer 2025-03-14T05:32:56.8889533Z 56e384e4e5aa: Waiting 2025-03-14T05:32:56.8889805Z ac1e95ec684d: Pulling fs layer 2025-03-14T05:32:56.8890093Z 73d034163336: Waiting 2025-03-14T05:32:56.8890345Z 18fb524087fb: Waiting 2025-03-14T05:32:56.8890607Z 1e6c6f2d2459: Waiting 2025-03-14T05:32:56.8890910Z fb71b792ec6c: Waiting 
2025-03-14T05:32:56.8891210Z 5509576f2693: Waiting 2025-03-14T05:32:56.8891511Z a1fe8922734d: Waiting 2025-03-14T05:32:56.8891824Z 68199204396f: Waiting 2025-03-14T05:32:56.8892080Z efd686a7b2c8: Waiting 2025-03-14T05:32:56.8892336Z 8d533b261ba2: Waiting 2025-03-14T05:32:56.8892590Z a5d1ad88496e: Waiting 2025-03-14T05:32:56.8892847Z 9552b75545b1: Waiting 2025-03-14T05:32:56.8893102Z 330a44046cfd: Waiting 2025-03-14T05:32:56.8893355Z 96b47e94ca2f: Waiting 2025-03-14T05:32:56.8893608Z cfe527e10a79: Waiting 2025-03-14T05:32:56.8893863Z 97f84a5ed9bb: Waiting 2025-03-14T05:32:56.8894117Z fb9c93639b40: Waiting 2025-03-14T05:32:56.8894378Z 528c12012cc2: Waiting 2025-03-14T05:32:56.8894642Z 4e2ec52a1b4f: Waiting 2025-03-14T05:32:56.8894903Z 1cc35cfdfb65: Waiting 2025-03-14T05:32:56.8895163Z e26a64ee2eeb: Waiting 2025-03-14T05:32:56.8895429Z ae76494b3f0f: Waiting 2025-03-14T05:32:56.8895699Z f46926cb3922: Pulling fs layer 2025-03-14T05:32:56.8895992Z 1576583c5ff8: Waiting 2025-03-14T05:32:56.8896252Z 3a0e7232565e: Waiting 2025-03-14T05:32:56.8896518Z ac1e95ec684d: Waiting 2025-03-14T05:32:56.8896798Z dd5d197efeb1: Pulling fs layer 2025-03-14T05:32:56.8897089Z f46926cb3922: Waiting 2025-03-14T05:32:56.8897364Z 6ec4020a8fae: Pulling fs layer 2025-03-14T05:32:56.8897657Z 39e1750b6399: Pulling fs layer 2025-03-14T05:32:56.8897956Z d17cf48b138e: Pulling fs layer 2025-03-14T05:32:56.8898335Z 86c6b7977b8b: Pulling fs layer 2025-03-14T05:32:56.8898682Z b1829fd86eff: Pulling fs layer 2025-03-14T05:32:56.8898989Z 4c8da014a48e: Pulling fs layer 2025-03-14T05:32:56.8899330Z 3e933db6894b: Pulling fs layer 2025-03-14T05:32:56.8899785Z dd5d197efeb1: Waiting 2025-03-14T05:32:56.8900078Z 1bef7a4bcd0a: Pulling fs layer 2025-03-14T05:32:56.8900387Z 771ad9f2789a: Pulling fs layer 2025-03-14T05:32:56.8900678Z 907000cb43f1: Pulling fs layer 2025-03-14T05:32:56.8901202Z 540fab0ce537: Pulling fs layer 2025-03-14T05:32:56.8901491Z 6ec4020a8fae: Waiting 2025-03-14T05:32:56.8901768Z ca5aa8a8bb00: Pulling fs layer 2025-03-14T05:32:56.8902071Z 19197f3f8a26: Pulling fs layer 2025-03-14T05:32:56.8902500Z 50b63f41fa17: Pulling fs layer 2025-03-14T05:32:56.8902896Z 39e1750b6399: Waiting 2025-03-14T05:32:56.8903245Z 3e933db6894b: Waiting 2025-03-14T05:32:56.8903522Z 26c4a7b9d650: Pulling fs layer 2025-03-14T05:32:56.8903817Z 1bef7a4bcd0a: Waiting 2025-03-14T05:32:56.8904084Z 540fab0ce537: Waiting 2025-03-14T05:32:56.8904351Z d17cf48b138e: Waiting 2025-03-14T05:32:56.8904623Z 712e4e3f44e2: Pulling fs layer 2025-03-14T05:32:56.8904917Z ca5aa8a8bb00: Waiting 2025-03-14T05:32:56.8905193Z a03d53d11afa: Pulling fs layer 2025-03-14T05:32:56.8905483Z 771ad9f2789a: Waiting 2025-03-14T05:32:56.8905748Z 19197f3f8a26: Waiting 2025-03-14T05:32:56.8906116Z 86c6b7977b8b: Waiting 2025-03-14T05:32:56.8906389Z 298cf77443f2: Pulling fs layer 2025-03-14T05:32:56.8906675Z 26c4a7b9d650: Waiting 2025-03-14T05:32:56.8906933Z 907000cb43f1: Waiting 2025-03-14T05:32:56.8907207Z 6debb4801aa5: Pulling fs layer 2025-03-14T05:32:56.8907505Z 4c8da014a48e: Waiting 2025-03-14T05:32:56.8907811Z c127e62d2566: Pulling fs layer 2025-03-14T05:32:56.8908125Z 50b63f41fa17: Waiting 2025-03-14T05:32:56.8908400Z 7d113dac5720: Pulling fs layer 2025-03-14T05:32:56.8908705Z 4f4fb700ef54: Pulling fs layer 2025-03-14T05:32:56.8909012Z 41636a0cf6cf: Pulling fs layer 2025-03-14T05:32:56.8909409Z 6b28eaf42300: Pulling fs layer 2025-03-14T05:32:56.8909701Z 712e4e3f44e2: Waiting 2025-03-14T05:32:56.8909974Z 8836075b6cd4: Pulling fs layer 2025-03-14T05:32:56.8910259Z c127e62d2566: 
Waiting 2025-03-14T05:32:56.8910532Z 794e80013542: Pulling fs layer 2025-03-14T05:32:56.8910830Z 50f65d9ffc32: Pulling fs layer 2025-03-14T05:32:56.8911121Z a03d53d11afa: Waiting 2025-03-14T05:32:56.8911389Z 6debb4801aa5: Waiting 2025-03-14T05:32:56.8911670Z eebbd71eb5f7: Pulling fs layer 2025-03-14T05:32:56.8911974Z 6646f0ede462: Pulling fs layer 2025-03-14T05:32:56.8912270Z 4f4fb700ef54: Waiting 2025-03-14T05:32:56.8912536Z 298cf77443f2: Waiting 2025-03-14T05:32:56.8912792Z 7d113dac5720: Waiting 2025-03-14T05:32:56.8913052Z 41636a0cf6cf: Waiting 2025-03-14T05:32:56.8913384Z 8552cd082ff6: Pulling fs layer 2025-03-14T05:32:56.8913691Z cfc32c95496a: Pulling fs layer 2025-03-14T05:32:56.8914031Z 8836075b6cd4: Waiting 2025-03-14T05:32:56.8914390Z 6646f0ede462: Waiting 2025-03-14T05:32:56.8914651Z 50f65d9ffc32: Waiting 2025-03-14T05:32:56.8914977Z 14d81158e9c4: Pulling fs layer 2025-03-14T05:32:56.8915264Z cfc32c95496a: Waiting 2025-03-14T05:32:56.8915681Z 1f7ab46ead95: Pulling fs layer 2025-03-14T05:32:56.8915985Z 49ef4891eff9: Pulling fs layer 2025-03-14T05:32:56.8916333Z b4c737e437c3: Pulling fs layer 2025-03-14T05:32:56.8916635Z c6d1e46b2968: Pulling fs layer 2025-03-14T05:32:56.8916943Z c754fb443572: Pulling fs layer 2025-03-14T05:32:56.8917253Z cfb5071726ef: Pulling fs layer 2025-03-14T05:32:56.8917560Z 8be47e434613: Pulling fs layer 2025-03-14T05:32:56.8917900Z a68c1967b44d: Pulling fs layer 2025-03-14T05:32:56.8918221Z 985724505cd9: Pulling fs layer 2025-03-14T05:32:56.8918514Z 985724505cd9: Waiting 2025-03-14T05:32:56.8918780Z 1f7ab46ead95: Waiting 2025-03-14T05:32:56.8919047Z 8be47e434613: Waiting 2025-03-14T05:32:56.8919321Z a68c1967b44d: Waiting 2025-03-14T05:32:56.8919582Z cfb5071726ef: Waiting 2025-03-14T05:32:56.8920011Z 49ef4891eff9: Waiting 2025-03-14T05:32:56.8920275Z c6d1e46b2968: Waiting 2025-03-14T05:32:56.8920527Z c754fb443572: Waiting 2025-03-14T05:32:57.0270707Z 49e139a3d6c2: Verifying Checksum 2025-03-14T05:32:57.0271133Z 49e139a3d6c2: Download complete 2025-03-14T05:32:57.0991006Z 18fb524087fb: Verifying Checksum 2025-03-14T05:32:57.0991353Z 18fb524087fb: Download complete 2025-03-14T05:32:57.2081214Z efd686a7b2c8: Verifying Checksum 2025-03-14T05:32:57.2081557Z efd686a7b2c8: Download complete 2025-03-14T05:32:57.2398292Z 86e5016c2693: Verifying Checksum 2025-03-14T05:32:57.2398767Z 86e5016c2693: Download complete 2025-03-14T05:32:57.3429668Z 56e384e4e5aa: Verifying Checksum 2025-03-14T05:32:57.3430149Z 56e384e4e5aa: Download complete 2025-03-14T05:32:57.4217013Z fb71b792ec6c: Verifying Checksum 2025-03-14T05:32:57.4217487Z fb71b792ec6c: Download complete 2025-03-14T05:32:57.5073376Z 5509576f2693: Verifying Checksum 2025-03-14T05:32:57.5073850Z 5509576f2693: Download complete 2025-03-14T05:32:57.5197847Z a14844c8c51f: Verifying Checksum 2025-03-14T05:32:57.5198294Z a14844c8c51f: Download complete 2025-03-14T05:32:57.6183258Z a1fe8922734d: Download complete 2025-03-14T05:32:57.7138453Z b6dbf6a349bb: Verifying Checksum 2025-03-14T05:32:57.7138961Z b6dbf6a349bb: Download complete 2025-03-14T05:32:58.3316245Z 86e5016c2693: Pull complete 2025-03-14T05:32:58.6296057Z 49e139a3d6c2: Pull complete 2025-03-14T05:32:59.4064566Z a14844c8c51f: Pull complete 2025-03-14T05:32:59.4290235Z 18fb524087fb: Pull complete 2025-03-14T05:32:59.4505066Z efd686a7b2c8: Pull complete 2025-03-14T05:33:00.2350420Z 73d034163336: Verifying Checksum 2025-03-14T05:33:00.2350820Z 73d034163336: Download complete 2025-03-14T05:33:00.3147355Z f2a6fb332e8e: Verifying Checksum 2025-03-14T05:33:00.3147958Z 
f2a6fb332e8e: Download complete 2025-03-14T05:33:00.3800952Z 991f90b489b1: Verifying Checksum 2025-03-14T05:33:00.3801454Z 991f90b489b1: Download complete 2025-03-14T05:33:00.4577375Z 68199204396f: Verifying Checksum 2025-03-14T05:33:00.4577763Z 68199204396f: Download complete 2025-03-14T05:33:01.4392043Z 8d533b261ba2: Verifying Checksum 2025-03-14T05:33:01.4392527Z 8d533b261ba2: Download complete 2025-03-14T05:33:01.5387746Z 9552b75545b1: Verifying Checksum 2025-03-14T05:33:01.5388255Z 9552b75545b1: Download complete 2025-03-14T05:33:01.6364054Z 330a44046cfd: Verifying Checksum 2025-03-14T05:33:01.6364428Z 330a44046cfd: Download complete 2025-03-14T05:33:01.7078820Z fb9c93639b40: Verifying Checksum 2025-03-14T05:33:01.7079496Z fb9c93639b40: Download complete 2025-03-14T05:33:01.7718936Z a5d1ad88496e: Verifying Checksum 2025-03-14T05:33:01.7719381Z a5d1ad88496e: Download complete 2025-03-14T05:33:11.6603899Z 52c648e21334: Verifying Checksum 2025-03-14T05:33:11.6604281Z 52c648e21334: Download complete 2025-03-14T05:33:11.7531754Z 1576583c5ff8: Verifying Checksum 2025-03-14T05:33:11.7532148Z 1576583c5ff8: Download complete 2025-03-14T05:33:11.8217664Z ae76494b3f0f: Verifying Checksum 2025-03-14T05:33:11.8218194Z ae76494b3f0f: Download complete 2025-03-14T05:33:11.8987677Z 96b47e94ca2f: Verifying Checksum 2025-03-14T05:33:11.8988090Z 96b47e94ca2f: Download complete 2025-03-14T05:33:12.0527568Z e26a64ee2eeb: Verifying Checksum 2025-03-14T05:33:12.0528061Z e26a64ee2eeb: Download complete 2025-03-14T05:33:12.1245871Z 97f84a5ed9bb: Verifying Checksum 2025-03-14T05:33:12.1246348Z 97f84a5ed9bb: Download complete 2025-03-14T05:33:13.3859493Z 1cc35cfdfb65: Verifying Checksum 2025-03-14T05:33:13.3859873Z 1cc35cfdfb65: Download complete 2025-03-14T05:33:13.4584354Z cfe527e10a79: Verifying Checksum 2025-03-14T05:33:13.4584780Z cfe527e10a79: Download complete 2025-03-14T05:33:13.5442734Z 528c12012cc2: Download complete 2025-03-14T05:33:13.6321014Z 3a0e7232565e: Verifying Checksum 2025-03-14T05:33:13.6321477Z 3a0e7232565e: Download complete 2025-03-14T05:33:13.7100224Z ac1e95ec684d: Verifying Checksum 2025-03-14T05:33:13.7100705Z ac1e95ec684d: Download complete 2025-03-14T05:33:13.7854530Z f46926cb3922: Download complete 2025-03-14T05:33:18.1120109Z dd5d197efeb1: Verifying Checksum 2025-03-14T05:33:18.2086753Z dd5d197efeb1: Download complete 2025-03-14T05:33:18.2087258Z 6ec4020a8fae: Download complete 2025-03-14T05:33:18.2979079Z 39e1750b6399: Verifying Checksum 2025-03-14T05:33:18.2979461Z 39e1750b6399: Download complete 2025-03-14T05:33:18.7152952Z d17cf48b138e: Verifying Checksum 2025-03-14T05:33:18.7153446Z d17cf48b138e: Download complete 2025-03-14T05:33:18.7900027Z 86c6b7977b8b: Download complete 2025-03-14T05:33:18.8570418Z b1829fd86eff: Verifying Checksum 2025-03-14T05:33:18.8570942Z b1829fd86eff: Download complete 2025-03-14T05:33:19.1125825Z 4c8da014a48e: Verifying Checksum 2025-03-14T05:33:19.1126659Z 4c8da014a48e: Download complete 2025-03-14T05:33:19.1943126Z 3e933db6894b: Verifying Checksum 2025-03-14T05:33:19.1943614Z 3e933db6894b: Download complete 2025-03-14T05:33:19.2694665Z 1bef7a4bcd0a: Verifying Checksum 2025-03-14T05:33:19.2695147Z 1bef7a4bcd0a: Download complete 2025-03-14T05:33:19.3470861Z 771ad9f2789a: Download complete 2025-03-14T05:33:22.7258927Z 1e6c6f2d2459: Verifying Checksum 2025-03-14T05:33:22.7259362Z 1e6c6f2d2459: Download complete 2025-03-14T05:33:22.8041522Z 540fab0ce537: Verifying Checksum 2025-03-14T05:33:22.8041968Z 540fab0ce537: Download complete 
2025-03-14T05:33:22.8949588Z ca5aa8a8bb00: Download complete 2025-03-14T05:33:23.3761401Z 19197f3f8a26: Verifying Checksum 2025-03-14T05:33:23.3761876Z 19197f3f8a26: Download complete 2025-03-14T05:33:23.4731532Z 50b63f41fa17: Verifying Checksum 2025-03-14T05:33:23.4731987Z 50b63f41fa17: Download complete 2025-03-14T05:33:23.5528088Z 26c4a7b9d650: Verifying Checksum 2025-03-14T05:33:23.5528594Z 26c4a7b9d650: Download complete 2025-03-14T05:33:23.6327870Z 712e4e3f44e2: Verifying Checksum 2025-03-14T05:33:23.6328248Z 712e4e3f44e2: Download complete 2025-03-14T05:33:23.7152211Z a03d53d11afa: Download complete 2025-03-14T05:33:24.4143463Z 52c648e21334: Pull complete 2025-03-14T05:33:24.5480709Z 56e384e4e5aa: Pull complete 2025-03-14T05:33:24.7351451Z fb71b792ec6c: Pull complete 2025-03-14T05:33:24.9615483Z 5509576f2693: Pull complete 2025-03-14T05:33:31.0557846Z 4e2ec52a1b4f: Verifying Checksum 2025-03-14T05:33:31.0558362Z 4e2ec52a1b4f: Download complete 2025-03-14T05:33:31.1228097Z 6debb4801aa5: Download complete 2025-03-14T05:33:31.2115011Z c127e62d2566: Verifying Checksum 2025-03-14T05:33:31.2115529Z c127e62d2566: Download complete 2025-03-14T05:33:31.2767120Z 7d113dac5720: Verifying Checksum 2025-03-14T05:33:31.2767647Z 7d113dac5720: Download complete 2025-03-14T05:33:31.2828463Z 4f4fb700ef54: Verifying Checksum 2025-03-14T05:33:31.2828919Z 4f4fb700ef54: Download complete 2025-03-14T05:33:31.3735296Z 41636a0cf6cf: Verifying Checksum 2025-03-14T05:33:31.3735856Z 41636a0cf6cf: Download complete 2025-03-14T05:33:31.4548151Z 6b28eaf42300: Verifying Checksum 2025-03-14T05:33:31.4548566Z 6b28eaf42300: Download complete 2025-03-14T05:33:33.4631642Z 8836075b6cd4: Verifying Checksum 2025-03-14T05:33:33.4632138Z 8836075b6cd4: Download complete 2025-03-14T05:33:33.5345225Z 794e80013542: Verifying Checksum 2025-03-14T05:33:33.5345705Z 794e80013542: Download complete 2025-03-14T05:33:33.6031488Z 50f65d9ffc32: Verifying Checksum 2025-03-14T05:33:33.6031859Z 50f65d9ffc32: Download complete 2025-03-14T05:33:33.6817799Z eebbd71eb5f7: Verifying Checksum 2025-03-14T05:33:33.6818243Z eebbd71eb5f7: Download complete 2025-03-14T05:33:33.7669186Z 6646f0ede462: Verifying Checksum 2025-03-14T05:33:33.7671752Z 6646f0ede462: Download complete 2025-03-14T05:33:33.8828562Z 8552cd082ff6: Verifying Checksum 2025-03-14T05:33:33.8829160Z 8552cd082ff6: Download complete 2025-03-14T05:33:33.9730874Z cfc32c95496a: Download complete 2025-03-14T05:33:34.5857011Z 14d81158e9c4: Verifying Checksum 2025-03-14T05:33:34.5857528Z 14d81158e9c4: Download complete 2025-03-14T05:33:34.6695849Z 1f7ab46ead95: Verifying Checksum 2025-03-14T05:33:34.6696208Z 1f7ab46ead95: Download complete 2025-03-14T05:34:01.7324634Z 49ef4891eff9: Verifying Checksum 2025-03-14T05:34:01.7325130Z 49ef4891eff9: Download complete 2025-03-14T05:34:01.8071946Z b4c737e437c3: Verifying Checksum 2025-03-14T05:34:01.8072307Z b4c737e437c3: Download complete 2025-03-14T05:34:01.8859967Z c6d1e46b2968: Download complete 2025-03-14T05:34:10.3293641Z c754fb443572: Verifying Checksum 2025-03-14T05:34:10.3294010Z c754fb443572: Download complete 2025-03-14T05:34:10.4094442Z cfb5071726ef: Verifying Checksum 2025-03-14T05:34:10.4094921Z cfb5071726ef: Download complete 2025-03-14T05:34:10.4791484Z 8be47e434613: Verifying Checksum 2025-03-14T05:34:10.4791822Z 8be47e434613: Download complete 2025-03-14T05:34:10.5744859Z a68c1967b44d: Verifying Checksum 2025-03-14T05:34:10.6410180Z 985724505cd9: Verifying Checksum 2025-03-14T05:34:10.6410804Z 985724505cd9: Download complete 
2025-03-14T05:34:49.4975307Z 907000cb43f1: Verifying Checksum 2025-03-14T05:34:49.4975682Z 907000cb43f1: Download complete 2025-03-14T05:35:05.1426052Z 298cf77443f2: Verifying Checksum 2025-03-14T05:35:05.1426518Z 298cf77443f2: Download complete 2025-03-14T05:35:17.5463364Z 1e6c6f2d2459: Pull complete 2025-03-14T05:35:17.7790656Z a1fe8922734d: Pull complete 2025-03-14T05:35:18.0026026Z b6dbf6a349bb: Pull complete 2025-03-14T05:35:25.7615196Z 73d034163336: Pull complete 2025-03-14T05:35:25.8563544Z f2a6fb332e8e: Pull complete 2025-03-14T05:35:25.9661093Z 991f90b489b1: Pull complete 2025-03-14T05:35:26.1559519Z 68199204396f: Pull complete 2025-03-14T05:35:28.4511257Z 8d533b261ba2: Pull complete 2025-03-14T05:35:28.6484931Z 9552b75545b1: Pull complete 2025-03-14T05:35:28.8486083Z 330a44046cfd: Pull complete 2025-03-14T05:35:29.0679160Z fb9c93639b40: Pull complete 2025-03-14T05:35:29.2872550Z a5d1ad88496e: Pull complete 2025-03-14T05:36:21.1889605Z 4e2ec52a1b4f: Pull complete 2025-03-14T05:36:21.2577612Z 1576583c5ff8: Pull complete 2025-03-14T05:36:21.4574937Z ae76494b3f0f: Pull complete 2025-03-14T05:36:21.5559543Z 96b47e94ca2f: Pull complete 2025-03-14T05:36:21.6846403Z e26a64ee2eeb: Pull complete 2025-03-14T05:36:21.8423323Z 97f84a5ed9bb: Pull complete 2025-03-14T05:36:24.3179137Z 1cc35cfdfb65: Pull complete 2025-03-14T05:36:24.5124459Z cfe527e10a79: Pull complete 2025-03-14T05:36:24.6385539Z 528c12012cc2: Pull complete 2025-03-14T05:36:24.7929675Z 3a0e7232565e: Pull complete 2025-03-14T05:36:24.9460960Z ac1e95ec684d: Pull complete 2025-03-14T05:36:25.1680957Z f46926cb3922: Pull complete 2025-03-14T05:36:33.5929178Z dd5d197efeb1: Pull complete 2025-03-14T05:36:33.7722486Z 6ec4020a8fae: Pull complete 2025-03-14T05:36:33.9641538Z 39e1750b6399: Pull complete 2025-03-14T05:36:34.7575090Z d17cf48b138e: Pull complete 2025-03-14T05:36:34.9645622Z 86c6b7977b8b: Pull complete 2025-03-14T05:36:35.1782185Z b1829fd86eff: Pull complete 2025-03-14T05:36:35.5890138Z 4c8da014a48e: Pull complete 2025-03-14T05:36:35.7928684Z 3e933db6894b: Pull complete 2025-03-14T05:36:36.1878337Z 1bef7a4bcd0a: Pull complete 2025-03-14T05:36:36.4087201Z 771ad9f2789a: Pull complete 2025-03-14T05:37:16.1458608Z 907000cb43f1: Pull complete 2025-03-14T05:37:16.3789020Z 540fab0ce537: Pull complete 2025-03-14T05:37:16.6100356Z ca5aa8a8bb00: Pull complete 2025-03-14T05:37:19.1994411Z 19197f3f8a26: Pull complete 2025-03-14T05:37:19.4272332Z 50b63f41fa17: Pull complete 2025-03-14T05:37:19.6484393Z 26c4a7b9d650: Pull complete 2025-03-14T05:37:20.0891964Z 712e4e3f44e2: Pull complete 2025-03-14T05:37:20.3160128Z a03d53d11afa: Pull complete 2025-03-14T05:39:44.1662376Z 298cf77443f2: Pull complete 2025-03-14T05:39:44.4073038Z 6debb4801aa5: Pull complete 2025-03-14T05:39:44.6443794Z c127e62d2566: Pull complete 2025-03-14T05:39:45.0171200Z 7d113dac5720: Pull complete 2025-03-14T05:39:45.2278455Z 4f4fb700ef54: Pull complete 2025-03-14T05:39:45.4528606Z 41636a0cf6cf: Pull complete 2025-03-14T05:39:45.6898825Z 6b28eaf42300: Pull complete 2025-03-14T05:39:52.6589709Z 8836075b6cd4: Pull complete 2025-03-14T05:39:52.8919954Z 794e80013542: Pull complete 2025-03-14T05:39:53.1351763Z 50f65d9ffc32: Pull complete 2025-03-14T05:39:53.3627562Z eebbd71eb5f7: Pull complete 2025-03-14T05:39:53.6020312Z 6646f0ede462: Pull complete 2025-03-14T05:39:54.9982081Z 8552cd082ff6: Pull complete 2025-03-14T05:39:55.2273584Z cfc32c95496a: Pull complete 2025-03-14T05:39:56.9355635Z 14d81158e9c4: Pull complete 2025-03-14T05:39:57.1597107Z 1f7ab46ead95: Pull complete 
2025-03-14T05:40:08.1145368Z 49ef4891eff9: Pull complete 2025-03-14T05:40:08.3149954Z b4c737e437c3: Pull complete 2025-03-14T05:40:08.5353825Z c6d1e46b2968: Pull complete 2025-03-14T05:40:11.6504216Z c754fb443572: Pull complete 2025-03-14T05:40:11.8856313Z cfb5071726ef: Pull complete 2025-03-14T05:40:12.1062220Z 8be47e434613: Pull complete 2025-03-14T05:40:12.3417886Z a68c1967b44d: Pull complete 2025-03-14T05:40:12.5779794Z 985724505cd9: Pull complete 2025-03-14T05:40:13.4927438Z Digest: sha256:20eb41577713f879cca4c6a57bd64d737c482cad39fe2b18409f513444e2b522 2025-03-14T05:40:13.5239234Z Status: Downloaded newer image for 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:40:13.5424665Z 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:40:13.5482770Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-03-14T05:40:13.5483829Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-03-14T05:40:13.5496055Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:40:13.5496669Z env: 2025-03-14T05:40:13.5496907Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:40:13.5497180Z ##[endgroup] 2025-03-14T05:40:13.5705824Z ##[group]Run pytorch/test-infra/.github/actions/setup-nvidia@main 2025-03-14T05:40:13.5706310Z with: 2025-03-14T05:40:13.5706555Z driver-version: 550.54.15 2025-03-14T05:40:13.5706847Z env: 2025-03-14T05:40:13.5707105Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:40:13.5707378Z ##[endgroup] 2025-03-14T05:40:13.5834080Z ##[group]Run nick-fields/retry@3e91a01664abd3c5cd539100d10d33b9c5b68482 2025-03-14T05:40:13.5834596Z with: 2025-03-14T05:40:13.5834877Z timeout_minutes: 10 2025-03-14T05:40:13.5835249Z max_attempts: 3 2025-03-14T05:40:13.5858423Z command: # Is it disgusting to have a full shell script here in this github action? Sure # But is it the best way to make it so that this action relies on nothing else? Absolutely set -eou pipefail DISTRIBUTION=$(. /etc/os-release;echo $ID$VERSION_ID) DRIVER_FN="NVIDIA-Linux-x86_64-${DRIVER_VERSION}.run" install_nvidia_docker2_amzn2() { ( set -x # Needed for yum-config-manager sudo yum install -y yum-utils if [[ "${DISTRIBUTION}" == "amzn2023" ]] ; then YUM_REPO_URL="https://nvidia.github.io/libnvidia-container/stable/rpm/nvidia-container-toolkit.repo" else # Amazon Linux 2 YUM_REPO_URL="https://nvidia.github.io/nvidia-docker/${DISTRIBUTION}/nvidia-docker.repo" fi sudo yum-config-manager --add-repo "${YUM_REPO_URL}" sudo yum install -y nvidia-docker2 nvidia-container-toolkit-1.16.2 sudo systemctl restart docker ) } install_nvidia_docker2_ubuntu20() { ( set -x # Install nvidia-driver package if not installed status="$(dpkg-query -W --showformat='${db:Status-Status}' nvidia-docker2 2>&1)" if [ ! $? = 0 ] || [ ! 
"$status" = installed ]; then sudo apt-get install -y nvidia-docker2 nvidia-container-toolkit-1.16.2 sudo systemctl restart docker fi ) } pre_install_nvidia_driver_amzn2() { ( # Purge any nvidia driver installed from RHEL repo sudo yum remove -y nvidia-driver-latest-dkms ) } install_nvidia_driver_common() { ( # Try to gather more information about the runner and its existing NVIDIA driver if any echo "Before installing NVIDIA driver" lspci lsmod modinfo nvidia || true HAS_NVIDIA_DRIVER=0 # Check if NVIDIA driver has already been installed if [ -x "$(command -v nvidia-smi)" ]; then set +e # The driver exists, check its version next. Also check only the first GPU if there are more than one of them # so that the same driver version is not print over multiple lines INSTALLED_DRIVER_VERSION=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader --id=0) NVIDIA_SMI_STATUS=$? if [ "$NVIDIA_SMI_STATUS" -ne 0 ] && [ "$NVIDIA_SMI_STATUS" -ne 14 ]; then echo "Failed to get NVIDIA driver version ($INSTALLED_DRIVER_VERSION). Continuing" elif [ "$INSTALLED_DRIVER_VERSION" != "$DRIVER_VERSION" ]; then echo "NVIDIA driver ($INSTALLED_DRIVER_VERSION) has been installed, but we expect to have $DRIVER_VERSION instead. Continuing" else HAS_NVIDIA_DRIVER=1 echo "NVIDIA driver ($INSTALLED_DRIVER_VERSION) has already been installed. Skipping NVIDIA driver installation" fi set -e fi if [ "$HAS_NVIDIA_DRIVER" -eq 0 ]; then # CAUTION: this may need to be updated in future if [ "${DISTRIBUTION}" != ubuntu20.04 ]; then sudo yum groupinstall -y "Development Tools" # ensure our kernel install is the same as our underlying kernel, # groupinstall "Development Tools" has a habit of mismatching kernel headers sudo yum install -y "kernel-devel-uname-r == $(uname -r)" sudo modprobe backlight fi sudo curl -fsL -o /tmp/nvidia_driver "https://s3.amazonaws.com/ossci-linux/nvidia_driver/$DRIVER_FN" set +e sudo /bin/bash /tmp/nvidia_driver -s --no-drm NVIDIA_INSTALLATION_STATUS=$? RESET_GPU=0 if [ "$NVIDIA_INSTALLATION_STATUS" -ne 0 ]; then sudo cat /var/log/nvidia-installer.log # Fail to install NVIDIA driver, try to reset the GPU RESET_GPU=1 elif [ -x "$(command -v nvidia-smi)" ]; then # Check again if nvidia-smi works even if the driver installation completes successfully INSTALLED_DRIVER_VERSION=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader --id=0) NVIDIA_SMI_STATUS=$? if [ "$NVIDIA_SMI_STATUS" -ne 0 ] && [ "$NVIDIA_SMI_STATUS" -ne 14 ]; then RESET_GPU=1 fi fi if [ "$RESET_GPU" -eq 1 ]; then NVIDIA_DEVICES=$(lspci -D | grep -i NVIDIA | cut -d' ' -f1) # The GPU can get stuck in a failure state if somehow the test crashs the GPU microcode. When this # happens, we'll try to reset all NVIDIA devices https://github.com/pytorch/pytorch/issues/88388 for PCI_ID in $NVIDIA_DEVICES; do DEVICE_ENABLED=$(cat /sys/bus/pci/devices/$PCI_ID/enable) echo "Reseting $PCI_ID (enabled state: $DEVICE_ENABLED)" # This requires sudo permission of course echo "1" | sudo tee /sys/bus/pci/devices/$PCI_ID/reset sleep 1 done fi sudo rm -fv /tmp/nvidia_driver set -e fi ) } post_install_nvidia_driver_common() { ( sudo modprobe nvidia || true echo "After installing NVIDIA driver" lspci lsmod modinfo nvidia || true ( set +e nvidia-smi # NB: Annoyingly, nvidia-smi command returns successfully with return code 0 even in # the case where the driver has already crashed as it still can get the driver version # and some basic information like the bus ID. 
However, the rest of the information # would be missing (ERR!), for example: # # +-----------------------------------------------------------------------------+ # | NVIDIA-SMI 525.89.02 Driver Version: 525.89.02 CUDA Version: 12.0 | # |-------------------------------+----------------------+----------------------+ # | GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC | # | Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. | # | | | MIG M. | # |===============================+======================+======================| # | 0 ERR! Off | 00000000:00:1E.0 Off | ERR! | # |ERR! ERR! ERR! ERR! / ERR! | 4184MiB / 23028MiB | ERR! Default | # | | | ERR! | # +-------------------------------+----------------------+----------------------+ # # +-----------------------------------------------------------------------------+ # | Processes: | # | GPU GI CI PID Type Process name GPU Memory | # | ID ID Usage | # |=============================================================================| # +-----------------------------------------------------------------------------+ # # This should be reported as a failure instead as it will guarantee to fail when # Docker tries to run with --gpus all # # So, the correct check here is to query one of the missing piece of info like # GPU name, so that the command can fail accordingly nvidia-smi --query-gpu=gpu_name --format=csv,noheader --id=0 NVIDIA_SMI_STATUS=$? # Allowable exit statuses for nvidia-smi, see: https://github.com/NVIDIA/gpu-operator/issues/285 if [ "$NVIDIA_SMI_STATUS" -eq 0 ] || [ "$NVIDIA_SMI_STATUS" -eq 14 ]; then echo "INFO: Ignoring allowed status ${NVIDIA_SMI_STATUS}" else echo "ERROR: nvidia-smi exited with unresolved status ${NVIDIA_SMI_STATUS}" exit ${NVIDIA_SMI_STATUS} fi set -e ) ) } install_nvidia_driver_amzn2() { ( set -x pre_install_nvidia_driver_amzn2 install_nvidia_driver_common post_install_nvidia_driver_common ) } install_nvidia_driver_ubuntu20() { ( set -x install_nvidia_driver_common post_install_nvidia_driver_common ) } echo "== Installing nvidia driver ${DRIVER_FN} ==" case "${DISTRIBUTION}" in amzn*) install_nvidia_driver_amzn2 ;; ubuntu20.04) install_nvidia_driver_ubuntu20 ;; *) echo "ERROR: Unknown distribution ${DISTRIBUTION}" exit 1 ;; esac # Install container toolkit based on distribution echo "== Installing nvidia container toolkit for ${DISTRIBUTION} ==" case "${DISTRIBUTION}" in amzn*) install_nvidia_docker2_amzn2 ;; ubuntu20.04) install_nvidia_docker2_ubuntu20 ;; *) echo "ERROR: Unknown distribution ${DISTRIBUTION}" exit 1 ;; esac echo "GPU_FLAG=--gpus all -e NVIDIA_DRIVER_CAPABILITIES=all" >> "${GITHUB_ENV}" # Fix https://github.com/NVIDIA/nvidia-docker/issues/1648 on runners with # more than one GPUs. This just needs to be run once. 
The command fails # on subsequent runs and complains that the mode is already on, but that's # ok sudo nvidia-persistenced || true # This should show persistence mode ON nvidia-smi 2025-03-14T05:40:13.5882051Z retry_wait_seconds: 10 2025-03-14T05:40:13.5882355Z polling_interval_seconds: 1 2025-03-14T05:40:13.5882659Z warning_on_retry: true 2025-03-14T05:40:13.5882945Z continue_on_error: false 2025-03-14T05:40:13.5883217Z env: 2025-03-14T05:40:13.5883453Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:40:13.5883742Z DRIVER_VERSION: 550.54.15 2025-03-14T05:40:13.5884028Z ##[endgroup] 2025-03-14T05:40:13.6736373Z == Installing nvidia driver NVIDIA-Linux-x86_64-550.54.15.run == 2025-03-14T05:40:13.6738338Z + pre_install_nvidia_driver_amzn2 2025-03-14T05:40:13.6738787Z + sudo yum remove -y nvidia-driver-latest-dkms 2025-03-14T05:40:14.3581725Z No match for argument: nvidia-driver-latest-dkms 2025-03-14T05:40:14.3582240Z No packages marked for removal. 2025-03-14T05:40:14.3643851Z Dependencies resolved. 2025-03-14T05:40:14.3653849Z Nothing to do. 2025-03-14T05:40:14.3654558Z Complete! 2025-03-14T05:40:14.4396388Z + install_nvidia_driver_common 2025-03-14T05:40:14.4399340Z + echo 'Before installing NVIDIA driver' 2025-03-14T05:40:14.4399670Z + lspci 2025-03-14T05:40:14.4401328Z Before installing NVIDIA driver 2025-03-14T05:40:14.5317856Z 00:00.0 Host bridge: Intel Corporation 440FX - 82441FX PMC [Natoma] 2025-03-14T05:40:14.5318594Z 00:01.0 ISA bridge: Intel Corporation 82371SB PIIX3 ISA [Natoma/Triton II] 2025-03-14T05:40:14.5319256Z 00:01.3 Non-VGA unclassified device: Intel Corporation 82371AB/EB/MB PIIX4 ACPI (rev 08) 2025-03-14T05:40:14.5319806Z 00:03.0 VGA compatible controller: Amazon.com, Inc. Device 1111 2025-03-14T05:40:14.5320318Z 00:04.0 Non-Volatile memory controller: Amazon.com, Inc. NVMe EBS Controller 2025-03-14T05:40:14.5320883Z 00:05.0 Ethernet controller: Amazon.com, Inc. Elastic Network Adapter (ENA) 2025-03-14T05:40:14.5321404Z 00:1e.0 3D controller: NVIDIA Corporation GA102GL [A10G] (rev a1) 2025-03-14T05:40:14.5323161Z 00:1f.0 Non-Volatile memory controller: Amazon.com, Inc. 
NVMe SSD Controller 2025-03-14T05:40:14.5323756Z + lsmod 2025-03-14T05:40:14.5366161Z Module Size Used by 2025-03-14T05:40:14.5366507Z xt_conntrack 16384 1 2025-03-14T05:40:14.5366836Z nft_chain_nat 16384 3 2025-03-14T05:40:14.5367148Z xt_MASQUERADE 20480 1 2025-03-14T05:40:14.5367482Z nf_nat 57344 2 nft_chain_nat,xt_MASQUERADE 2025-03-14T05:40:14.5368128Z nf_conntrack_netlink 57344 0 2025-03-14T05:40:14.5368567Z nf_conntrack 184320 4 xt_conntrack,nf_nat,nf_conntrack_netlink,xt_MASQUERADE 2025-03-14T05:40:14.5369039Z nf_defrag_ipv6 24576 1 nf_conntrack 2025-03-14T05:40:14.5369385Z nf_defrag_ipv4 16384 1 nf_conntrack 2025-03-14T05:40:14.5369711Z xfrm_user 57344 1 2025-03-14T05:40:14.5370010Z xfrm_algo 16384 1 xfrm_user 2025-03-14T05:40:14.5370329Z xt_addrtype 16384 2 2025-03-14T05:40:14.5370655Z nft_compat 20480 4 2025-03-14T05:40:14.5370998Z nf_tables 311296 57 nft_compat,nft_chain_nat 2025-03-14T05:40:14.5371458Z nfnetlink 20480 4 nft_compat,nf_conntrack_netlink,nf_tables 2025-03-14T05:40:14.5371864Z br_netfilter 36864 0 2025-03-14T05:40:14.5372173Z bridge 323584 1 br_netfilter 2025-03-14T05:40:14.5372505Z stp 16384 1 bridge 2025-03-14T05:40:14.5372819Z llc 16384 2 bridge,stp 2025-03-14T05:40:14.5373137Z overlay 167936 0 2025-03-14T05:40:14.5373425Z tls 135168 0 2025-03-14T05:40:14.5373707Z nls_ascii 16384 1 2025-03-14T05:40:14.5373993Z nls_cp437 20480 1 2025-03-14T05:40:14.5374272Z vfat 24576 1 2025-03-14T05:40:14.5374554Z fat 86016 1 vfat 2025-03-14T05:40:14.5374843Z sunrpc 696320 1 2025-03-14T05:40:14.5375126Z ena 180224 0 2025-03-14T05:40:14.5375411Z i8042 45056 0 2025-03-14T05:40:14.5375698Z serio 28672 3 i8042 2025-03-14T05:40:14.5376012Z ghash_clmulni_intel 16384 0 2025-03-14T05:40:14.5376305Z button 24576 0 2025-03-14T05:40:14.5376591Z sch_fq_codel 20480 17 2025-03-14T05:40:14.5376881Z dm_mod 188416 0 2025-03-14T05:40:14.5377164Z configfs 57344 1 2025-03-14T05:40:14.5377480Z fuse 163840 1 2025-03-14T05:40:14.5377770Z loop 36864 0 2025-03-14T05:40:14.5378053Z dax 45056 1 dm_mod 2025-03-14T05:40:14.5378355Z dmi_sysfs 20480 0 2025-03-14T05:40:14.5378643Z crc32_pclmul 16384 0 2025-03-14T05:40:14.5378929Z crc32c_intel 24576 0 2025-03-14T05:40:14.5379214Z efivarfs 24576 1 2025-03-14T05:40:14.5379492Z + modinfo nvidia 2025-03-14T05:40:14.5389263Z filename: /lib/modules/6.1.129-138.220.amzn2023.x86_64/kernel/drivers/video/nvidia.ko 2025-03-14T05:40:14.5389791Z alias: char-major-195-* 2025-03-14T05:40:14.5390094Z version: 550.54.15 2025-03-14T05:40:14.5390372Z supported: external 2025-03-14T05:40:14.5390651Z license: NVIDIA 2025-03-14T05:40:14.5390950Z firmware: nvidia/550.54.15/gsp_tu10x.bin 2025-03-14T05:40:14.5391326Z firmware: nvidia/550.54.15/gsp_ga10x.bin 2025-03-14T05:40:14.5391673Z srcversion: 833721318DA517F0C2FEC97 2025-03-14T05:40:14.5392021Z alias: pci:v000010DEd*sv*sd*bc06sc80i00* 2025-03-14T05:40:14.5392395Z alias: pci:v000010DEd*sv*sd*bc03sc02i00* 2025-03-14T05:40:14.5392760Z alias: pci:v000010DEd*sv*sd*bc03sc00i00* 2025-03-14T05:40:14.5393106Z depends: i2c-core,drm 2025-03-14T05:40:14.5393402Z retpoline: Y 2025-03-14T05:40:14.5393659Z name: nvidia 2025-03-14T05:40:14.5394051Z vermagic: 6.1.129-138.220.amzn2023.x86_64 SMP preempt mod_unload modversions 2025-03-14T05:40:14.5394654Z parm: NvSwitchRegDwords:NvSwitch regkey (charp) 2025-03-14T05:40:14.5395320Z parm: NvSwitchBlacklist:NvSwitchBlacklist=uuid[,uuid...] 
(charp) 2025-03-14T05:40:14.5395890Z parm: NVreg_ResmanDebugLevel:int 2025-03-14T05:40:14.5396234Z parm: NVreg_RmLogonRC:int 2025-03-14T05:40:14.5396569Z parm: NVreg_ModifyDeviceFiles:int 2025-03-14T05:40:14.5396969Z parm: NVreg_DeviceFileUID:int 2025-03-14T05:40:14.5397296Z parm: NVreg_DeviceFileGID:int 2025-03-14T05:40:14.5397630Z parm: NVreg_DeviceFileMode:int 2025-03-14T05:40:14.5398034Z parm: NVreg_InitializeSystemMemoryAllocations:int 2025-03-14T05:40:14.5398452Z parm: NVreg_UsePageAttributeTable:int 2025-03-14T05:40:14.5398815Z parm: NVreg_EnablePCIeGen3:int 2025-03-14T05:40:14.5399146Z parm: NVreg_EnableMSI:int 2025-03-14T05:40:14.5399468Z parm: NVreg_TCEBypassMode:int 2025-03-14T05:40:14.5399817Z parm: NVreg_EnableStreamMemOPs:int 2025-03-14T05:40:14.5400218Z parm: NVreg_RestrictProfilingToAdminUsers:int 2025-03-14T05:40:14.5400646Z parm: NVreg_PreserveVideoMemoryAllocations:int 2025-03-14T05:40:14.5401062Z parm: NVreg_EnableS0ixPowerManagement:int 2025-03-14T05:40:14.5401504Z parm: NVreg_S0ixPowerManagementVideoMemoryThreshold:int 2025-03-14T05:40:14.5401945Z parm: NVreg_DynamicPowerManagement:int 2025-03-14T05:40:14.5402390Z parm: NVreg_DynamicPowerManagementVideoMemoryThreshold:int 2025-03-14T05:40:14.5402825Z parm: NVreg_EnableGpuFirmware:int 2025-03-14T05:40:14.5403191Z parm: NVreg_EnableGpuFirmwareLogs:int 2025-03-14T05:40:14.5403587Z parm: NVreg_OpenRmEnableUnsupportedGpus:int 2025-03-14T05:40:14.5403988Z parm: NVreg_EnableUserNUMAManagement:int 2025-03-14T05:40:14.5404354Z parm: NVreg_MemoryPoolSize:int 2025-03-14T05:40:14.5404797Z parm: NVreg_KMallocHeapMaxSize:int 2025-03-14T05:40:14.5405294Z parm: NVreg_VMallocHeapMaxSize:int 2025-03-14T05:40:14.5405805Z parm: NVreg_IgnoreMMIOCheck:int 2025-03-14T05:40:14.5406290Z parm: NVreg_NvLinkDisable:int 2025-03-14T05:40:14.5406819Z parm: NVreg_EnablePCIERelaxedOrderingMode:int 2025-03-14T05:40:14.5407298Z parm: NVreg_RegisterPCIDriver:int 2025-03-14T05:40:14.5407649Z parm: NVreg_EnableResizableBar:int 2025-03-14T05:40:14.5408013Z parm: NVreg_EnableDbgBreakpoint:int 2025-03-14T05:40:14.5408386Z parm: NVreg_EnableNonblockingOpen:int 2025-03-14T05:40:14.5408750Z parm: NVreg_RegistryDwords:charp 2025-03-14T05:40:14.5409120Z parm: NVreg_RegistryDwordsPerDevice:charp 2025-03-14T05:40:14.5409476Z parm: NVreg_RmMsg:charp 2025-03-14T05:40:14.5409794Z parm: NVreg_GpuBlacklist:charp 2025-03-14T05:40:14.5410148Z parm: NVreg_TemporaryFilePath:charp 2025-03-14T05:40:14.5410498Z parm: NVreg_ExcludedGpus:charp 2025-03-14T05:40:14.5410840Z parm: NVreg_DmaRemapPeerMmio:int 2025-03-14T05:40:14.5411205Z parm: NVreg_RmNvlinkBandwidth:charp 2025-03-14T05:40:14.5411565Z parm: NVreg_ImexChannelCount:int 2025-03-14T05:40:14.5411914Z parm: rm_firmware_active:charp 2025-03-14T05:40:14.5412244Z + HAS_NVIDIA_DRIVER=0 2025-03-14T05:40:14.5412520Z ++ command -v nvidia-smi 2025-03-14T05:40:14.5412811Z + '[' -x /usr/bin/nvidia-smi ']' 2025-03-14T05:40:14.5413094Z + set +e 2025-03-14T05:40:14.5413434Z ++ nvidia-smi --query-gpu=driver_version --format=csv,noheader --id=0 2025-03-14T05:40:16.7954609Z + INSTALLED_DRIVER_VERSION=550.54.15 2025-03-14T05:40:16.7955129Z + NVIDIA_SMI_STATUS=0 2025-03-14T05:40:16.7955463Z + '[' 0 -ne 0 ']' 2025-03-14T05:40:16.7955723Z + '[' 550.54.15 '!=' 550.54.15 ']' 2025-03-14T05:40:16.7956120Z + HAS_NVIDIA_DRIVER=1 2025-03-14T05:40:16.7956755Z + echo 'NVIDIA driver (550.54.15) has already been installed. 
Skipping NVIDIA driver installation' 2025-03-14T05:40:16.7957360Z + set -e 2025-03-14T05:40:16.7957586Z + '[' 1 -eq 0 ']' 2025-03-14T05:40:16.7958665Z NVIDIA driver (550.54.15) has already been installed. Skipping NVIDIA driver installation 2025-03-14T05:40:16.7959469Z + post_install_nvidia_driver_common 2025-03-14T05:40:16.7961755Z + sudo modprobe nvidia 2025-03-14T05:40:16.8960249Z + echo 'After installing NVIDIA driver' 2025-03-14T05:40:16.8960624Z + lspci 2025-03-14T05:40:16.8960936Z After installing NVIDIA driver 2025-03-14T05:40:16.9076737Z 00:00.0 Host bridge: Intel Corporation 440FX - 82441FX PMC [Natoma] 2025-03-14T05:40:16.9077280Z 00:01.0 ISA bridge: Intel Corporation 82371SB PIIX3 ISA [Natoma/Triton II] 2025-03-14T05:40:16.9077861Z 00:01.3 Non-VGA unclassified device: Intel Corporation 82371AB/EB/MB PIIX4 ACPI (rev 08) 2025-03-14T05:40:16.9078405Z 00:03.0 VGA compatible controller: Amazon.com, Inc. Device 1111 2025-03-14T05:40:16.9078922Z 00:04.0 Non-Volatile memory controller: Amazon.com, Inc. NVMe EBS Controller 2025-03-14T05:40:16.9079497Z 00:05.0 Ethernet controller: Amazon.com, Inc. Elastic Network Adapter (ENA) 2025-03-14T05:40:16.9080038Z 00:1e.0 3D controller: NVIDIA Corporation GA102GL [A10G] (rev a1) 2025-03-14T05:40:16.9080563Z 00:1f.0 Non-Volatile memory controller: Amazon.com, Inc. NVMe SSD Controller 2025-03-14T05:40:16.9081179Z + lsmod 2025-03-14T05:40:16.9111520Z Module Size Used by 2025-03-14T05:40:16.9112135Z nvidia_uvm 4706304 0 2025-03-14T05:40:16.9112638Z nvidia 54071296 1 nvidia_uvm 2025-03-14T05:40:16.9113521Z drm 602112 1 nvidia 2025-03-14T05:40:16.9114406Z drm_panel_orientation_quirks 32768 1 drm 2025-03-14T05:40:16.9114988Z backlight 24576 1 drm 2025-03-14T05:40:16.9115982Z i2c_core 110592 2 nvidia,drm 2025-03-14T05:40:16.9116683Z xt_conntrack 16384 1 2025-03-14T05:40:16.9117978Z nft_chain_nat 16384 3 2025-03-14T05:40:16.9118488Z xt_MASQUERADE 20480 1 2025-03-14T05:40:16.9119411Z nf_nat 57344 2 nft_chain_nat,xt_MASQUERADE 2025-03-14T05:40:16.9120094Z nf_conntrack_netlink 57344 0 2025-03-14T05:40:16.9120831Z nf_conntrack 184320 4 xt_conntrack,nf_nat,nf_conntrack_netlink,xt_MASQUERADE 2025-03-14T05:40:16.9121536Z nf_defrag_ipv6 24576 1 nf_conntrack 2025-03-14T05:40:16.9121972Z nf_defrag_ipv4 16384 1 nf_conntrack 2025-03-14T05:40:16.9122438Z xfrm_user 57344 1 2025-03-14T05:40:16.9122844Z xfrm_algo 16384 1 xfrm_user 2025-03-14T05:40:16.9123231Z xt_addrtype 16384 2 2025-03-14T05:40:16.9123700Z nft_compat 20480 4 2025-03-14T05:40:16.9124131Z nf_tables 311296 57 nft_compat,nft_chain_nat 2025-03-14T05:40:16.9124701Z nfnetlink 20480 4 nft_compat,nf_conntrack_netlink,nf_tables 2025-03-14T05:40:16.9125238Z br_netfilter 36864 0 2025-03-14T05:40:16.9125637Z bridge 323584 1 br_netfilter 2025-03-14T05:40:16.9126125Z stp 16384 1 bridge 2025-03-14T05:40:16.9126535Z llc 16384 2 bridge,stp 2025-03-14T05:40:16.9126928Z overlay 167936 0 2025-03-14T05:40:16.9127386Z tls 135168 0 2025-03-14T05:40:16.9127755Z nls_ascii 16384 1 2025-03-14T05:40:16.9128189Z nls_cp437 20480 1 2025-03-14T05:40:16.9128579Z vfat 24576 1 2025-03-14T05:40:16.9128950Z fat 86016 1 vfat 2025-03-14T05:40:16.9129410Z sunrpc 696320 1 2025-03-14T05:40:16.9129749Z ena 180224 0 2025-03-14T05:40:16.9130109Z i8042 45056 0 2025-03-14T05:40:16.9130556Z serio 28672 3 i8042 2025-03-14T05:40:16.9130955Z ghash_clmulni_intel 16384 0 2025-03-14T05:40:16.9131342Z button 24576 0 2025-03-14T05:40:16.9131909Z sch_fq_codel 20480 17 2025-03-14T05:40:16.9132401Z dm_mod 188416 0 2025-03-14T05:40:16.9132997Z configfs 57344 1 
2025-03-14T05:40:16.9133369Z fuse 163840 1 2025-03-14T05:40:16.9133803Z loop 36864 0 2025-03-14T05:40:16.9134419Z dax 45056 1 dm_mod 2025-03-14T05:40:16.9134835Z dmi_sysfs 20480 0 2025-03-14T05:40:16.9135425Z crc32_pclmul 16384 0 2025-03-14T05:40:16.9135825Z crc32c_intel 24576 0 2025-03-14T05:40:16.9136219Z efivarfs 24576 1 2025-03-14T05:40:16.9136854Z + modinfo nvidia 2025-03-14T05:40:16.9137707Z filename: /lib/modules/6.1.129-138.220.amzn2023.x86_64/kernel/drivers/video/nvidia.ko 2025-03-14T05:40:16.9138424Z alias: char-major-195-* 2025-03-14T05:40:16.9138854Z version: 550.54.15 2025-03-14T05:40:16.9139237Z supported: external 2025-03-14T05:40:16.9139643Z license: NVIDIA 2025-03-14T05:40:16.9140060Z firmware: nvidia/550.54.15/gsp_tu10x.bin 2025-03-14T05:40:16.9140502Z firmware: nvidia/550.54.15/gsp_ga10x.bin 2025-03-14T05:40:16.9140964Z srcversion: 833721318DA517F0C2FEC97 2025-03-14T05:40:16.9141426Z alias: pci:v000010DEd*sv*sd*bc06sc80i00* 2025-03-14T05:40:16.9141880Z alias: pci:v000010DEd*sv*sd*bc03sc02i00* 2025-03-14T05:40:16.9142415Z alias: pci:v000010DEd*sv*sd*bc03sc00i00* 2025-03-14T05:40:16.9143146Z depends: i2c-core,drm 2025-03-14T05:40:16.9143993Z retpoline: Y 2025-03-14T05:40:16.9144378Z name: nvidia 2025-03-14T05:40:16.9144918Z vermagic: 6.1.129-138.220.amzn2023.x86_64 SMP preempt mod_unload modversions 2025-03-14T05:40:16.9145578Z parm: NvSwitchRegDwords:NvSwitch regkey (charp) 2025-03-14T05:40:16.9146141Z parm: NvSwitchBlacklist:NvSwitchBlacklist=uuid[,uuid...] (charp) 2025-03-14T05:40:16.9146660Z parm: NVreg_ResmanDebugLevel:int 2025-03-14T05:40:16.9147168Z parm: NVreg_RmLogonRC:int 2025-03-14T05:40:16.9147579Z parm: NVreg_ModifyDeviceFiles:int 2025-03-14T05:40:16.9148109Z parm: NVreg_DeviceFileUID:int 2025-03-14T05:40:16.9148563Z parm: NVreg_DeviceFileGID:int 2025-03-14T05:40:16.9148981Z parm: NVreg_DeviceFileMode:int 2025-03-14T05:40:16.9149539Z parm: NVreg_InitializeSystemMemoryAllocations:int 2025-03-14T05:40:16.9150095Z parm: NVreg_UsePageAttributeTable:int 2025-03-14T05:40:16.9150559Z parm: NVreg_EnablePCIeGen3:int 2025-03-14T05:40:16.9151037Z parm: NVreg_EnableMSI:int 2025-03-14T05:40:16.9151439Z parm: NVreg_TCEBypassMode:int 2025-03-14T05:40:16.9151930Z parm: NVreg_EnableStreamMemOPs:int 2025-03-14T05:40:16.9152429Z parm: NVreg_RestrictProfilingToAdminUsers:int 2025-03-14T05:40:16.9152906Z parm: NVreg_PreserveVideoMemoryAllocations:int 2025-03-14T05:40:16.9153464Z parm: NVreg_EnableS0ixPowerManagement:int 2025-03-14T05:40:16.9154005Z parm: NVreg_S0ixPowerManagementVideoMemoryThreshold:int 2025-03-14T05:40:16.9154641Z parm: NVreg_DynamicPowerManagement:int 2025-03-14T05:40:16.9155233Z parm: NVreg_DynamicPowerManagementVideoMemoryThreshold:int 2025-03-14T05:40:16.9155817Z parm: NVreg_EnableGpuFirmware:int 2025-03-14T05:40:16.9156292Z parm: NVreg_EnableGpuFirmwareLogs:int 2025-03-14T05:40:16.9156855Z parm: NVreg_OpenRmEnableUnsupportedGpus:int 2025-03-14T05:40:16.9157371Z parm: NVreg_EnableUserNUMAManagement:int 2025-03-14T05:40:16.9157944Z parm: NVreg_MemoryPoolSize:int 2025-03-14T05:40:16.9158396Z parm: NVreg_KMallocHeapMaxSize:int 2025-03-14T05:40:16.9158869Z parm: NVreg_VMallocHeapMaxSize:int 2025-03-14T05:40:16.9159360Z parm: NVreg_IgnoreMMIOCheck:int 2025-03-14T05:40:16.9159811Z parm: NVreg_NvLinkDisable:int 2025-03-14T05:40:16.9160257Z parm: NVreg_EnablePCIERelaxedOrderingMode:int 2025-03-14T05:40:16.9160795Z parm: NVreg_RegisterPCIDriver:int 2025-03-14T05:40:16.9161322Z parm: NVreg_EnableResizableBar:int 2025-03-14T05:40:16.9161815Z parm: 
NVreg_EnableDbgBreakpoint:int 2025-03-14T05:40:16.9162319Z parm: NVreg_EnableNonblockingOpen:int 2025-03-14T05:40:16.9162896Z parm: NVreg_RegistryDwords:charp 2025-03-14T05:40:16.9163431Z parm: NVreg_RegistryDwordsPerDevice:charp 2025-03-14T05:40:16.9163969Z parm: NVreg_RmMsg:charp 2025-03-14T05:40:16.9164371Z parm: NVreg_GpuBlacklist:charp 2025-03-14T05:40:16.9164904Z parm: NVreg_TemporaryFilePath:charp 2025-03-14T05:40:16.9165344Z parm: NVreg_ExcludedGpus:charp 2025-03-14T05:40:16.9165754Z parm: NVreg_DmaRemapPeerMmio:int 2025-03-14T05:40:16.9166331Z parm: NVreg_RmNvlinkBandwidth:charp 2025-03-14T05:40:16.9166778Z parm: NVreg_ImexChannelCount:int 2025-03-14T05:40:16.9167268Z parm: rm_firmware_active:charp 2025-03-14T05:40:16.9167678Z + set +e 2025-03-14T05:40:16.9168483Z + nvidia-smi 2025-03-14T05:40:18.4816998Z Fri Mar 14 05:40:18 2025 2025-03-14T05:40:18.4817873Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:40:18.4818608Z | NVIDIA-SMI 550.54.15 Driver Version: 550.54.15 CUDA Version: 12.4 | 2025-03-14T05:40:18.4819261Z |-----------------------------------------+------------------------+----------------------+ 2025-03-14T05:40:18.4819903Z | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | 2025-03-14T05:40:18.4820597Z | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | 2025-03-14T05:40:18.4821151Z | | | MIG M. | 2025-03-14T05:40:18.4821667Z |=========================================+========================+======================| 2025-03-14T05:40:18.4913610Z | 0 NVIDIA A10G Off | 00000000:00:1E.0 Off | 0 | 2025-03-14T05:40:18.4914259Z | 0% 20C P0 53W / 300W | 0MiB / 23028MiB | 5% Default | 2025-03-14T05:40:18.4914812Z | | | N/A | 2025-03-14T05:40:18.4915349Z +-----------------------------------------+------------------------+----------------------+ 2025-03-14T05:40:18.4915896Z 2025-03-14T05:40:18.4916449Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:40:18.4917017Z | Processes: | 2025-03-14T05:40:18.4917573Z | GPU GI CI PID Type Process name GPU Memory | 2025-03-14T05:40:18.4918177Z | ID ID Usage | 2025-03-14T05:40:18.4918667Z |=========================================================================================| 2025-03-14T05:40:18.4919504Z | No running processes found | 2025-03-14T05:40:18.4920212Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:40:19.1144781Z + nvidia-smi --query-gpu=gpu_name --format=csv,noheader --id=0 2025-03-14T05:40:20.6763880Z NVIDIA A10G 2025-03-14T05:40:21.1396213Z + NVIDIA_SMI_STATUS=0 2025-03-14T05:40:21.1396581Z + '[' 0 -eq 0 ']' 2025-03-14T05:40:21.1397015Z + echo 'INFO: Ignoring allowed status 0' 2025-03-14T05:40:21.1397462Z + set -e 2025-03-14T05:40:21.1397792Z INFO: Ignoring allowed status 0 2025-03-14T05:40:21.1406256Z == Installing nvidia container toolkit for amzn2023 == 2025-03-14T05:40:21.1409286Z + sudo yum install -y yum-utils 2025-03-14T05:40:21.5862720Z Last metadata expiration check: 0:10:19 ago on Fri Mar 14 05:30:02 2025. 2025-03-14T05:40:21.6102139Z Package dnf-utils-4.3.0-13.amzn2023.0.5.noarch is already installed. 2025-03-14T05:40:21.6489578Z Dependencies resolved. 2025-03-14T05:40:21.6672631Z Nothing to do. 2025-03-14T05:40:21.6673017Z Complete! 
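The driver verification above leans on two nvidia-smi behaviours: a crashed driver can still report its version and bus ID, so the script queries a field such as the GPU name that goes missing (ERR!) in that state, and it treats exit codes 0 and 14 as healthy. A minimal sketch of that health check, lifted from the logic in the setup-nvidia step above:

#!/usr/bin/env bash
# Query a field that disappears when the driver is wedged; a healthy GPU prints its name.
nvidia-smi --query-gpu=gpu_name --format=csv,noheader --id=0
status=$?
# 0 and 14 are the allowed statuses per the script above; anything else is treated as a failure.
if [ "$status" -eq 0 ] || [ "$status" -eq 14 ]; then
  echo "INFO: Ignoring allowed status ${status}"
else
  echo "ERROR: nvidia-smi exited with unresolved status ${status}"
  exit "$status"
fi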
2025-03-14T05:40:21.7063533Z + [[ amzn2023 == \a\m\z\n\2\0\2\3 ]] 2025-03-14T05:40:21.7064422Z + YUM_REPO_URL=https://nvidia.github.io/libnvidia-container/stable/rpm/nvidia-container-toolkit.repo 2025-03-14T05:40:21.7065589Z + sudo yum-config-manager --add-repo https://nvidia.github.io/libnvidia-container/stable/rpm/nvidia-container-toolkit.repo 2025-03-14T05:40:21.9955373Z Adding repo from: https://nvidia.github.io/libnvidia-container/stable/rpm/nvidia-container-toolkit.repo 2025-03-14T05:40:22.0503658Z + sudo yum install -y nvidia-docker2 nvidia-container-toolkit-1.16.2 2025-03-14T05:40:22.5501990Z nvidia-container-toolkit 15 kB/s | 833 B 00:00 2025-03-14T05:40:22.5746860Z Package nvidia-docker2-2.14.0-1.noarch is already installed. 2025-03-14T05:40:22.6135888Z Dependencies resolved. 2025-03-14T05:40:22.6311471Z ================================================================================ 2025-03-14T05:40:22.6311924Z Package Arch Version Repository Size 2025-03-14T05:40:22.6312711Z ================================================================================ 2025-03-14T05:40:22.6313491Z Downgrading: 2025-03-14T05:40:22.6314438Z nvidia-container-toolkit x86_64 1.16.2-1 nvidia-container-toolkit 1.2 M 2025-03-14T05:40:22.6315782Z nvidia-container-toolkit-base x86_64 1.16.2-1 nvidia-container-toolkit 5.6 M 2025-03-14T05:40:22.6316711Z 2025-03-14T05:40:22.6316943Z Transaction Summary 2025-03-14T05:40:22.6317502Z ================================================================================ 2025-03-14T05:40:22.6318181Z Downgrade 2 Packages 2025-03-14T05:40:22.6318509Z 2025-03-14T05:40:22.6318758Z Total download size: 6.8 M 2025-03-14T05:40:22.6319179Z Downloading Packages: 2025-03-14T05:40:22.6677378Z (1/2): nvidia-container-toolkit-1.16.2-1.x86_64 35 MB/s | 1.2 MB 00:00 2025-03-14T05:40:22.7187723Z (2/2): nvidia-container-toolkit-base-1.16.2-1.x 65 MB/s | 5.6 MB 00:00 2025-03-14T05:40:22.7198034Z -------------------------------------------------------------------------------- 2025-03-14T05:40:22.7199706Z Total 78 MB/s | 6.8 MB 00:00 2025-03-14T05:40:22.7202297Z Running transaction check 2025-03-14T05:40:22.7303109Z Transaction check succeeded. 2025-03-14T05:40:22.7304056Z Running transaction test 2025-03-14T05:40:22.7601081Z Transaction test succeeded. 
2025-03-14T05:40:22.7603951Z Running transaction 2025-03-14T05:40:23.3100220Z Preparing : 1/1 2025-03-14T05:40:23.4158825Z Downgrading : nvidia-container-toolkit-base-1.16.2-1.x86_64 1/4 2025-03-14T05:40:23.4183816Z Downgrading : nvidia-container-toolkit-1.16.2-1.x86_64 2/4 2025-03-14T05:40:23.4398254Z Running scriptlet: nvidia-container-toolkit-1.16.2-1.x86_64 2/4 2025-03-14T05:40:23.4398870Z Cleanup : nvidia-container-toolkit-1.17.5-1.x86_64 3/4 2025-03-14T05:40:23.4497326Z Running scriptlet: nvidia-container-toolkit-1.17.5-1.x86_64 3/4 2025-03-14T05:40:23.4519647Z Cleanup : nvidia-container-toolkit-base-1.17.5-1.x86_64 4/4 2025-03-14T05:41:09.6653057Z Running scriptlet: nvidia-container-toolkit-1.16.2-1.x86_64 4/4 2025-03-14T05:41:09.6655285Z Verifying : nvidia-container-toolkit-1.16.2-1.x86_64 1/4 2025-03-14T05:41:09.6655901Z Verifying : nvidia-container-toolkit-1.17.5-1.x86_64 2/4 2025-03-14T05:41:09.6656480Z Verifying : nvidia-container-toolkit-base-1.16.2-1.x86_64 3/4 2025-03-14T05:41:09.8355291Z Verifying : nvidia-container-toolkit-base-1.17.5-1.x86_64 4/4 2025-03-14T05:41:09.8355653Z 2025-03-14T05:41:09.8355756Z Downgraded: 2025-03-14T05:41:09.8356141Z nvidia-container-toolkit-1.16.2-1.x86_64 2025-03-14T05:41:09.8356743Z nvidia-container-toolkit-base-1.16.2-1.x86_64 2025-03-14T05:41:09.8357101Z 2025-03-14T05:41:09.8357203Z Complete! 2025-03-14T05:41:09.8779157Z + sudo systemctl restart docker 2025-03-14T05:41:16.1979360Z Fri Mar 14 05:41:16 2025 2025-03-14T05:41:16.1980186Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:41:16.1980723Z | NVIDIA-SMI 550.54.15 Driver Version: 550.54.15 CUDA Version: 12.4 | 2025-03-14T05:41:16.1981234Z |-----------------------------------------+------------------------+----------------------+ 2025-03-14T05:41:16.1981761Z | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | 2025-03-14T05:41:16.1982317Z | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | 2025-03-14T05:41:16.1982783Z | | | MIG M. | 2025-03-14T05:41:16.1983150Z |=========================================+========================+======================| 2025-03-14T05:41:16.2108880Z | 0 NVIDIA A10G On | 00000000:00:1E.0 Off | 0 | 2025-03-14T05:41:16.2109387Z | 0% 20C P0 53W / 300W | 0MiB / 23028MiB | 5% Default | 2025-03-14T05:41:16.2109798Z | | | N/A | 2025-03-14T05:41:16.2110225Z +-----------------------------------------+------------------------+----------------------+ 2025-03-14T05:41:16.2110654Z 2025-03-14T05:41:16.2111086Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:41:16.2111542Z | Processes: | 2025-03-14T05:41:16.2112022Z | GPU GI CI PID Type Process name GPU Memory | 2025-03-14T05:41:16.2112473Z | ID ID Usage | 2025-03-14T05:41:16.2112856Z |=========================================================================================| 2025-03-14T05:41:16.2113673Z | No running processes found | 2025-03-14T05:41:16.2114170Z +-----------------------------------------------------------------------------------------+ 2025-03-14T05:41:16.7078976Z Command completed after 1 attempt(s). 
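With the driver and container toolkit in place, the setup step exports the docker flags that later steps use to attach the GPU, and enables persistence mode once (the daemon exits non-zero if the mode is already on, hence the || true). A minimal sketch of that tail end of the setup, assuming it runs inside a GitHub Actions step where GITHUB_ENV is defined:

#!/usr/bin/env bash
set -euo pipefail
# Expose the GPU to subsequent docker run invocations in this job.
echo "GPU_FLAG=--gpus all -e NVIDIA_DRIVER_CAPABILITIES=all" >> "${GITHUB_ENV}"
# Enable persistence mode; harmless if it is already enabled.
sudo nvidia-persistenced || true
# Sanity check: persistence mode should now show as ON.
nvidia-smi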
2025-03-14T05:41:16.7178724Z Prepare all required actions 2025-03-14T05:41:16.7207441Z ##[group]Run ./.github/actions/get-workflow-job-id 2025-03-14T05:41:16.7207796Z with: 2025-03-14T05:41:16.7208199Z github-token: *** 2025-03-14T05:41:16.7208457Z env: 2025-03-14T05:41:16.7208691Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:16.7209041Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:16.7209412Z ##[endgroup] 2025-03-14T05:41:16.7231066Z ##[group]Run set -eux 2025-03-14T05:41:16.7231342Z set -eux 2025-03-14T05:41:16.7231781Z python3 .github/scripts/get_workflow_job_id.py "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-03-14T05:41:16.7246237Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:16.7246619Z env: 2025-03-14T05:41:16.7246858Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:16.7247209Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:16.7247714Z GITHUB_TOKEN: *** 2025-03-14T05:41:16.7247968Z ##[endgroup] 2025-03-14T05:41:16.7280386Z + python3 .github/scripts/get_workflow_job_id.py 13849515380 i-0995e781c94ad14d3 2025-03-14T05:41:17.7964425Z setting job-id=38756916179 2025-03-14T05:41:17.7964997Z setting job-name=cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:17.8088509Z ##[group]Run python3 -m pip install psutil==5.9.1 nvidia-ml-py==11.525.84 dataclasses_json==0.6.7 2025-03-14T05:41:17.8089227Z python3 -m pip install psutil==5.9.1 nvidia-ml-py==11.525.84 dataclasses_json==0.6.7 2025-03-14T05:41:17.8089797Z python3 -m tools.stats.monitor > usage_log.txt 2>&1 & 2025-03-14T05:41:17.8090479Z echo "monitor-script-pid=${!}" >> "${GITHUB_OUTPUT}" 2025-03-14T05:41:17.8099248Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:17.8099639Z env: 2025-03-14T05:41:17.8099882Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:17.8100236Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:17.8100608Z JOB_ID: 38756916179 2025-03-14T05:41:17.8101085Z JOB_NAME: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:17.8101621Z WORKFLOW_NAME: inductor 2025-03-14T05:41:17.8101912Z WORKFLOW_RUN_ID: 13849515380 2025-03-14T05:41:17.8102228Z ##[endgroup] 2025-03-14T05:41:18.0795175Z Defaulting to user installation because normal site-packages is not writeable 2025-03-14T05:41:18.4648141Z Collecting psutil==5.9.1 2025-03-14T05:41:18.4878972Z Downloading psutil-5.9.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (281 kB) 2025-03-14T05:41:18.5283719Z Collecting nvidia-ml-py==11.525.84 2025-03-14T05:41:18.5315403Z Downloading nvidia_ml_py-11.525.84-py3-none-any.whl (34 kB) 2025-03-14T05:41:18.5941646Z Collecting dataclasses_json==0.6.7 2025-03-14T05:41:18.5975103Z Downloading dataclasses_json-0.6.7-py3-none-any.whl (28 kB) 2025-03-14T05:41:18.7226646Z Collecting marshmallow<4.0.0,>=3.18.0 2025-03-14T05:41:18.7257759Z Downloading marshmallow-3.26.1-py3-none-any.whl (50 kB) 2025-03-14T05:41:18.7475063Z Collecting typing-inspect<1,>=0.4.0 2025-03-14T05:41:18.7504243Z Downloading typing_inspect-0.9.0-py3-none-any.whl (8.8 kB) 2025-03-14T05:41:18.8048087Z Collecting packaging>=17.0 2025-03-14T05:41:18.8081943Z Downloading packaging-24.2-py3-none-any.whl (65 kB) 2025-03-14T05:41:18.8291028Z Collecting mypy-extensions>=0.3.0 2025-03-14T05:41:18.8322465Z Downloading mypy_extensions-1.0.0-py3-none-any.whl (4.7 kB) 2025-03-14T05:41:18.8748589Z Collecting 
typing-extensions>=3.7.4 2025-03-14T05:41:18.8781472Z Downloading typing_extensions-4.12.2-py3-none-any.whl (37 kB) 2025-03-14T05:41:18.9673565Z Installing collected packages: typing-extensions, packaging, mypy-extensions, typing-inspect, marshmallow, psutil, nvidia-ml-py, dataclasses-json 2025-03-14T05:41:19.2161554Z Successfully installed dataclasses-json-0.6.7 marshmallow-3.26.1 mypy-extensions-1.0.0 nvidia-ml-py-11.525.84 packaging-24.2 psutil-5.9.1 typing-extensions-4.12.2 typing-inspect-0.9.0 2025-03-14T05:41:19.4075412Z Prepare all required actions 2025-03-14T05:41:19.4075799Z Getting action download info 2025-03-14T05:41:19.5168688Z Download action repository 'seemethere/download-artifact-s3@v4' (SHA:1da556a7aa0a088e3153970611f6c432d58e80e6) 2025-03-14T05:41:19.7376388Z Download action repository 'actions/download-artifact@v4' (SHA:cc203385981b70ca67e1cc392babf9cc229d5806) 2025-03-14T05:41:19.9897890Z ##[group]Run ./.github/actions/download-build-artifacts 2025-03-14T05:41:19.9898376Z with: 2025-03-14T05:41:19.9898704Z name: linux-focal-cuda12.6-py3.10-gcc9-sm86 2025-03-14T05:41:19.9899111Z s3-bucket: gha-artifacts 2025-03-14T05:41:19.9899389Z env: 2025-03-14T05:41:19.9899673Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:19.9900027Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:19.9900395Z ##[endgroup] 2025-03-14T05:41:19.9961638Z ##[group]Run seemethere/download-artifact-s3@v4 2025-03-14T05:41:19.9961989Z with: 2025-03-14T05:41:19.9962295Z name: linux-focal-cuda12.6-py3.10-gcc9-sm86 2025-03-14T05:41:19.9962647Z s3-bucket: gha-artifacts 2025-03-14T05:41:19.9962932Z region: us-east-1 2025-03-14T05:41:19.9963175Z env: 2025-03-14T05:41:19.9963418Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:19.9963767Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:19.9964133Z ##[endgroup] 2025-03-14T05:41:20.4606032Z (node:54116) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-03-14T05:41:20.4606540Z 2025-03-14T05:41:20.4607086Z Please migrate your code to use AWS SDK for JavaScript (v3). 
2025-03-14T05:41:20.4607618Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-03-14T05:41:20.4608181Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-03-14T05:41:20.6702493Z Found 1 objects with prefix pytorch/pytorch/13849515380/linux-focal-cuda12.6-py3.10-gcc9-sm86/ 2025-03-14T05:41:20.6704028Z Starting download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-03-14T05:41:28.9428227Z Finished download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-03-14T05:41:28.9434959Z Artifact download has finished successfully 2025-03-14T05:41:28.9801384Z ##[group]Run unzip -o artifacts.zip 2025-03-14T05:41:28.9801738Z unzip -o artifacts.zip 2025-03-14T05:41:28.9811093Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:28.9811476Z env: 2025-03-14T05:41:28.9811708Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:28.9812073Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:28.9812443Z ##[endgroup] 2025-03-14T05:41:28.9879907Z Archive: artifacts.zip 2025-03-14T05:41:28.9881043Z creating: dist/ 2025-03-14T05:41:31.3191473Z inflating: dist/torch-2.8.0a0+gitaed0b7a-cp310-cp310-linux_x86_64.whl 2025-03-14T05:41:31.3325336Z inflating: dist/.ninja_log 2025-03-14T05:41:31.3326039Z creating: build/custom_test_artifacts/ 2025-03-14T05:41:31.3326661Z creating: build/custom_test_artifacts/custom-op-build/ 2025-03-14T05:41:31.3327353Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/ 2025-03-14T05:41:31.3328158Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/pkgRedirects/ 2025-03-14T05:41:31.3336221Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeConfigureLog.yaml 2025-03-14T05:41:31.3337146Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/ 2025-03-14T05:41:31.3338069Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeSystem.cmake 2025-03-14T05:41:31.3339005Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdC/ 2025-03-14T05:41:31.3339682Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdC/tmp/ 2025-03-14T05:41:31.3341801Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdC/CMakeCCompilerId.c 2025-03-14T05:41:31.3343548Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdC/a.out 2025-03-14T05:41:31.3344596Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeCCompiler.cmake 2025-03-14T05:41:31.3345606Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCXX/ 2025-03-14T05:41:31.3346312Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCXX/tmp/ 2025-03-14T05:41:31.3348551Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-03-14T05:41:31.3350076Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCXX/a.out 2025-03-14T05:41:31.3351724Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeCXXCompiler.cmake 2025-03-14T05:41:31.3354017Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_C.bin 2025-03-14T05:41:31.3356744Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CXX.bin 2025-03-14T05:41:31.3357850Z creating: 
build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/ 2025-03-14T05:41:31.3358574Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/ 2025-03-14T05:41:31.3400471Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp4.ii 2025-03-14T05:41:31.3441290Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.cpp 2025-03-14T05:41:31.3442501Z extracting: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.module_id 2025-03-14T05:41:31.3489866Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp1.ii 2025-03-14T05:41:31.3491121Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.c 2025-03-14T05:41:31.3492539Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.gpu 2025-03-14T05:41:31.3493930Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.stub.c 2025-03-14T05:41:31.3495257Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.ptx 2025-03-14T05:41:31.3496573Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.sm_52.cubin 2025-03-14T05:41:31.3497906Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin 2025-03-14T05:41:31.3499157Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin.c 2025-03-14T05:41:31.3500539Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.o 2025-03-14T05:41:31.3501821Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.sm_52.cubin 2025-03-14T05:41:31.3502920Z extracting: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.reg.c 2025-03-14T05:41:31.3503852Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin 2025-03-14T05:41:31.3504834Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin.c 2025-03-14T05:41:31.3505846Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.o 2025-03-14T05:41:31.3507812Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/CMakeCUDACompilerId.cu 2025-03-14T05:41:31.3582876Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CompilerIdCUDA/a.out 2025-03-14T05:41:31.3583860Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeCUDACompiler.cmake 2025-03-14T05:41:31.3659550Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CUDA.bin 2025-03-14T05:41:31.3660867Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeScratch/ 2025-03-14T05:41:31.3661698Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeTmp/ 2025-03-14T05:41:31.3662599Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/cmake.check_cache 
2025-03-14T05:41:31.3663506Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/ 2025-03-14T05:41:31.3664515Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.ts 2025-03-14T05:41:31.3665522Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.make 2025-03-14T05:41:31.3666353Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/depend.make 2025-03-14T05:41:31.3667165Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/link.txt 2025-03-14T05:41:31.3668436Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/cmake_clean.cmake 2025-03-14T05:41:31.3669462Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/build.make 2025-03-14T05:41:31.3670723Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/DependInfo.cmake 2025-03-14T05:41:31.3671613Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/flags.make 2025-03-14T05:41:31.3672366Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/progress.make 2025-03-14T05:41:31.3692373Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o.d 2025-03-14T05:41:31.3890989Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o 2025-03-14T05:41:31.3891977Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/ 2025-03-14T05:41:31.3893041Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.ts 2025-03-14T05:41:31.3894233Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.make 2025-03-14T05:41:31.3895369Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/depend.make 2025-03-14T05:41:31.3896369Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/link.txt 2025-03-14T05:41:31.3897391Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/cmake_clean.cmake 2025-03-14T05:41:31.3898498Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/build.make 2025-03-14T05:41:31.3899635Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/DependInfo.cmake 2025-03-14T05:41:31.3900773Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/flags.make 2025-03-14T05:41:31.3901555Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/progress.make 2025-03-14T05:41:31.3922225Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o.d 2025-03-14T05:41:31.4005569Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o 2025-03-14T05:41:31.4006767Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-03-14T05:41:31.4007860Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/TargetDirectories.txt 2025-03-14T05:41:31.4008913Z extracting: build/custom_test_artifacts/custom-op-build/CMakeFiles/progress.marks 2025-03-14T05:41:31.4009839Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile2 2025-03-14T05:41:31.4010848Z inflating: 
build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile.cmake 2025-03-14T05:41:31.4011750Z inflating: build/custom_test_artifacts/custom-op-build/detect_cuda_version.cc 2025-03-14T05:41:31.4014867Z inflating: build/custom_test_artifacts/custom-op-build/CMakeCache.txt 2025-03-14T05:41:31.4015662Z inflating: build/custom_test_artifacts/custom-op-build/Makefile 2025-03-14T05:41:31.4016670Z inflating: build/custom_test_artifacts/custom-op-build/cmake_install.cmake 2025-03-14T05:41:31.4181610Z inflating: build/custom_test_artifacts/custom-op-build/libcustom_ops.so 2025-03-14T05:41:31.4244988Z inflating: build/custom_test_artifacts/custom-op-build/test_custom_ops 2025-03-14T05:41:31.4245719Z creating: build/custom_test_artifacts/jit-hook-build/ 2025-03-14T05:41:31.4246385Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/ 2025-03-14T05:41:31.4246976Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/pkgRedirects/ 2025-03-14T05:41:31.4254363Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeConfigureLog.yaml 2025-03-14T05:41:31.4255269Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/ 2025-03-14T05:41:31.4256152Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeSystem.cmake 2025-03-14T05:41:31.4257108Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdC/ 2025-03-14T05:41:31.4257779Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdC/tmp/ 2025-03-14T05:41:31.4258698Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdC/CMakeCCompilerId.c 2025-03-14T05:41:31.4260714Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdC/a.out 2025-03-14T05:41:31.4261732Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeCCompiler.cmake 2025-03-14T05:41:31.4262706Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCXX/ 2025-03-14T05:41:31.4263495Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCXX/tmp/ 2025-03-14T05:41:31.4265737Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-03-14T05:41:31.4267077Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCXX/a.out 2025-03-14T05:41:31.4268481Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeCXXCompiler.cmake 2025-03-14T05:41:31.4270756Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_C.bin 2025-03-14T05:41:31.4272805Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CXX.bin 2025-03-14T05:41:31.4273577Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/ 2025-03-14T05:41:31.4274348Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/ 2025-03-14T05:41:31.4316376Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp4.ii 2025-03-14T05:41:31.4357490Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.cpp 2025-03-14T05:41:31.4359485Z extracting: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.module_id 2025-03-14T05:41:31.4405650Z inflating: 
build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp1.ii 2025-03-14T05:41:31.4406794Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.c 2025-03-14T05:41:31.4407793Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.gpu 2025-03-14T05:41:31.4408812Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.stub.c 2025-03-14T05:41:31.4409840Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.ptx 2025-03-14T05:41:31.4410821Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.sm_52.cubin 2025-03-14T05:41:31.4411790Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin 2025-03-14T05:41:31.4412758Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin.c 2025-03-14T05:41:31.4413727Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.o 2025-03-14T05:41:31.4414635Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.sm_52.cubin 2025-03-14T05:41:31.4415691Z extracting: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.reg.c 2025-03-14T05:41:31.4416744Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin 2025-03-14T05:41:31.4417895Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin.c 2025-03-14T05:41:31.4418746Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.o 2025-03-14T05:41:31.4421796Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/CMakeCUDACompilerId.cu 2025-03-14T05:41:31.4496506Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CompilerIdCUDA/a.out 2025-03-14T05:41:31.4497267Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeCUDACompiler.cmake 2025-03-14T05:41:31.4572286Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CUDA.bin 2025-03-14T05:41:31.4573030Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeScratch/ 2025-03-14T05:41:31.4573627Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeTmp/ 2025-03-14T05:41:31.4574248Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/cmake.check_cache 2025-03-14T05:41:31.4574894Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/ 2025-03-14T05:41:31.4575624Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.ts 2025-03-14T05:41:31.4576454Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.make 2025-03-14T05:41:31.4577244Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/depend.make 2025-03-14T05:41:31.4577983Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/link.txt 2025-03-14T05:41:31.4578749Z inflating: 
build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/cmake_clean.cmake 2025-03-14T05:41:31.4579524Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/build.make 2025-03-14T05:41:31.4580354Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/DependInfo.cmake 2025-03-14T05:41:31.4581120Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/flags.make 2025-03-14T05:41:31.4581874Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/progress.make 2025-03-14T05:41:31.4603570Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o.d 2025-03-14T05:41:31.4667373Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o 2025-03-14T05:41:31.4668472Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-03-14T05:41:31.4669225Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/TargetDirectories.txt 2025-03-14T05:41:31.4670321Z extracting: build/custom_test_artifacts/jit-hook-build/CMakeFiles/progress.marks 2025-03-14T05:41:31.4671932Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile2 2025-03-14T05:41:31.4673020Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile.cmake 2025-03-14T05:41:31.4674364Z inflating: build/custom_test_artifacts/jit-hook-build/detect_cuda_version.cc 2025-03-14T05:41:31.4676433Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeCache.txt 2025-03-14T05:41:31.4677532Z inflating: build/custom_test_artifacts/jit-hook-build/Makefile 2025-03-14T05:41:31.4678630Z inflating: build/custom_test_artifacts/jit-hook-build/cmake_install.cmake 2025-03-14T05:41:31.4727817Z inflating: build/custom_test_artifacts/jit-hook-build/test_jit_hooks 2025-03-14T05:41:31.4728509Z creating: build/custom_test_artifacts/custom-backend-build/ 2025-03-14T05:41:31.4729056Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/ 2025-03-14T05:41:31.4729677Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/pkgRedirects/ 2025-03-14T05:41:31.4737855Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeConfigureLog.yaml 2025-03-14T05:41:31.4738566Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/ 2025-03-14T05:41:31.4739261Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeSystem.cmake 2025-03-14T05:41:31.4740066Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdC/ 2025-03-14T05:41:31.4740795Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdC/tmp/ 2025-03-14T05:41:31.4757728Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdC/CMakeCCompilerId.c 2025-03-14T05:41:31.4758661Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdC/a.out 2025-03-14T05:41:31.4759429Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeCCompiler.cmake 2025-03-14T05:41:31.4760186Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCXX/ 2025-03-14T05:41:31.4760904Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCXX/tmp/ 2025-03-14T05:41:31.4761729Z inflating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-03-14T05:41:31.4762576Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCXX/a.out 2025-03-14T05:41:31.4763355Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeCXXCompiler.cmake 2025-03-14T05:41:31.4764199Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_C.bin 2025-03-14T05:41:31.4765082Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CXX.bin 2025-03-14T05:41:31.4765890Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/ 2025-03-14T05:41:31.4766630Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/ 2025-03-14T05:41:31.4798048Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp4.ii 2025-03-14T05:41:31.4839487Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.cpp 2025-03-14T05:41:31.4840552Z extracting: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.module_id 2025-03-14T05:41:31.4887778Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cpp1.ii 2025-03-14T05:41:31.4888806Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.c 2025-03-14T05:41:31.4889884Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.gpu 2025-03-14T05:41:31.4890975Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.cudafe1.stub.c 2025-03-14T05:41:31.4892016Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.ptx 2025-03-14T05:41:31.4893041Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.sm_52.cubin 2025-03-14T05:41:31.4894067Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin 2025-03-14T05:41:31.4895094Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.fatbin.c 2025-03-14T05:41:31.4896254Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/CMakeCUDACompilerId.o 2025-03-14T05:41:31.4897545Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.sm_52.cubin 2025-03-14T05:41:31.4898469Z extracting: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.reg.c 2025-03-14T05:41:31.4899414Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin 2025-03-14T05:41:31.4900342Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.fatbin.c 2025-03-14T05:41:31.4901255Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/tmp/a_dlink.o 2025-03-14T05:41:31.4904496Z inflating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/CMakeCUDACompilerId.cu 2025-03-14T05:41:31.4979004Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CompilerIdCUDA/a.out 2025-03-14T05:41:31.4979844Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeCUDACompiler.cmake 2025-03-14T05:41:31.5054098Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/3.31.2/CMakeDetermineCompilerABI_CUDA.bin 2025-03-14T05:41:31.5054914Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeScratch/ 2025-03-14T05:41:31.5055581Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeTmp/ 2025-03-14T05:41:31.5056265Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/cmake.check_cache 2025-03-14T05:41:31.5056993Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/ 2025-03-14T05:41:31.5057799Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.ts 2025-03-14T05:41:31.5058716Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.make 2025-03-14T05:41:31.5059599Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/depend.make 2025-03-14T05:41:31.5060462Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/link.txt 2025-03-14T05:41:31.5061300Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/cmake_clean.cmake 2025-03-14T05:41:31.5062327Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/build.make 2025-03-14T05:41:31.5063339Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/DependInfo.cmake 2025-03-14T05:41:31.5064361Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/flags.make 2025-03-14T05:41:31.5065352Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/progress.make 2025-03-14T05:41:31.5068907Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o.d 2025-03-14T05:41:31.5191224Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o 2025-03-14T05:41:31.5192053Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/ 2025-03-14T05:41:31.5192904Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.ts 2025-03-14T05:41:31.5193850Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.make 2025-03-14T05:41:31.5194831Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/depend.make 2025-03-14T05:41:31.5195679Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/link.txt 2025-03-14T05:41:31.5196546Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/cmake_clean.cmake 2025-03-14T05:41:31.5197588Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/build.make 2025-03-14T05:41:31.5198462Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/DependInfo.cmake 
2025-03-14T05:41:31.5199346Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/flags.make 2025-03-14T05:41:31.5200219Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/progress.make 2025-03-14T05:41:31.5221042Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o.d 2025-03-14T05:41:31.5276642Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o 2025-03-14T05:41:31.5277800Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-03-14T05:41:31.5278672Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/TargetDirectories.txt 2025-03-14T05:41:31.5279426Z extracting: build/custom_test_artifacts/custom-backend-build/CMakeFiles/progress.marks 2025-03-14T05:41:31.5280383Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile2 2025-03-14T05:41:31.5281525Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile.cmake 2025-03-14T05:41:31.5282352Z inflating: build/custom_test_artifacts/custom-backend-build/detect_cuda_version.cc 2025-03-14T05:41:31.5285350Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeCache.txt 2025-03-14T05:41:31.5286253Z inflating: build/custom_test_artifacts/custom-backend-build/Makefile 2025-03-14T05:41:31.5287175Z inflating: build/custom_test_artifacts/custom-backend-build/cmake_install.cmake 2025-03-14T05:41:31.5387828Z inflating: build/custom_test_artifacts/custom-backend-build/libcustom_backend.so 2025-03-14T05:41:31.5430575Z inflating: build/custom_test_artifacts/custom-backend-build/test_custom_backend 2025-03-14T05:41:31.5431074Z creating: build/lib/ 2025-03-14T05:41:31.5520648Z inflating: build/lib/libprotobuf-lite.a 2025-03-14T05:41:31.5976591Z inflating: build/lib/libprotobuf.a 2025-03-14T05:41:31.6478746Z inflating: build/lib/libprotoc.a 2025-03-14T05:41:31.6488614Z inflating: build/lib/libpthreadpool.a 2025-03-14T05:41:31.6496467Z inflating: build/lib/libcpuinfo.a 2025-03-14T05:41:31.6504356Z inflating: build/lib/libcpuinfo_internals.a 2025-03-14T05:41:31.6505111Z inflating: build/lib/libclog.a 2025-03-14T05:41:31.6525081Z inflating: build/lib/libpytorch_qnnpack.a 2025-03-14T05:41:31.6527905Z inflating: build/lib/libnnpack_reference_layers.a 2025-03-14T05:41:31.6546021Z inflating: build/lib/libnnpack.a 2025-03-14T05:41:31.6732280Z inflating: build/lib/libmicrokernels-prod.a 2025-03-14T05:41:31.7534196Z inflating: build/lib/libmicrokernels-all.a 2025-03-14T05:41:31.7626277Z inflating: build/lib/libXNNPACK.a 2025-03-14T05:41:31.7694740Z inflating: build/lib/libgtest.a 2025-03-14T05:41:31.7711914Z inflating: build/lib/libgmock.a 2025-03-14T05:41:31.7712360Z inflating: build/lib/libgmock_main.a 2025-03-14T05:41:31.7713422Z inflating: build/lib/libgtest_main.a 2025-03-14T05:41:31.7788733Z inflating: build/lib/libbenchmark.a 2025-03-14T05:41:31.7789269Z inflating: build/lib/libbenchmark_main.a 2025-03-14T05:41:31.7852986Z inflating: build/lib/libasmjit.a 2025-03-14T05:41:31.9107104Z inflating: build/lib/libfbgemm.a 2025-03-14T05:41:31.9115083Z inflating: build/lib/libittnotify.a 2025-03-14T05:41:31.9143297Z inflating: build/lib/libtensorpipe_uv.a 2025-03-14T05:41:31.9715214Z inflating: build/lib/libtensorpipe.a 2025-03-14T05:41:31.9970314Z inflating: build/lib/libtensorpipe_cuda.a 2025-03-14T05:41:32.0097472Z 
inflating: build/lib/libgloo.a 2025-03-14T05:41:32.0139144Z inflating: build/lib/libonnx_proto.a 2025-03-14T05:41:32.0868152Z inflating: build/lib/libonnx.a 2025-03-14T05:41:33.1425796Z inflating: build/lib/libdnnl.a 2025-03-14T05:41:33.1445003Z inflating: build/lib/libfmt.a 2025-03-14T05:41:33.1911420Z inflating: build/lib/libkineto.a 2025-03-14T05:41:33.2023874Z inflating: build/lib/libc10.so 2025-03-14T05:41:33.2083614Z inflating: build/lib/libc10_cuda.so 2025-03-14T05:41:33.2085314Z inflating: build/lib/libcaffe2_nvrtc.so 2025-03-14T05:41:33.2086902Z inflating: build/lib/libtorch_global_deps.so 2025-03-14T05:41:33.2475490Z inflating: build/lib/libgloo_cuda.a 2025-03-14T05:41:35.8011633Z inflating: build/lib/libtorch_cpu.so 2025-03-14T05:41:38.2544666Z inflating: build/lib/libtorch_cuda.so 2025-03-14T05:41:38.2549661Z inflating: build/lib/libunbox_lib.a 2025-03-14T05:41:38.2551189Z inflating: build/lib/libtorch.so 2025-03-14T05:41:39.1326116Z inflating: build/lib/libtorch_cuda_linalg.so 2025-03-14T05:41:39.1395955Z inflating: build/lib/libtorchbind_test.so 2025-03-14T05:41:39.1416534Z inflating: build/lib/libjitbackend_test.so 2025-03-14T05:41:39.1440844Z inflating: build/lib/libbackend_with_compiler.so 2025-03-14T05:41:39.1466017Z inflating: build/lib/libaoti_custom_ops.so 2025-03-14T05:41:39.1469004Z inflating: build/lib/libc10d_cuda_test.so 2025-03-14T05:41:39.1473142Z inflating: build/lib/libshm.so 2025-03-14T05:41:39.3531340Z inflating: build/lib/libtorch_python.so 2025-03-14T05:41:39.3565911Z inflating: build/lib/libnnapi_backend.so 2025-03-14T05:41:39.3566263Z creating: build/bin/ 2025-03-14T05:41:39.4015916Z inflating: build/bin/protoc-3.13.0.0 2025-03-14T05:41:39.4463214Z inflating: build/bin/protoc 2025-03-14T05:41:39.4517337Z inflating: build/bin/c10_CompileTimeFunctionPointer_test 2025-03-14T05:41:39.4572256Z inflating: build/bin/c10_DeviceGuard_test 2025-03-14T05:41:39.4627956Z inflating: build/bin/c10_Device_test 2025-03-14T05:41:39.4690337Z inflating: build/bin/c10_DispatchKeySet_test 2025-03-14T05:41:39.4747053Z inflating: build/bin/c10_Scalar_test 2025-03-14T05:41:39.4800173Z inflating: build/bin/c10_StreamGuard_test 2025-03-14T05:41:39.4854258Z inflating: build/bin/c10_SymInt_test 2025-03-14T05:41:39.4912418Z inflating: build/bin/c10_InlineDeviceGuard_test 2025-03-14T05:41:39.4972144Z inflating: build/bin/c10_InlineStreamGuard_test 2025-03-14T05:41:39.5031908Z inflating: build/bin/c10_SizesAndStrides_test 2025-03-14T05:41:39.5106375Z inflating: build/bin/c10_cow_test 2025-03-14T05:41:39.5158906Z inflating: build/bin/c10_ArrayRef_test 2025-03-14T05:41:39.5215977Z inflating: build/bin/c10_Bitset_test 2025-03-14T05:41:39.5268010Z inflating: build/bin/c10_ConstexprCrc_test 2025-03-14T05:41:39.5321158Z inflating: build/bin/c10_DeadlockDetection_test 2025-03-14T05:41:39.5375240Z inflating: build/bin/c10_Half_test 2025-03-14T05:41:39.5434860Z inflating: build/bin/c10_LeftRight_test 2025-03-14T05:41:39.5494424Z inflating: build/bin/c10_Metaprogramming_test 2025-03-14T05:41:39.5552102Z inflating: build/bin/c10_NetworkFlow_test 2025-03-14T05:41:39.5605020Z inflating: build/bin/c10_Synchronized_test 2025-03-14T05:41:39.5665012Z inflating: build/bin/c10_ThreadLocal_test 2025-03-14T05:41:39.5720126Z inflating: build/bin/c10_TypeIndex_test 2025-03-14T05:41:39.5775027Z inflating: build/bin/c10_TypeList_test 2025-03-14T05:41:39.5827086Z inflating: build/bin/c10_TypeTraits_test 2025-03-14T05:41:39.5882258Z inflating: build/bin/c10_accumulate_test 2025-03-14T05:41:39.5941538Z inflating: 
build/bin/c10_bfloat16_test 2025-03-14T05:41:39.5995595Z inflating: build/bin/c10_bit_cast_test 2025-03-14T05:41:39.6055846Z inflating: build/bin/c10_complex_math_test 2025-03-14T05:41:39.6115747Z inflating: build/bin/c10_complex_test 2025-03-14T05:41:39.6168164Z inflating: build/bin/c10_error_test 2025-03-14T05:41:39.6224545Z inflating: build/bin/c10_exception_test 2025-03-14T05:41:39.6278934Z inflating: build/bin/c10_flags_test 2025-03-14T05:41:39.6332400Z inflating: build/bin/c10_generic_math_test 2025-03-14T05:41:39.6503358Z inflating: build/bin/c10_intrusive_ptr_test 2025-03-14T05:41:39.6558232Z inflating: build/bin/c10_irange_test 2025-03-14T05:41:39.6615266Z inflating: build/bin/c10_lazy_test 2025-03-14T05:41:39.6676467Z inflating: build/bin/c10_logging_test 2025-03-14T05:41:39.6755516Z inflating: build/bin/c10_optional_test 2025-03-14T05:41:39.6821586Z inflating: build/bin/c10_ordered_preserving_dict_test 2025-03-14T05:41:39.6878794Z inflating: build/bin/c10_registry_test 2025-03-14T05:41:39.7038296Z inflating: build/bin/c10_small_vector_test 2025-03-14T05:41:39.7093696Z inflating: build/bin/c10_ssize_test 2025-03-14T05:41:39.7148843Z inflating: build/bin/c10_string_util_test 2025-03-14T05:41:39.7201562Z inflating: build/bin/c10_string_view_test 2025-03-14T05:41:39.7255208Z inflating: build/bin/c10_tempfile_test 2025-03-14T05:41:39.7314482Z inflating: build/bin/c10_typeid_test 2025-03-14T05:41:39.7363111Z inflating: build/bin/c10_intrusive_ptr_benchmark 2025-03-14T05:41:39.7419849Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_1_var_test 2025-03-14T05:41:39.7475542Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_catches_stream 2025-03-14T05:41:39.7531552Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_catches_thread_and_block_and_device 2025-03-14T05:41:39.7587026Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_from_2_processes 2025-03-14T05:41:39.7643125Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_multiple_writes_from_blocks_and_threads 2025-03-14T05:41:39.7699919Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_multiple_writes_from_multiple_blocks 2025-03-14T05:41:39.7755015Z inflating: build/bin/c10_cuda_CUDAAssertionsTest_multiple_writes_from_same_block 2025-03-14T05:41:39.7807704Z inflating: build/bin/c10_cuda_CUDATest 2025-03-14T05:41:39.8218958Z inflating: build/bin/vec_test_all_types_DEFAULT 2025-03-14T05:41:39.8637262Z inflating: build/bin/vec_test_all_types_AVX512 2025-03-14T05:41:39.9070313Z inflating: build/bin/vec_test_all_types_AVX2 2025-03-14T05:41:39.9127242Z inflating: build/bin/test_edge_op_registration 2025-03-14T05:41:39.9205991Z inflating: build/bin/Dict_test 2025-03-14T05:41:39.9261518Z inflating: build/bin/Dimname_test 2025-03-14T05:41:39.9330025Z inflating: build/bin/MaybeOwned_test 2025-03-14T05:41:39.9391173Z inflating: build/bin/NamedTensor_test 2025-03-14T05:41:39.9454042Z inflating: build/bin/apply_utils_test 2025-03-14T05:41:39.9517097Z inflating: build/bin/atest 2025-03-14T05:41:39.9583975Z inflating: build/bin/basic 2025-03-14T05:41:39.9642220Z inflating: build/bin/broadcast_test 2025-03-14T05:41:39.9696739Z inflating: build/bin/cpu_allocator_test 2025-03-14T05:41:39.9758715Z inflating: build/bin/cpu_generator_test 2025-03-14T05:41:39.9816241Z inflating: build/bin/cpu_profiling_allocator_test 2025-03-14T05:41:39.9912300Z inflating: build/bin/cpu_rng_test 2025-03-14T05:41:39.9966087Z inflating: build/bin/dispatch_key_set_test 2025-03-14T05:41:40.0020117Z inflating: build/bin/dlconvertor_test 2025-03-14T05:41:40.0081831Z inflating: 
build/bin/extension_backend_test 2025-03-14T05:41:40.0140438Z inflating: build/bin/half_test 2025-03-14T05:41:40.0242363Z inflating: build/bin/ivalue_test 2025-03-14T05:41:40.0295844Z inflating: build/bin/lazy_tensor_test 2025-03-14T05:41:40.0353733Z inflating: build/bin/math_kernel_test 2025-03-14T05:41:40.0411239Z inflating: build/bin/memory_format_test 2025-03-14T05:41:40.0467650Z inflating: build/bin/memory_overlapping_test 2025-03-14T05:41:40.0524369Z inflating: build/bin/mobile_memory_cleanup 2025-03-14T05:41:40.0584681Z inflating: build/bin/native_test 2025-03-14T05:41:40.0638314Z inflating: build/bin/operator_name_test 2025-03-14T05:41:40.0694913Z inflating: build/bin/operators_test 2025-03-14T05:41:40.0750160Z inflating: build/bin/packedtensoraccessor_test 2025-03-14T05:41:40.0822239Z inflating: build/bin/pow_test 2025-03-14T05:41:40.0883073Z inflating: build/bin/quantized_test 2025-03-14T05:41:40.0936254Z inflating: build/bin/reduce_ops_test 2025-03-14T05:41:40.0991707Z inflating: build/bin/reportMemoryUsage_test 2025-03-14T05:41:40.1053795Z inflating: build/bin/scalar_tensor_test 2025-03-14T05:41:40.1116699Z inflating: build/bin/scalar_test 2025-03-14T05:41:40.1172023Z inflating: build/bin/StorageUtils_test 2025-03-14T05:41:40.1228178Z inflating: build/bin/stride_properties_test 2025-03-14T05:41:40.1312700Z inflating: build/bin/tensor_iterator_test 2025-03-14T05:41:40.1371061Z inflating: build/bin/test_parallel 2025-03-14T05:41:40.1373911Z inflating: build/bin/thread_init_test 2025-03-14T05:41:40.1432899Z inflating: build/bin/type_ptr_test 2025-03-14T05:41:40.1496410Z inflating: build/bin/type_test 2025-03-14T05:41:40.1551781Z inflating: build/bin/undefined_tensor_test 2025-03-14T05:41:40.1553546Z inflating: build/bin/verify_api_visibility 2025-03-14T05:41:40.1628225Z inflating: build/bin/legacy_vmap_test 2025-03-14T05:41:40.1682408Z inflating: build/bin/weakref_test 2025-03-14T05:41:40.1737013Z inflating: build/bin/wrapdim_test 2025-03-14T05:41:40.1792474Z inflating: build/bin/xla_tensor_test 2025-03-14T05:41:40.1857864Z inflating: build/bin/IListRef_test 2025-03-14T05:41:40.1968921Z inflating: build/bin/List_test 2025-03-14T05:41:40.2038825Z inflating: build/bin/KernelFunction_test 2025-03-14T05:41:40.2165136Z inflating: build/bin/kernel_function_legacy_test 2025-03-14T05:41:40.2265048Z inflating: build/bin/kernel_function_test 2025-03-14T05:41:40.2397642Z inflating: build/bin/kernel_lambda_legacy_test 2025-03-14T05:41:40.2504669Z inflating: build/bin/kernel_lambda_test 2025-03-14T05:41:40.2569576Z inflating: build/bin/kernel_stackbased_test 2025-03-14T05:41:40.2670257Z inflating: build/bin/make_boxed_from_unboxed_functor_test 2025-03-14T05:41:40.2725041Z inflating: build/bin/CppSignature_test 2025-03-14T05:41:40.2784839Z inflating: build/bin/backend_fallback_test 2025-03-14T05:41:40.2836438Z inflating: build/bin/op_allowlist_test 2025-03-14T05:41:40.3143584Z inflating: build/bin/op_registration_test 2025-03-14T05:41:40.3211180Z inflating: build/bin/inline_container_test 2025-03-14T05:41:40.3266522Z inflating: build/bin/cuda_allocator_test 2025-03-14T05:41:40.3323314Z inflating: build/bin/cuda_apply_test 2025-03-14T05:41:40.3386944Z inflating: build/bin/cuda_atomic_ops_test 2025-03-14T05:41:40.3446722Z inflating: build/bin/cuda_caching_host_allocator_test 2025-03-14T05:41:40.3521058Z inflating: build/bin/cuda_complex_math_test 2025-03-14T05:41:40.3584083Z inflating: build/bin/cuda_complex_test 2025-03-14T05:41:40.3649039Z inflating: build/bin/cuda_cub_test 
2025-03-14T05:41:40.3702762Z inflating: build/bin/cuda_device_test 2025-03-14T05:41:40.3770996Z inflating: build/bin/cuda_distributions_test 2025-03-14T05:41:40.3825283Z inflating: build/bin/cuda_dlconvertor_test 2025-03-14T05:41:40.3885786Z inflating: build/bin/cuda_generator_test 2025-03-14T05:41:40.3939219Z inflating: build/bin/cuda_half_test 2025-03-14T05:41:40.3994090Z inflating: build/bin/cuda_integer_divider_test 2025-03-14T05:41:40.4047171Z inflating: build/bin/cuda_optional_test 2025-03-14T05:41:40.4102776Z inflating: build/bin/cuda_packedtensoraccessor_test 2025-03-14T05:41:40.4158980Z inflating: build/bin/cuda_reportMemoryUsage_test 2025-03-14T05:41:40.4212282Z inflating: build/bin/cuda_allocatorTraceTracker_test 2025-03-14T05:41:40.4277787Z inflating: build/bin/cuda_stream_test 2025-03-14T05:41:40.4335280Z inflating: build/bin/cuda_vectorized_test 2025-03-14T05:41:40.4388295Z inflating: build/bin/cuda_cudnn_test 2025-03-14T05:41:40.4972860Z inflating: build/bin/test_jit 2025-03-14T05:41:40.4987405Z inflating: build/bin/tutorial_tensorexpr 2025-03-14T05:41:40.5043378Z inflating: build/bin/BackoffTest 2025-03-14T05:41:40.5102233Z inflating: build/bin/FileStoreTest 2025-03-14T05:41:40.5160447Z inflating: build/bin/TCPStoreTest 2025-03-14T05:41:40.5217601Z inflating: build/bin/HashStoreTest 2025-03-14T05:41:40.5286938Z inflating: build/bin/ProcessGroupGlooTest 2025-03-14T05:41:40.6125642Z inflating: build/bin/test_tensorexpr 2025-03-14T05:41:40.6187132Z inflating: build/bin/ProcessGroupGlooAsyncTest 2025-03-14T05:41:40.6255151Z inflating: build/bin/ProcessGroupNCCLTest 2025-03-14T05:41:40.6268587Z inflating: build/bin/ProcessGroupMPITest 2025-03-14T05:41:40.6271790Z inflating: build/bin/example_allreduce 2025-03-14T05:41:40.6336849Z inflating: build/bin/ProcessGroupNCCLErrorsTest 2025-03-14T05:41:40.6396080Z inflating: build/bin/test_dist_autograd 2025-03-14T05:41:40.6468118Z inflating: build/bin/test_cpp_rpc 2025-03-14T05:41:40.6471314Z inflating: build/bin/parallel_benchmark 2025-03-14T05:41:40.6541084Z inflating: build/bin/test_mobile_nnc 2025-03-14T05:41:40.6550958Z inflating: build/bin/aot_model_compiler_test 2025-03-14T05:41:40.7752827Z inflating: build/bin/test_api 2025-03-14T05:41:40.8108713Z inflating: build/bin/test_lazy 2025-03-14T05:41:40.8113292Z inflating: build/bin/torch_shm_manager 2025-03-14T05:41:40.8113774Z creating: .additional_ci_files/ 2025-03-14T05:41:40.8220776Z inflating: .additional_ci_files/test-times.json 2025-03-14T05:41:40.8638240Z inflating: .additional_ci_files/test-class-times.json 2025-03-14T05:41:40.8677434Z ##[group]Run rm artifacts.zip 2025-03-14T05:41:40.8677772Z rm artifacts.zip 2025-03-14T05:41:40.8686488Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:40.8686860Z env: 2025-03-14T05:41:40.8696353Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:40.8696772Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:40.8697143Z ##[endgroup] 2025-03-14T05:41:41.0298301Z ##[group]Run df -H 2025-03-14T05:41:41.0298578Z df -H 2025-03-14T05:41:41.0307245Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:41.0307619Z env: 2025-03-14T05:41:41.0307848Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.0308185Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.0308743Z ##[endgroup] 2025-03-14T05:41:41.0361566Z Filesystem Size Used Avail Use% Mounted on 2025-03-14T05:41:41.0361965Z devtmpfs 4.2M 0 4.2M 0% /dev 2025-03-14T05:41:41.0362319Z tmpfs 34G 0 34G 0% /dev/shm 
2025-03-14T05:41:41.0362664Z tmpfs 14G 553k 14G 1% /run 2025-03-14T05:41:41.0363003Z /dev/nvme0n1p1 161G 66G 96G 41% / 2025-03-14T05:41:41.0363336Z tmpfs 34G 13k 34G 1% /tmp 2025-03-14T05:41:41.0363686Z /dev/nvme0n1p128 11M 1.4M 9.2M 13% /boot/efi 2025-03-14T05:41:41.0364069Z tmpfs 6.7G 0 6.7G 0% /run/user/0 2025-03-14T05:41:41.0398641Z Prepare all required actions 2025-03-14T05:41:41.0399230Z Getting action download info 2025-03-14T05:41:41.1592619Z ##[group]Run ./.github/actions/download-td-artifacts 2025-03-14T05:41:41.1593027Z with: 2025-03-14T05:41:41.1593246Z env: 2025-03-14T05:41:41.1593478Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.1593818Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.1594172Z ##[endgroup] 2025-03-14T05:41:41.1626620Z ##[group]Run seemethere/download-artifact-s3@v4 2025-03-14T05:41:41.1626964Z with: 2025-03-14T05:41:41.1627197Z name: td_results 2025-03-14T05:41:41.1627454Z s3-bucket: gha-artifacts 2025-03-14T05:41:41.1627737Z region: us-east-1 2025-03-14T05:41:41.1627977Z env: 2025-03-14T05:41:41.1628209Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.1628552Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.1629095Z ##[endgroup] 2025-03-14T05:41:41.6165735Z (node:54135) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-03-14T05:41:41.6166204Z 2025-03-14T05:41:41.6166406Z Please migrate your code to use AWS SDK for JavaScript (v3). 2025-03-14T05:41:41.6166925Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-03-14T05:41:41.6167472Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-03-14T05:41:41.7140456Z Found 0 objects with prefix pytorch/pytorch/13849515380/td_results/ 2025-03-14T05:41:41.7146650Z Artifact download has finished successfully 2025-03-14T05:41:41.7499978Z ##[group]Run mkdir -p .additional_ci_files 2025-03-14T05:41:41.7500370Z mkdir -p .additional_ci_files 2025-03-14T05:41:41.7500817Z mv td_results.json .additional_ci_files/td_results.json || true 2025-03-14T05:41:41.7509620Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:41.7510024Z env: 2025-03-14T05:41:41.7510267Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.7510625Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.7510994Z ##[endgroup] 2025-03-14T05:41:41.7565715Z mv: cannot stat 'td_results.json': No such file or directory 2025-03-14T05:41:41.7626536Z ##[group]Run .github/scripts/parse_ref.py 2025-03-14T05:41:41.7626916Z .github/scripts/parse_ref.py 2025-03-14T05:41:41.7634867Z shell: /usr/bin/bash -e {0} 2025-03-14T05:41:41.7635193Z env: 2025-03-14T05:41:41.7635462Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.7635851Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.7636252Z ##[endgroup] 2025-03-14T05:41:41.7974457Z Prepare all required actions 2025-03-14T05:41:41.7975245Z Getting action download info 2025-03-14T05:41:41.9134019Z ##[group]Run ./.github/actions/filter-test-configs 2025-03-14T05:41:41.9134389Z with: 2025-03-14T05:41:41.9134800Z github-token: *** 2025-03-14T05:41:41.9136465Z test-matrix: {"include": [{"config": "inductor_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", 
"shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}]} 2025-03-14T05:41:41.9138403Z job-name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:41.9138922Z env: 2025-03-14T05:41:41.9139364Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.9139725Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.9140090Z ##[endgroup] 2025-03-14T05:41:41.9192973Z ##[group]Run nick-fields/retry@v3.0.0 2025-03-14T05:41:41.9193286Z with: 2025-03-14T05:41:41.9193506Z shell: bash 2025-03-14T05:41:41.9193753Z timeout_minutes: 10 2025-03-14T05:41:41.9194011Z max_attempts: 5 2025-03-14T05:41:41.9194346Z retry_wait_seconds: 30 2025-03-14T05:41:41.9195093Z command: set -eux # PyYAML 6.0 doesn't work with MacOS x86 anymore # This must run on Python-3.7 (AmazonLinux2) so can't use request=3.32.2 python3 -m pip install requests==2.27.1 pyyaml==6.0.1 2025-03-14T05:41:41.9195874Z polling_interval_seconds: 1 2025-03-14T05:41:41.9196174Z warning_on_retry: true 2025-03-14T05:41:41.9196450Z continue_on_error: false 2025-03-14T05:41:41.9196723Z env: 2025-03-14T05:41:41.9196954Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:41.9197301Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:41.9197816Z GITHUB_TOKEN: *** 2025-03-14T05:41:41.9198076Z ##[endgroup] 2025-03-14T05:41:42.0283295Z + python3 -m pip install requests==2.27.1 pyyaml==6.0.1 2025-03-14T05:41:42.2586745Z Defaulting to user installation because normal site-packages is not writeable 2025-03-14T05:41:42.3835910Z Collecting requests==2.27.1 2025-03-14T05:41:42.4072163Z Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB) 2025-03-14T05:41:42.5808982Z Collecting pyyaml==6.0.1 2025-03-14T05:41:42.5845538Z Downloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB) 2025-03-14T05:41:42.6074838Z Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (1.25.10) 2025-03-14T05:41:42.9442466Z Collecting charset-normalizer~=2.0.0 2025-03-14T05:41:42.9477880Z Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB) 2025-03-14T05:41:42.9943697Z Collecting certifi>=2017.4.17 2025-03-14T05:41:42.9978112Z Downloading certifi-2025.1.31-py3-none-any.whl (166 kB) 2025-03-14T05:41:43.0028109Z Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (2.10) 2025-03-14T05:41:43.0850811Z Installing collected packages: charset-normalizer, certifi, requests, pyyaml 2025-03-14T05:41:43.1948033Z Successfully installed certifi-2025.1.31 charset-normalizer-2.0.12 pyyaml-6.0.1 requests-2.27.1 2025-03-14T05:41:44.0064537Z Command completed after 1 attempt(s). 
2025-03-14T05:41:44.0139794Z ##[group]Run set -x 2025-03-14T05:41:44.0140196Z set -x 2025-03-14T05:41:44.0140442Z  2025-03-14T05:41:44.0140837Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-03-14T05:41:44.0141313Z # in runner workspace 2025-03-14T05:41:44.0141716Z python3 "${GITHUB_ACTION_PATH}/../../scripts/parse_ref.py" 2025-03-14T05:41:44.0150576Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:44.0151186Z env: 2025-03-14T05:41:44.0151427Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:44.0151779Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:44.0152140Z ##[endgroup] 2025-03-14T05:41:44.0180573Z + python3 /home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/filter-test-configs/../../scripts/parse_ref.py 2025-03-14T05:41:44.0419025Z ##[group]Run echo "Workflow: ${GITHUB_WORKFLOW}" 2025-03-14T05:41:44.0419449Z echo "Workflow: ${GITHUB_WORKFLOW}" 2025-03-14T05:41:44.0419815Z echo "Job name: ${JOB_NAME}" 2025-03-14T05:41:44.0420135Z  2025-03-14T05:41:44.0420537Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-03-14T05:41:44.0421025Z # in runner workspace 2025-03-14T05:41:44.0421466Z python3 "${GITHUB_ACTION_PATH}/../../scripts/filter_test_configs.py" \ 2025-03-14T05:41:44.0421954Z  --workflow "${GITHUB_WORKFLOW}" \ 2025-03-14T05:41:44.0422325Z  --job-name "${JOB_NAME}" \ 2025-03-14T05:41:44.0424130Z  --test-matrix "{"include": [{"config": "inductor_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}]}" \ 2025-03-14T05:41:44.0425893Z  --selected-test-configs "" \ 2025-03-14T05:41:44.0426256Z  --pr-number "${PR_NUMBER}" \ 2025-03-14T05:41:44.0426588Z  --tag "${TAG}" \ 2025-03-14T05:41:44.0426904Z  --event-name "${EVENT_NAME}" \ 2025-03-14T05:41:44.0427253Z  --schedule "${SCHEDULE}" \ 2025-03-14T05:41:44.0427592Z  --branch "${HEAD_BRANCH}" 2025-03-14T05:41:44.0436545Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:44.0436933Z env: 2025-03-14T05:41:44.0437173Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:44.0437529Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:44.0438083Z GITHUB_TOKEN: *** 2025-03-14T05:41:44.0438763Z JOB_NAME: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:44.0439286Z PR_NUMBER: 2025-03-14T05:41:44.0439532Z TAG: 2025-03-14T05:41:44.0439768Z EVENT_NAME: push 2025-03-14T05:41:44.0440027Z SCHEDULE: 2025-03-14T05:41:44.0440263Z HEAD_BRANCH: 2025-03-14T05:41:44.0440512Z ##[endgroup] 2025-03-14T05:41:44.0472745Z Workflow: inductor 2025-03-14T05:41:44.0473255Z Job name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:44.2224822Z ##[group]Run echo "Filtered matrix:" 2025-03-14T05:41:44.2225209Z echo "Filtered matrix:" 2025-03-14T05:41:44.2226978Z echo "{"include": [{"config": "inductor_huggingface", "shard": 1, "num_shards": 1, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 1, "num_shards": 2, "runner": 
"linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_timm", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 1, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}, {"config": "inductor_torchbench", "shard": 2, "num_shards": 2, "runner": "linux.g5.4xlarge.nvidia.gpu"}]}" 2025-03-14T05:41:44.2228733Z  2025-03-14T05:41:44.2228964Z echo 2025-03-14T05:41:44.2229265Z echo "Is the current job unstable? False" 2025-03-14T05:41:44.2229619Z  2025-03-14T05:41:44.2229851Z echo 2025-03-14T05:41:44.2230137Z echo "Is keep-going label set? False" 2025-03-14T05:41:44.2230474Z  2025-03-14T05:41:44.2230707Z echo 2025-03-14T05:41:44.2231162Z echo "Renabled issues? " 2025-03-14T05:41:44.2239987Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:44.2240362Z env: 2025-03-14T05:41:44.2240601Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:44.2252403Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:44.2252792Z ##[endgroup] 2025-03-14T05:41:44.2279966Z Filtered matrix: 2025-03-14T05:41:44.2281607Z {include: [{config: inductor_huggingface, shard: 1, num_shards: 1, runner: linux.g5.4xlarge.nvidia.gpu}, {config: inductor_timm, shard: 1, num_shards: 2, runner: linux.g5.4xlarge.nvidia.gpu}, {config: inductor_timm, shard: 2, num_shards: 2, runner: linux.g5.4xlarge.nvidia.gpu}, {config: inductor_torchbench, shard: 1, num_shards: 2, runner: linux.g5.4xlarge.nvidia.gpu}, {config: inductor_torchbench, shard: 2, num_shards: 2, runner: linux.g5.4xlarge.nvidia.gpu}]} 2025-03-14T05:41:44.2283225Z 2025-03-14T05:41:44.2283374Z Is the current job unstable? False 2025-03-14T05:41:44.2283594Z 2025-03-14T05:41:44.2283734Z Is keep-going label set? False 2025-03-14T05:41:44.2283933Z 2025-03-14T05:41:44.2284035Z Renabled issues? 
2025-03-14T05:41:44.2327195Z ##[group]Run echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-03-14T05:41:44.2327723Z echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-03-14T05:41:44.2336542Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T05:41:44.2336930Z env: 2025-03-14T05:41:44.2337164Z GIT_DEFAULT_BRANCH: main 2025-03-14T05:41:44.2337512Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:44.2337881Z JOB_TIMEOUT: 240 2025-03-14T05:41:44.2338138Z ##[endgroup] 2025-03-14T05:41:44.2415283Z ##[group]Run set -x 2025-03-14T05:41:44.2415630Z set -x 2025-03-14T05:41:44.2415876Z  2025-03-14T05:41:44.2416149Z if [[ $TEST_CONFIG == 'multigpu' ]]; then 2025-03-14T05:41:44.2416558Z  TEST_COMMAND=.ci/pytorch/multigpu-test.sh 2025-03-14T05:41:44.2416969Z elif [[ $BUILD_ENVIRONMENT == *onnx* ]]; then 2025-03-14T05:41:44.2417349Z  TEST_COMMAND=.ci/onnx/test.sh 2025-03-14T05:41:44.2417667Z else 2025-03-14T05:41:44.2417940Z  TEST_COMMAND=.ci/pytorch/test.sh 2025-03-14T05:41:44.2418272Z fi 2025-03-14T05:41:44.2418509Z  2025-03-14T05:41:44.2418972Z # Leaving 1GB for the runner and other things 2025-03-14T05:41:44.2419556Z TOTAL_AVAILABLE_MEMORY_IN_GB=$(awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo) 2025-03-14T05:41:44.2420393Z # https://docs.docker.com/engine/containers/resource_constraints/#--memory-swap-details, the 3GB swap 2025-03-14T05:41:44.2421089Z # comes from https://github.com/pytorch/test-infra/pull/6058 2025-03-14T05:41:44.2421625Z TOTAL_MEMORY_WITH_SWAP=$(("${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}" + 3)) 2025-03-14T05:41:44.2422049Z  2025-03-14T05:41:44.2422340Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-03-14T05:41:44.2422704Z  SHM_OPTS= 2025-03-14T05:41:44.2422982Z  JENKINS_USER= 2025-03-14T05:41:44.2423342Z  # ensure that docker container cleanly exits in 12 hours 2025-03-14T05:41:44.2423831Z  # if for some reason cleanup action doesn't stop container 2025-03-14T05:41:44.2424302Z  # when job is cancelled 2025-03-14T05:41:44.2424638Z  DOCKER_SHELL_CMD="sleep 12h" 2025-03-14T05:41:44.2424949Z  2025-03-14T05:41:44.2425330Z  # since some steps are skipped on s390x, if they are necessary, run them here 2025-03-14T05:41:44.2425873Z  env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-03-14T05:41:44.2426321Z  env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-03-14T05:41:44.2426676Z else 2025-03-14T05:41:44.2426945Z  SHM_OPTS="--shm-size=${SHM_SIZE}" 2025-03-14T05:41:44.2427296Z  JENKINS_USER="--user jenkins" 2025-03-14T05:41:44.2427627Z  DOCKER_SHELL_CMD= 2025-03-14T05:41:44.2427901Z fi 2025-03-14T05:41:44.2428129Z  2025-03-14T05:41:44.2428479Z # detached container should get cleaned up by teardown_ec2_linux 2025-03-14T05:41:44.2429001Z # TODO: Stop building test binaries as part of the build phase 2025-03-14T05:41:44.2429598Z # Used for GPU_FLAG, SHM_OPTS, JENKINS_USER and DOCKER_SHELL_CMD since that doesn't play nice 2025-03-14T05:41:44.2430127Z # shellcheck disable=SC2086,SC2090 2025-03-14T05:41:44.2430471Z container_name=$(docker run \ 2025-03-14T05:41:44.2430798Z  ${GPU_FLAG:-} \ 2025-03-14T05:41:44.2431116Z  ${SCCACHE_SERVER_PORT_DOCKER_FLAG:-} \ 2025-03-14T05:41:44.2431467Z  -e BUILD_ENVIRONMENT \ 2025-03-14T05:41:44.2431774Z  -e PR_NUMBER \ 2025-03-14T05:41:44.2432062Z  -e GITHUB_ACTIONS \ 2025-03-14T05:41:44.2432356Z  -e GITHUB_REPOSITORY \ 2025-03-14T05:41:44.2432669Z  -e GITHUB_WORKFLOW \ 2025-03-14T05:41:44.2432971Z  -e GITHUB_JOB \ 2025-03-14T05:41:44.2433258Z  -e 
GITHUB_RUN_ID \ 2025-03-14T05:41:44.2433553Z  -e GITHUB_RUN_NUMBER \ 2025-03-14T05:41:44.2433868Z  -e GITHUB_RUN_ATTEMPT \ 2025-03-14T05:41:44.2434176Z  -e JOB_ID \ 2025-03-14T05:41:44.2434557Z  -e JOB_NAME \ 2025-03-14T05:41:44.2434858Z  -e BASE_SHA \ 2025-03-14T05:41:44.2435136Z  -e BRANCH \ 2025-03-14T05:41:44.2435408Z  -e SHA1 \ 2025-03-14T05:41:44.2435684Z  -e AWS_DEFAULT_REGION \ 2025-03-14T05:41:44.2436121Z  -e IN_WHEEL_TEST \ 2025-03-14T05:41:44.2436423Z  -e SHARD_NUMBER \ 2025-03-14T05:41:44.2436721Z  -e TEST_CONFIG \ 2025-03-14T05:41:44.2437010Z  -e NUM_TEST_SHARDS \ 2025-03-14T05:41:44.2437318Z  -e REENABLED_ISSUES \ 2025-03-14T05:41:44.2437631Z  -e CONTINUE_THROUGH_ERROR \ 2025-03-14T05:41:44.2437962Z  -e VERBOSE_TEST_LOGS \ 2025-03-14T05:41:44.2438276Z  -e TEST_SHOWLOCALS \ 2025-03-14T05:41:44.2438592Z  -e NO_TEST_TIMEOUT \ 2025-03-14T05:41:44.2438885Z  -e NO_TD \ 2025-03-14T05:41:44.2439158Z  -e TD_DISTRIBUTED \ 2025-03-14T05:41:44.2439456Z  -e PR_LABELS \ 2025-03-14T05:41:44.2439848Z  -e MAX_JOBS="$(nproc --ignore=2)" \ 2025-03-14T05:41:44.2440198Z  -e SCCACHE_BUCKET \ 2025-03-14T05:41:44.2440502Z  -e SCCACHE_REGION \ 2025-03-14T05:41:44.2440796Z  -e XLA_CUDA \ 2025-03-14T05:41:44.2441101Z  -e XLA_CLANG_CACHE_S3_BUCKET_NAME \ 2025-03-14T05:41:44.2441487Z  -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK \ 2025-03-14T05:41:44.2441858Z  -e PYTORCH_TEST_RERUN_DISABLED_TESTS \ 2025-03-14T05:41:44.2442231Z  -e SKIP_SCCACHE_INITIALIZATION=1 \ 2025-03-14T05:41:44.2442585Z  -e HUGGING_FACE_HUB_TOKEN \ 2025-03-14T05:41:44.2442931Z  -e SCRIBE_GRAPHQL_ACCESS_TOKEN \ 2025-03-14T05:41:44.2443263Z  -e DASHBOARD_TAG \ 2025-03-14T05:41:44.2443592Z  -e IS_A100_RUNNER \ 2025-03-14T05:41:44.2443925Z  -e ARTIFACTS_FILE_SUFFIX \ 2025-03-14T05:41:44.2444291Z  --memory="${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}g" \ 2025-03-14T05:41:44.2444723Z  --memory-swap="${TOTAL_MEMORY_WITH_SWAP}g" \ 2025-03-14T05:41:44.2445144Z  --env-file="/tmp/github_env_${GITHUB_RUN_ID}" \ 2025-03-14T05:41:44.2445548Z  --security-opt seccomp=unconfined \ 2025-03-14T05:41:44.2445908Z  --cap-add=SYS_PTRACE \ 2025-03-14T05:41:44.2446230Z  --ipc=host \ 2025-03-14T05:41:44.2446511Z  ${SHM_OPTS} \ 2025-03-14T05:41:44.2446781Z  --tty \ 2025-03-14T05:41:44.2447034Z  --detach \ 2025-03-14T05:41:44.2447326Z  --name="${container_name}" \ 2025-03-14T05:41:44.2447648Z  ${JENKINS_USER} \ 2025-03-14T05:41:44.2448005Z  -v "${GITHUB_WORKSPACE}:/var/lib/jenkins/workspace" \ 2025-03-14T05:41:44.2448412Z  -w /var/lib/jenkins/workspace \ 2025-03-14T05:41:44.2448745Z  "${DOCKER_IMAGE}" \ 2025-03-14T05:41:44.2449040Z  ${DOCKER_SHELL_CMD} 2025-03-14T05:41:44.2449321Z ) 2025-03-14T05:41:44.2449632Z # Propagate download.pytorch.org IP to container 2025-03-14T05:41:44.2450298Z grep download.pytorch.org /etc/hosts | docker exec -i "${container_name}" sudo bash -c "/bin/cat >> /etc/hosts" 2025-03-14T05:41:44.2450993Z echo "DOCKER_CONTAINER_ID=${container_name}" >> "${GITHUB_ENV}" 2025-03-14T05:41:44.2451414Z  2025-03-14T05:41:44.2451709Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-03-14T05:41:44.2452282Z  docker exec -t "${container_name}" sh -c "python3 -m pip install -r .ci/docker/requirements-ci.txt" 2025-03-14T05:41:44.2452800Z fi 2025-03-14T05:41:44.2453026Z  2025-03-14T05:41:44.2453523Z docker exec -t "${container_name}" sh -c "python3 -m pip install $(echo dist/*.whl)[opt-einsum] && ${TEST_COMMAND}" 2025-03-14T05:41:44.2461806Z shell: /usr/bin/bash -e {0} 2025-03-14T05:41:44.2462102Z env: 2025-03-14T05:41:44.2462352Z GIT_DEFAULT_BRANCH: main 
2025-03-14T05:41:44.2462699Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:41:44.2463192Z BUILD_ENVIRONMENT: linux-focal-cuda12.6-py3.10-gcc9-sm86 2025-03-14T05:41:44.2463592Z PR_NUMBER: 2025-03-14T05:41:44.2463879Z GITHUB_REPOSITORY: pytorch/pytorch 2025-03-14T05:41:44.2464221Z GITHUB_WORKFLOW: inductor 2025-03-14T05:41:44.2464550Z GITHUB_JOB: test 2025-03-14T05:41:44.2464937Z GITHUB_RUN_ID: 13849515380 2025-03-14T05:41:44.2465241Z GITHUB_RUN_NUMBER: 122697 2025-03-14T05:41:44.2465542Z GITHUB_RUN_ATTEMPT: 1 2025-03-14T05:41:44.2465811Z JOB_ID: 38756916179 2025-03-14T05:41:44.2466303Z JOB_NAME: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:41:44.2466824Z BRANCH: main 2025-03-14T05:41:44.2467110Z SHA1: aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:41:44.2467504Z BASE_SHA: aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:41:44.2468240Z TEST_CONFIG: inductor_huggingface 2025-03-14T05:41:44.2468557Z SHARD_NUMBER: 1 2025-03-14T05:41:44.2468806Z NUM_TEST_SHARDS: 1 2025-03-14T05:41:44.2469202Z REENABLED_ISSUES: 2025-03-14T05:41:44.2469486Z CONTINUE_THROUGH_ERROR: False 2025-03-14T05:41:44.2469776Z VERBOSE_TEST_LOGS: False 2025-03-14T05:41:44.2470058Z TEST_SHOWLOCALS: False 2025-03-14T05:41:44.2470336Z NO_TEST_TIMEOUT: False 2025-03-14T05:41:44.2470597Z NO_TD: False 2025-03-14T05:41:44.2470855Z TD_DISTRIBUTED: False 2025-03-14T05:41:44.2471180Z SCCACHE_BUCKET: ossci-compiler-cache-circleci-v2 2025-03-14T05:41:44.2471556Z SCCACHE_REGION: us-east-1 2025-03-14T05:41:44.2471829Z SHM_SIZE: 2g 2025-03-14T05:41:44.2472622Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:41:44.2473488Z XLA_CUDA: 2025-03-14T05:41:44.2473882Z XLA_CLANG_CACHE_S3_BUCKET_NAME: ossci-compiler-clang-cache-circleci-xla 2025-03-14T05:41:44.2474416Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK: 0 2025-03-14T05:41:44.2474756Z PYTORCH_TEST_RERUN_DISABLED_TESTS: 0 2025-03-14T05:41:44.2475084Z DASHBOARD_TAG: 2025-03-14T05:41:44.2475515Z HUGGING_FACE_HUB_TOKEN: *** 2025-03-14T05:41:44.2475946Z SCRIBE_GRAPHQL_ACCESS_TOKEN: *** 2025-03-14T05:41:44.2476266Z IS_A100_RUNNER: 0 2025-03-14T05:41:44.2476751Z ARTIFACTS_FILE_SUFFIX: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T05:41:44.2477285Z ##[endgroup] 2025-03-14T05:41:44.2504236Z + [[ inductor_huggingface == \m\u\l\t\i\g\p\u ]] 2025-03-14T05:41:44.2504678Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *onnx* ]] 2025-03-14T05:41:44.2505067Z + TEST_COMMAND=.ci/pytorch/test.sh 2025-03-14T05:41:44.2508070Z ++ awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo 2025-03-14T05:41:44.2531778Z + TOTAL_AVAILABLE_MEMORY_IN_GB='61.094 ' 2025-03-14T05:41:44.2532127Z + TOTAL_MEMORY_WITH_SWAP=64 2025-03-14T05:41:44.2532496Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *\s\3\9\0\x* ]] 2025-03-14T05:41:44.2532884Z + SHM_OPTS=--shm-size=2g 2025-03-14T05:41:44.2533181Z + JENKINS_USER='--user jenkins' 2025-03-14T05:41:44.2533485Z + DOCKER_SHELL_CMD= 2025-03-14T05:41:44.2541876Z +++ nproc --ignore=2 2025-03-14T05:41:44.2576033Z ++ docker run --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all -e BUILD_ENVIRONMENT -e PR_NUMBER -e GITHUB_ACTIONS -e GITHUB_REPOSITORY -e GITHUB_WORKFLOW -e GITHUB_JOB -e GITHUB_RUN_ID -e GITHUB_RUN_NUMBER -e GITHUB_RUN_ATTEMPT -e JOB_ID -e JOB_NAME -e BASE_SHA -e BRANCH -e SHA1 -e AWS_DEFAULT_REGION -e IN_WHEEL_TEST 
-e SHARD_NUMBER -e TEST_CONFIG -e NUM_TEST_SHARDS -e REENABLED_ISSUES -e CONTINUE_THROUGH_ERROR -e VERBOSE_TEST_LOGS -e TEST_SHOWLOCALS -e NO_TEST_TIMEOUT -e NO_TD -e TD_DISTRIBUTED -e PR_LABELS -e MAX_JOBS=14 -e SCCACHE_BUCKET -e SCCACHE_REGION -e XLA_CUDA -e XLA_CLANG_CACHE_S3_BUCKET_NAME -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK -e PYTORCH_TEST_RERUN_DISABLED_TESTS -e SKIP_SCCACHE_INITIALIZATION=1 -e HUGGING_FACE_HUB_TOKEN -e SCRIBE_GRAPHQL_ACCESS_TOKEN -e DASHBOARD_TAG -e IS_A100_RUNNER -e ARTIFACTS_FILE_SUFFIX --memory=61g --memory-swap=64g --env-file=/tmp/github_env_13849515380 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --ipc=host --shm-size=2g --tty --detach --name= --user jenkins -v /home/ec2-user/actions-runner/_work/pytorch/pytorch:/var/lib/jenkins/workspace -w /var/lib/jenkins/workspace 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T05:41:56.7446287Z + container_name=fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T05:41:56.7448481Z + grep download.pytorch.org /etc/hosts 2025-03-14T05:41:56.7451089Z + docker exec -i fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 sudo bash -c '/bin/cat >> /etc/hosts' 2025-03-14T05:41:56.9048543Z + echo DOCKER_CONTAINER_ID=fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T05:41:56.9049191Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *\s\3\9\0\x* ]] 2025-03-14T05:41:56.9052966Z ++ echo dist/torch-2.8.0a0+gitaed0b7a-cp310-cp310-linux_x86_64.whl 2025-03-14T05:41:56.9056107Z + docker exec -t fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 sh -c 'python3 -m pip install dist/torch-2.8.0a0+gitaed0b7a-cp310-cp310-linux_x86_64.whl[opt-einsum] && .ci/pytorch/test.sh' 2025-03-14T05:41:57.3639464Z Processing ./dist/torch-2.8.0a0+gitaed0b7a-cp310-cp310-linux_x86_64.whl (from torch==2.8.0a0+gitaed0b7a) 2025-03-14T05:41:57.7136870Z Requirement already satisfied: filelock in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (3.16.1) 2025-03-14T05:41:57.7139618Z Requirement already satisfied: typing-extensions>=4.10.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (4.12.2) 2025-03-14T05:41:57.7604551Z Collecting sympy>=1.13.3 (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) 2025-03-14T05:41:57.7616250Z Using cached sympy-1.13.3-py3-none-any.whl.metadata (12 kB) 2025-03-14T05:41:57.7634040Z Requirement already satisfied: networkx in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (2.8.8) 2025-03-14T05:41:57.7637815Z Requirement already satisfied: jinja2 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (3.1.6) 2025-03-14T05:41:57.7641234Z Requirement already satisfied: fsspec in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (2024.10.0) 2025-03-14T05:41:57.7656960Z Requirement already satisfied: opt-einsum>=3.3 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (3.3.0) 2025-03-14T05:41:57.7676209Z Requirement already satisfied: numpy>=1.7 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from 
opt-einsum>=3.3->torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (1.22.4) 2025-03-14T05:41:57.7684165Z Requirement already satisfied: mpmath<1.4,>=1.1.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from sympy>=1.13.3->torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (1.3.0) 2025-03-14T05:41:57.8072473Z Requirement already satisfied: MarkupSafe>=2.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from jinja2->torch==2.8.0a0+gitaed0b7a->torch==2.8.0a0+gitaed0b7a) (3.0.2) 2025-03-14T05:41:57.8153322Z Using cached sympy-1.13.3-py3-none-any.whl (6.2 MB) 2025-03-14T05:41:58.4429132Z Installing collected packages: sympy, torch 2025-03-14T05:41:58.4429649Z Attempting uninstall: sympy 2025-03-14T05:41:58.4438405Z Found existing installation: sympy 1.13.1 2025-03-14T05:41:58.6414811Z Uninstalling sympy-1.13.1: 2025-03-14T05:41:59.8493056Z Successfully uninstalled sympy-1.13.1 2025-03-14T05:42:14.4818947Z ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 2025-03-14T05:42:14.4820915Z timm 1.0.14 requires torchvision, which is not installed. 2025-03-14T05:42:14.4822042Z Successfully installed sympy-1.13.3 torch-2.8.0a0+gitaed0b7a 2025-03-14T05:42:14.5840759Z + export TERM=vt100 2025-03-14T05:42:14.5841179Z + TERM=vt100 2025-03-14T05:42:14.5844964Z ++ dirname .ci/pytorch/test.sh 2025-03-14T05:42:14.5855376Z + source .ci/pytorch/common.sh 2025-03-14T05:42:14.5859474Z +++ dirname .ci/pytorch/common.sh 2025-03-14T05:42:14.5868448Z ++ source .ci/pytorch/common_utils.sh 2025-03-14T05:42:14.5870242Z +++ declare -f -t trap_add 2025-03-14T05:42:14.5876615Z ++ set -ex -o pipefail 2025-03-14T05:42:14.5877468Z ++ [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *rocm* ]] 2025-03-14T05:42:14.5878102Z ++ BUILD_TEST_LIBTORCH=0 2025-03-14T05:42:14.5878642Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 != *rocm* ]] 2025-03-14T05:42:14.5879332Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 != *s390x* ]] 2025-03-14T05:42:14.5879941Z + [[ -d /var/lib/jenkins/workspace ]] 2025-03-14T05:42:14.5881182Z ++ stat -c %u /var/lib/jenkins/workspace 2025-03-14T05:42:14.5903045Z + WORKSPACE_ORIGINAL_OWNER_ID=1000 2025-03-14T05:42:14.5903625Z + trap_add cleanup_workspace EXIT 2025-03-14T05:42:14.5904155Z + trap_add_cmd=cleanup_workspace 2025-03-14T05:42:14.5904625Z + shift 2025-03-14T05:42:14.5905005Z + for trap_add_name in "$@" 2025-03-14T05:42:14.5911226Z +++ trap -p EXIT 2025-03-14T05:42:14.5914719Z ++ eval 'extract_trap_cmd ' 2025-03-14T05:42:14.5915199Z +++ extract_trap_cmd 2025-03-14T05:42:14.5915613Z +++ printf '%s\n' '' 2025-03-14T05:42:14.5916047Z ++ printf '%s\n' cleanup_workspace 2025-03-14T05:42:14.5917379Z + trap -- ' 2025-03-14T05:42:14.5917768Z cleanup_workspace' EXIT 2025-03-14T05:42:14.5918283Z + sudo chown -R jenkins /var/lib/jenkins/workspace 2025-03-14T05:42:15.3803889Z + git config --global --add safe.directory /var/lib/jenkins/workspace 2025-03-14T05:42:15.3827553Z + echo 'Environment variables:' 2025-03-14T05:42:15.3827880Z Environment variables: 2025-03-14T05:42:15.3828263Z + env 2025-03-14T05:42:15.3838491Z INSTALLED_DB=yes 2025-03-14T05:42:15.3839597Z NV_LIBCUBLAS_VERSION=12.6.4.1-1 2025-03-14T05:42:15.3840193Z NVIDIA_VISIBLE_DEVICES=all 2025-03-14T05:42:15.3840616Z NV_NVML_DEV_VERSION=12.6.77-1 2025-03-14T05:42:15.3841262Z GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-03-14T05:42:15.3841916Z CONTINUE_THROUGH_ERROR=False 
2025-03-14T05:42:15.3842318Z NV_LIBNCCL_DEV_PACKAGE=libnccl-dev=2.23.4-1+cuda12.6 2025-03-14T05:42:15.3842706Z NV_LIBNCCL_DEV_PACKAGE_VERSION=2.23.4-1 2025-03-14T05:42:15.3843112Z BUILD_ENVIRONMENT=linux-focal-cuda12.6-py3.10-gcc9-sm86 2025-03-14T05:42:15.3843495Z HOSTNAME=fb3818aafd9c 2025-03-14T05:42:15.3844067Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.3844676Z GITHUB_ACTION=__self 2025-03-14T05:42:15.3844965Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-03-14T05:42:15.3849276Z NVIDIA_REQUIRE_CUDA=cuda>=12.6 brand=unknown,driver>=470,driver<471 brand=grid,driver>=470,driver<471 brand=tesla,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=vapps,driver>=470,driver<471 brand=vpc,driver>=470,driver<471 brand=vcs,driver>=470,driver<471 brand=vws,driver>=470,driver<471 brand=cloudgaming,driver>=470,driver<471 brand=unknown,driver>=535,driver<536 brand=grid,driver>=535,driver<536 brand=tesla,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=vapps,driver>=535,driver<536 brand=vpc,driver>=535,driver<536 brand=vcs,driver>=535,driver<536 brand=vws,driver>=535,driver<536 brand=cloudgaming,driver>=535,driver<536 brand=unknown,driver>=550,driver<551 brand=grid,driver>=550,driver<551 brand=tesla,driver>=550,driver<551 brand=nvidia,driver>=550,driver<551 brand=quadro,driver>=550,driver<551 brand=quadrortx,driver>=550,driver<551 brand=nvidiartx,driver>=550,driver<551 brand=vapps,driver>=550,driver<551 brand=vpc,driver>=550,driver<551 brand=vcs,driver>=550,driver<551 brand=vws,driver>=550,driver<551 brand=cloudgaming,driver>=550,driver<551 2025-03-14T05:42:15.3854100Z NV_LIBCUBLAS_DEV_PACKAGE=libcublas-dev-12-6=12.6.4.1-1 2025-03-14T05:42:15.3854485Z NV_NVTX_VERSION=12.6.77-1 2025-03-14T05:42:15.3854771Z GITHUB_RUN_NUMBER=122697 2025-03-14T05:42:15.3855066Z TEST_CONFIG=inductor_huggingface 2025-03-14T05:42:15.3855385Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-03-14T05:42:15.3855731Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-03-14T05:42:15.3856051Z IS_A100_RUNNER=0 2025-03-14T05:42:15.3856317Z NV_CUDA_CUDART_DEV_VERSION=12.6.77-1 2025-03-14T05:42:15.3856641Z NV_LIBCUSPARSE_VERSION=12.5.4.2-1 2025-03-14T05:42:15.3857182Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-03-14T05:42:15.3857489Z NV_LIBNPP_VERSION=12.3.1.54-1 2025-03-14T05:42:15.3857807Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-03-14T05:42:15.3858339Z CMAKE_CUDA_COMPILER_LAUNCHER=/opt/cache/bin/sccache 2025-03-14T05:42:15.3858701Z GITHUB_REF_TYPE=branch 2025-03-14T05:42:15.3858978Z TORCH_CUDA_ARCH_LIST=Maxwell 2025-03-14T05:42:15.3859270Z NCCL_VERSION=2.23.4-1 2025-03-14T05:42:15.3859580Z BASE_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.3859917Z XLA_CUDA= 2025-03-14T05:42:15.3860273Z HUGGING_FACE_HUB_TOKEN=*** 2025-03-14T05:42:15.3860856Z *** 2025-03-14T05:42:15.3861103Z CARGO_NET_GIT_FETCH_WITH_CLI=true 2025-03-14T05:42:15.3861425Z GITHUB_REPOSITORY_ID=65600975 2025-03-14T05:42:15.3861719Z GITHUB_ACTIONS=true 2025-03-14T05:42:15.3861989Z NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:42:15.3862340Z NV_NVPROF_DEV_PACKAGE=cuda-nvprof-12-6=12.6.80-1 2025-03-14T05:42:15.3862714Z NV_LIBNPP_PACKAGE=libnpp-12-6=12.3.1.54-1 2025-03-14T05:42:15.3863079Z 
SHA1=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.3863444Z NV_LIBNCCL_DEV_PACKAGE_NAME=libnccl-dev 2025-03-14T05:42:15.3863811Z GITHUB_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.3864336Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor.yml@refs/heads/main 2025-03-14T05:42:15.3864808Z UCC_HOME=/usr 2025-03-14T05:42:15.3865062Z NV_LIBCUBLAS_DEV_VERSION=12.6.4.1-1 2025-03-14T05:42:15.3865373Z VERBOSE_TEST_LOGS=False 2025-03-14T05:42:15.3865653Z NVIDIA_PRODUCT_NAME=CUDA 2025-03-14T05:42:15.3865977Z NV_LIBCUBLAS_DEV_PACKAGE_NAME=libcublas-dev-12-6 2025-03-14T05:42:15.3866335Z GITHUB_REF=refs/heads/main 2025-03-14T05:42:15.3866628Z NV_CUDA_CUDART_VERSION=12.6.77-1 2025-03-14T05:42:15.3866927Z SHARD_NUMBER=1 2025-03-14T05:42:15.3867182Z GITHUB_REF_PROTECTED=true 2025-03-14T05:42:15.3867460Z HOME=/var/lib/jenkins 2025-03-14T05:42:15.3867753Z GITHUB_API_URL=https://api.github.com 2025-03-14T05:42:15.3868484Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-03-14T05:42:15.3868854Z UCX_COMMIT=7bb2722ff2187a0cad557ae4a6afa090569f83fb 2025-03-14T05:42:15.3869202Z CUDA_VERSION=12.6.3 2025-03-14T05:42:15.3869496Z NV_LIBCUBLAS_PACKAGE=libcublas-12-6=12.6.4.1-1 2025-03-14T05:42:15.3869840Z NUM_TEST_SHARDS=1 2025-03-14T05:42:15.3870085Z UCX_HOME=/usr 2025-03-14T05:42:15.3870442Z NV_CUDA_NSIGHT_COMPUTE_DEV_PACKAGE=cuda-nsight-compute-12-6=12.6.3-1 2025-03-14T05:42:15.3871188Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.3872019Z JOB_NAME=cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:42:15.3872829Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.3873603Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-03-14T05:42:15.3874094Z GITHUB_EVENT_NAME=push 2025-03-14T05:42:15.3874447Z DASHBOARD_TAG= 2025-03-14T05:42:15.3874698Z GITHUB_RUN_ID=13849515380 2025-03-14T05:42:15.3875020Z NV_LIBNPP_DEV_PACKAGE=libnpp-dev-12-6=12.3.1.54-1 2025-03-14T05:42:15.3875403Z NV_LIBCUBLAS_PACKAGE_NAME=libcublas-12-6 2025-03-14T05:42:15.3876084Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.3876750Z GITHUB_ACTOR=pytorchmergebot 2025-03-14T05:42:15.3877064Z NV_LIBNPP_DEV_VERSION=12.3.1.54-1 2025-03-14T05:42:15.3877600Z PR_NUMBER= 2025-03-14T05:42:15.3877848Z GITHUB_RUN_ATTEMPT=1 2025-03-14T05:42:15.3878124Z ANACONDA_PYTHON_VERSION=3.10 2025-03-14T05:42:15.3878472Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-03-14T05:42:15.3878828Z TERM=vt100 2025-03-14T05:42:15.3879089Z NV_LIBCUSPARSE_DEV_VERSION=12.5.4.2-1 2025-03-14T05:42:15.3879411Z INSTALLED_VISION=yes 2025-03-14T05:42:15.3879670Z BRANCH=main 2025-03-14T05:42:15.3879916Z SCCACHE_REGION=us-east-1 2025-03-14T05:42:15.3880208Z OPENSSL_ROOT_DIR=/opt/openssl 2025-03-14T05:42:15.3880528Z LIBRARY_PATH=/usr/local/cuda/lib64/stubs 2025-03-14T05:42:15.3880859Z CUDA_PATH=/usr/local/cuda 2025-03-14T05:42:15.3881384Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-03-14T05:42:15.3882130Z GITHUB_SERVER_URL=https://github.com 2025-03-14T05:42:15.3882507Z UCC_COMMIT=20eae37090a4ce1b32bcce6144ccad0b49943e0b 2025-03-14T05:42:15.3882914Z REENABLED_ISSUES= 2025-03-14T05:42:15.3883155Z SHLVL=1 
2025-03-14T05:42:15.3883382Z MAX_JOBS=14 2025-03-14T05:42:15.3883615Z NV_CUDA_LIB_VERSION=12.6.3-1 2025-03-14T05:42:15.3883899Z NVARCH=x86_64 2025-03-14T05:42:15.3884144Z GITHUB_ACTOR_ID=97764156 2025-03-14T05:42:15.3884497Z GITHUB_WORKFLOW_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.3884909Z GITHUB_REF_NAME=main 2025-03-14T05:42:15.3885307Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-03-14T05:42:15.3885735Z GITHUB_JOB=test 2025-03-14T05:42:15.3886018Z NV_LIBNCCL_PACKAGE=libnccl2=2.23.4-1+cuda12.6 2025-03-14T05:42:15.3886440Z LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64 2025-03-14T05:42:15.3886830Z NO_TEST_TIMEOUT=False 2025-03-14T05:42:15.3887104Z TD_DISTRIBUTED=False 2025-03-14T05:42:15.3887389Z NV_CUDA_NSIGHT_COMPUTE_VERSION=12.6.3-1 2025-03-14T05:42:15.3887728Z GITHUB_REPOSITORY=pytorch/pytorch 2025-03-14T05:42:15.3888043Z NV_NVPROF_VERSION=12.6.80-1 2025-03-14T05:42:15.3888330Z GITHUB_RETENTION_DAYS=90 2025-03-14T05:42:15.3888610Z OPENSSL_DIR=/opt/openssl 2025-03-14T05:42:15.3888895Z GITHUB_ACTION_REPOSITORY= 2025-03-14T05:42:15.3889633Z PATH=/opt/cache/bin:/opt/conda/envs/py_3.10/bin:/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-03-14T05:42:15.3890385Z GITHUB_BASE_REF= 2025-03-14T05:42:15.3890855Z ARTIFACTS_FILE_SUFFIX=test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T05:42:15.3891404Z NV_LIBNCCL_PACKAGE_NAME=libnccl2 2025-03-14T05:42:15.3891693Z CI=true 2025-03-14T05:42:15.3891941Z NV_LIBNCCL_PACKAGE_VERSION=2.23.4-1 2025-03-14T05:42:15.3892265Z GITHUB_REPOSITORY_OWNER=pytorch 2025-03-14T05:42:15.3892561Z JOB_ID=38756916179 2025-03-14T05:42:15.3892823Z INSTALLED_PROTOBUF=yes 2025-03-14T05:42:15.3893093Z GITHUB_HEAD_REF= 2025-03-14T05:42:15.3893338Z GITHUB_ACTION_REF= 2025-03-14T05:42:15.3893650Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-03-14T05:42:15.3894014Z TEST_SHOWLOCALS=False 2025-03-14T05:42:15.3894295Z GITHUB_WORKFLOW=inductor 2025-03-14T05:42:15.3894590Z DEBIAN_FRONTEND=noninteractive 2025-03-14T05:42:15.3895204Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.3895815Z NO_TD=False 2025-03-14T05:42:15.3896067Z SKIP_SCCACHE_INITIALIZATION=1 2025-03-14T05:42:15.3896363Z _=/usr/bin/env 2025-03-14T05:42:15.3896689Z ++ python -c 'import site; print(site.getsitepackages()[0])' 2025-03-14T05:42:15.4018514Z + TORCH_INSTALL_DIR=/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch 2025-03-14T05:42:15.4019202Z + TORCH_BIN_DIR=/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/bin 2025-03-14T05:42:15.4019781Z + TORCH_LIB_DIR=/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/lib 2025-03-14T05:42:15.4020465Z + TORCH_TEST_DIR=/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/test 2025-03-14T05:42:15.4020907Z + BUILD_DIR=build 2025-03-14T05:42:15.4021170Z + BUILD_RENAMED_DIR=build_renamed 2025-03-14T05:42:15.4021644Z + BUILD_BIN_DIR=build/bin 2025-03-14T05:42:15.4021923Z + SHARD_NUMBER=1 2025-03-14T05:42:15.4022165Z + NUM_TEST_SHARDS=1 2025-03-14T05:42:15.4022449Z + export TORCH_SERIALIZATION_DEBUG=1 2025-03-14T05:42:15.4022776Z + TORCH_SERIALIZATION_DEBUG=1 2025-03-14T05:42:15.4023113Z + export VALGRIND=ON 2025-03-14T05:42:15.4023374Z + VALGRIND=ON 2025-03-14T05:42:15.4023687Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *clang9* ]] 2025-03-14T05:42:15.4024118Z + [[ 
linux-focal-cuda12.6-py3.10-gcc9-sm86 == *xpu* ]] 2025-03-14T05:42:15.4024581Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *s390x* ]] 2025-03-14T05:42:15.4024976Z + [[ 0 == \1 ]] 2025-03-14T05:42:15.4025212Z + [[ False == \1 ]] 2025-03-14T05:42:15.4025647Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 != *bazel* ]] 2025-03-14T05:42:15.4026034Z ++ realpath build/custom_test_artifacts 2025-03-14T05:42:15.4037724Z + CUSTOM_TEST_ARTIFACT_BUILD_DIR=/var/lib/jenkins/workspace/build/custom_test_artifacts 2025-03-14T05:42:15.4038219Z + [[ -n '' ]] 2025-03-14T05:42:15.4038480Z + echo 'Environment variables' 2025-03-14T05:42:15.4038779Z Environment variables 2025-03-14T05:42:15.4039047Z + env 2025-03-14T05:42:15.4047123Z INSTALLED_DB=yes 2025-03-14T05:42:15.4047501Z NV_LIBCUBLAS_VERSION=12.6.4.1-1 2025-03-14T05:42:15.4047839Z NVIDIA_VISIBLE_DEVICES=all 2025-03-14T05:42:15.4048133Z NV_NVML_DEV_VERSION=12.6.77-1 2025-03-14T05:42:15.4048629Z GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-03-14T05:42:15.4049137Z CONTINUE_THROUGH_ERROR=False 2025-03-14T05:42:15.4049492Z NV_LIBNCCL_DEV_PACKAGE=libnccl-dev=2.23.4-1+cuda12.6 2025-03-14T05:42:15.4049962Z NV_LIBNCCL_DEV_PACKAGE_VERSION=2.23.4-1 2025-03-14T05:42:15.4050434Z BUILD_ENVIRONMENT=linux-focal-cuda12.6-py3.10-gcc9-sm86 2025-03-14T05:42:15.4050836Z HOSTNAME=fb3818aafd9c 2025-03-14T05:42:15.4051472Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.4052084Z GITHUB_ACTION=__self 2025-03-14T05:42:15.4052373Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-03-14T05:42:15.4056897Z NVIDIA_REQUIRE_CUDA=cuda>=12.6 brand=unknown,driver>=470,driver<471 brand=grid,driver>=470,driver<471 brand=tesla,driver>=470,driver<471 brand=nvidia,driver>=470,driver<471 brand=quadro,driver>=470,driver<471 brand=quadrortx,driver>=470,driver<471 brand=nvidiartx,driver>=470,driver<471 brand=vapps,driver>=470,driver<471 brand=vpc,driver>=470,driver<471 brand=vcs,driver>=470,driver<471 brand=vws,driver>=470,driver<471 brand=cloudgaming,driver>=470,driver<471 brand=unknown,driver>=535,driver<536 brand=grid,driver>=535,driver<536 brand=tesla,driver>=535,driver<536 brand=nvidia,driver>=535,driver<536 brand=quadro,driver>=535,driver<536 brand=quadrortx,driver>=535,driver<536 brand=nvidiartx,driver>=535,driver<536 brand=vapps,driver>=535,driver<536 brand=vpc,driver>=535,driver<536 brand=vcs,driver>=535,driver<536 brand=vws,driver>=535,driver<536 brand=cloudgaming,driver>=535,driver<536 brand=unknown,driver>=550,driver<551 brand=grid,driver>=550,driver<551 brand=tesla,driver>=550,driver<551 brand=nvidia,driver>=550,driver<551 brand=quadro,driver>=550,driver<551 brand=quadrortx,driver>=550,driver<551 brand=nvidiartx,driver>=550,driver<551 brand=vapps,driver>=550,driver<551 brand=vpc,driver>=550,driver<551 brand=vcs,driver>=550,driver<551 brand=vws,driver>=550,driver<551 brand=cloudgaming,driver>=550,driver<551 2025-03-14T05:42:15.4061837Z NV_LIBCUBLAS_DEV_PACKAGE=libcublas-dev-12-6=12.6.4.1-1 2025-03-14T05:42:15.4062210Z NV_NVTX_VERSION=12.6.77-1 2025-03-14T05:42:15.4062498Z GITHUB_RUN_NUMBER=122697 2025-03-14T05:42:15.4062791Z TEST_CONFIG=inductor_huggingface 2025-03-14T05:42:15.4063110Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-03-14T05:42:15.4063502Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-03-14T05:42:15.4063818Z IS_A100_RUNNER=0 2025-03-14T05:42:15.4064077Z NV_CUDA_CUDART_DEV_VERSION=12.6.77-1 2025-03-14T05:42:15.4064402Z NV_LIBCUSPARSE_VERSION=12.5.4.2-1 
2025-03-14T05:42:15.4064859Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-03-14T05:42:15.4065165Z NV_LIBNPP_VERSION=12.3.1.54-1 2025-03-14T05:42:15.4065620Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-03-14T05:42:15.4066003Z CMAKE_CUDA_COMPILER_LAUNCHER=/opt/cache/bin/sccache 2025-03-14T05:42:15.4066362Z GITHUB_REF_TYPE=branch 2025-03-14T05:42:15.4066638Z TORCH_CUDA_ARCH_LIST=Maxwell 2025-03-14T05:42:15.4066923Z NCCL_VERSION=2.23.4-1 2025-03-14T05:42:15.4067231Z BASE_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.4067596Z XLA_CUDA= 2025-03-14T05:42:15.4068209Z HUGGING_FACE_HUB_TOKEN=*** 2025-03-14T05:42:15.4068939Z *** 2025-03-14T05:42:15.4069179Z CARGO_NET_GIT_FETCH_WITH_CLI=true 2025-03-14T05:42:15.4069499Z GITHUB_REPOSITORY_ID=65600975 2025-03-14T05:42:15.4069785Z GITHUB_ACTIONS=true 2025-03-14T05:42:15.4070205Z NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T05:42:15.4070545Z NV_NVPROF_DEV_PACKAGE=cuda-nvprof-12-6=12.6.80-1 2025-03-14T05:42:15.4070913Z NV_LIBNPP_PACKAGE=libnpp-12-6=12.3.1.54-1 2025-03-14T05:42:15.4071277Z SHA1=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.4071637Z NV_LIBNCCL_DEV_PACKAGE_NAME=libnccl-dev 2025-03-14T05:42:15.4072003Z GITHUB_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.4072523Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor.yml@refs/heads/main 2025-03-14T05:42:15.4073004Z UCC_HOME=/usr 2025-03-14T05:42:15.4073254Z TORCH_SERIALIZATION_DEBUG=1 2025-03-14T05:42:15.4073608Z NV_LIBCUBLAS_DEV_VERSION=12.6.4.1-1 2025-03-14T05:42:15.4073919Z VERBOSE_TEST_LOGS=False 2025-03-14T05:42:15.4074193Z NVIDIA_PRODUCT_NAME=CUDA 2025-03-14T05:42:15.4074603Z NV_LIBCUBLAS_DEV_PACKAGE_NAME=libcublas-dev-12-6 2025-03-14T05:42:15.4074957Z GITHUB_REF=refs/heads/main 2025-03-14T05:42:15.4075247Z NV_CUDA_CUDART_VERSION=12.6.77-1 2025-03-14T05:42:15.4075543Z SHARD_NUMBER=1 2025-03-14T05:42:15.4075793Z GITHUB_REF_PROTECTED=true 2025-03-14T05:42:15.4076074Z HOME=/var/lib/jenkins 2025-03-14T05:42:15.4076359Z GITHUB_API_URL=https://api.github.com 2025-03-14T05:42:15.4076702Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-03-14T05:42:15.4077063Z UCX_COMMIT=7bb2722ff2187a0cad557ae4a6afa090569f83fb 2025-03-14T05:42:15.4077419Z CUDA_VERSION=12.6.3 2025-03-14T05:42:15.4077714Z NV_LIBCUBLAS_PACKAGE=libcublas-12-6=12.6.4.1-1 2025-03-14T05:42:15.4078044Z NUM_TEST_SHARDS=1 2025-03-14T05:42:15.4078285Z UCX_HOME=/usr 2025-03-14T05:42:15.4078636Z NV_CUDA_NSIGHT_COMPUTE_DEV_PACKAGE=cuda-nsight-compute-12-6=12.6.3-1 2025-03-14T05:42:15.4079369Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.4080200Z JOB_NAME=cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T05:42:15.4081009Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.4081780Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-03-14T05:42:15.4082267Z GITHUB_EVENT_NAME=push 2025-03-14T05:42:15.4093854Z DASHBOARD_TAG= 2025-03-14T05:42:15.4094165Z GITHUB_RUN_ID=13849515380 2025-03-14T05:42:15.4094508Z NV_LIBNPP_DEV_PACKAGE=libnpp-dev-12-6=12.3.1.54-1 2025-03-14T05:42:15.4094899Z NV_LIBCUBLAS_PACKAGE_NAME=libcublas-12-6 2025-03-14T05:42:15.4095581Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.4096252Z 
GITHUB_ACTOR=pytorchmergebot 2025-03-14T05:42:15.4096567Z NV_LIBNPP_DEV_VERSION=12.3.1.54-1 2025-03-14T05:42:15.4096870Z PR_NUMBER= 2025-03-14T05:42:15.4097108Z GITHUB_RUN_ATTEMPT=1 2025-03-14T05:42:15.4097374Z VALGRIND=ON 2025-03-14T05:42:15.4097629Z ANACONDA_PYTHON_VERSION=3.10 2025-03-14T05:42:15.4097993Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-03-14T05:42:15.4098360Z TERM=vt100 2025-03-14T05:42:15.4098607Z NV_LIBCUSPARSE_DEV_VERSION=12.5.4.2-1 2025-03-14T05:42:15.4098931Z INSTALLED_VISION=yes 2025-03-14T05:42:15.4099196Z BRANCH=main 2025-03-14T05:42:15.4099445Z SCCACHE_REGION=us-east-1 2025-03-14T05:42:15.4099740Z OPENSSL_ROOT_DIR=/opt/openssl 2025-03-14T05:42:15.4100225Z LIBRARY_PATH=/usr/local/cuda/lib64/stubs 2025-03-14T05:42:15.4100563Z CUDA_PATH=/usr/local/cuda 2025-03-14T05:42:15.4101093Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-03-14T05:42:15.4101666Z GITHUB_SERVER_URL=https://github.com 2025-03-14T05:42:15.4102052Z UCC_COMMIT=20eae37090a4ce1b32bcce6144ccad0b49943e0b 2025-03-14T05:42:15.4102408Z REENABLED_ISSUES= 2025-03-14T05:42:15.4102653Z SHLVL=1 2025-03-14T05:42:15.4102870Z MAX_JOBS=14 2025-03-14T05:42:15.4103113Z NV_CUDA_LIB_VERSION=12.6.3-1 2025-03-14T05:42:15.4103406Z NVARCH=x86_64 2025-03-14T05:42:15.4103691Z GITHUB_ACTOR_ID=97764156 2025-03-14T05:42:15.4104145Z GITHUB_WORKFLOW_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T05:42:15.4104536Z GITHUB_REF_NAME=main 2025-03-14T05:42:15.4104933Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-03-14T05:42:15.4105370Z GITHUB_JOB=test 2025-03-14T05:42:15.4105656Z NV_LIBNCCL_PACKAGE=libnccl2=2.23.4-1+cuda12.6 2025-03-14T05:42:15.4106078Z LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64 2025-03-14T05:42:15.4106466Z NO_TEST_TIMEOUT=False 2025-03-14T05:42:15.4106733Z TD_DISTRIBUTED=False 2025-03-14T05:42:15.4107026Z NV_CUDA_NSIGHT_COMPUTE_VERSION=12.6.3-1 2025-03-14T05:42:15.4107370Z GITHUB_REPOSITORY=pytorch/pytorch 2025-03-14T05:42:15.4107691Z NV_NVPROF_VERSION=12.6.80-1 2025-03-14T05:42:15.4107983Z GITHUB_RETENTION_DAYS=90 2025-03-14T05:42:15.4108267Z OPENSSL_DIR=/opt/openssl 2025-03-14T05:42:15.4108555Z GITHUB_ACTION_REPOSITORY= 2025-03-14T05:42:15.4109293Z PATH=/opt/cache/bin:/opt/conda/envs/py_3.10/bin:/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-03-14T05:42:15.4110048Z GITHUB_BASE_REF= 2025-03-14T05:42:15.4110524Z ARTIFACTS_FILE_SUFFIX=test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T05:42:15.4111066Z NV_LIBNCCL_PACKAGE_NAME=libnccl2 2025-03-14T05:42:15.4111362Z CI=true 2025-03-14T05:42:15.4111610Z NV_LIBNCCL_PACKAGE_VERSION=2.23.4-1 2025-03-14T05:42:15.4111943Z GITHUB_REPOSITORY_OWNER=pytorch 2025-03-14T05:42:15.4112238Z JOB_ID=38756916179 2025-03-14T05:42:15.4112501Z INSTALLED_PROTOBUF=yes 2025-03-14T05:42:15.4112777Z GITHUB_HEAD_REF= 2025-03-14T05:42:15.4113029Z GITHUB_ACTION_REF= 2025-03-14T05:42:15.4113341Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-03-14T05:42:15.4113708Z TEST_SHOWLOCALS=False 2025-03-14T05:42:15.4113989Z GITHUB_WORKFLOW=inductor 2025-03-14T05:42:15.4114373Z DEBIAN_FRONTEND=noninteractive 2025-03-14T05:42:15.4115001Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_60cabe66-5022-4bac-ad03-908a5fd7ef05 2025-03-14T05:42:15.4115624Z NO_TD=False 2025-03-14T05:42:15.4115881Z SKIP_SCCACHE_INITIALIZATION=1 
2025-03-14T05:42:15.4116179Z _=/usr/bin/env 2025-03-14T05:42:15.4116426Z + echo 'Testing pytorch' 2025-03-14T05:42:15.4116705Z Testing pytorch 2025-03-14T05:42:15.4116972Z + export LANG=C.UTF-8 2025-03-14T05:42:15.4117240Z + LANG=C.UTF-8 2025-03-14T05:42:15.4117480Z + PR_NUMBER= 2025-03-14T05:42:15.4117756Z + [[ inductor_huggingface == \d\e\f\a\u\l\t ]] 2025-03-14T05:42:15.4118153Z + [[ inductor_huggingface == \d\i\s\t\r\i\b\u\t\e\d ]] 2025-03-14T05:42:15.4118530Z + [[ inductor_huggingface == \s\l\o\w ]] 2025-03-14T05:42:15.4118956Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *slow-gradcheck* ]] 2025-03-14T05:42:15.4119434Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *cuda* ]] 2025-03-14T05:42:15.4119839Z + export PYTORCH_TESTING_DEVICE_ONLY_FOR=cuda 2025-03-14T05:42:15.4120197Z + PYTORCH_TESTING_DEVICE_ONLY_FOR=cuda 2025-03-14T05:42:15.4120551Z + [[ inductor_huggingface == *crossref* ]] 2025-03-14T05:42:15.4120947Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *rocm* ]] 2025-03-14T05:42:15.4121374Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *xpu* ]] 2025-03-14T05:42:15.4121809Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 != *-bazel-* ]] 2025-03-14T05:42:15.4122203Z + pip_install --user ninja==1.10.2 2025-03-14T05:42:15.4122702Z + pip_install_pkg='python3 -m pip install --progress-bar off' 2025-03-14T05:42:15.4123192Z + python3 -m pip install --progress-bar off --user ninja==1.10.2 2025-03-14T05:42:15.9366961Z Collecting ninja==1.10.2 2025-03-14T05:42:15.9704779Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl.metadata (5.0 kB) 2025-03-14T05:42:15.9818864Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl (108 kB) 2025-03-14T05:42:16.5783174Z Installing collected packages: ninja 2025-03-14T05:42:16.5861709Z  WARNING: The script ninja is installed in '/var/lib/jenkins/.local/bin' which is not on PATH. 2025-03-14T05:42:16.5862876Z Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location. 
2025-03-14T05:42:16.5922133Z Successfully installed ninja-1.10.2 2025-03-14T05:42:16.6897684Z + export PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/opt/conda/envs/py_3.10/bin:/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-03-14T05:42:16.6899142Z + PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/opt/conda/envs/py_3.10/bin:/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-03-14T05:42:16.6900078Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *aarch64* ]] 2025-03-14T05:42:16.6900464Z + install_tlparse 2025-03-14T05:42:16.6900744Z + pip_install --user tlparse==0.3.30 2025-03-14T05:42:16.6901159Z + pip_install_pkg='python3 -m pip install --progress-bar off' 2025-03-14T05:42:16.6901662Z + python3 -m pip install --progress-bar off --user tlparse==0.3.30 2025-03-14T05:42:17.1526104Z Collecting tlparse==0.3.30 2025-03-14T05:42:17.1850379Z Downloading tlparse-0.3.30-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.9 kB) 2025-03-14T05:42:17.1966019Z Downloading tlparse-0.3.30-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.3 MB) 2025-03-14T05:42:17.8061144Z Installing collected packages: tlparse 2025-03-14T05:42:17.8417873Z Successfully installed tlparse-0.3.30 2025-03-14T05:42:17.9376033Z ++ python -m site --user-base 2025-03-14T05:42:17.9560922Z + PATH=/var/lib/jenkins/.local/bin:/var/lib/jenkins/.local/bin:/opt/cache/bin:/opt/conda/envs/py_3.10/bin:/opt/conda/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-03-14T05:42:17.9562897Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *asan* ]] 2025-03-14T05:42:17.9563378Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *-debug* ]] 2025-03-14T05:42:17.9563834Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 != *-bazel-* ]] 2025-03-14T05:42:17.9564465Z + echo 'We are not in debug mode: linux-focal-cuda12.6-py3.10-gcc9-sm86. Expect the assertion to pass' 2025-03-14T05:42:17.9565176Z We are not in debug mode: linux-focal-cuda12.6-py3.10-gcc9-sm86. 
Expect the assertion to pass 2025-03-14T05:42:17.9566732Z + cd test 2025-03-14T05:42:17.9567395Z + python -c 'import torch; torch._C._crash_if_debug_asserts_fail(424242)' 2025-03-14T05:42:19.6832295Z + [[ inductor_huggingface == \n\o\g\p\u\_\N\O\_\A\V\X\2 ]] 2025-03-14T05:42:19.6832759Z + [[ inductor_huggingface == \n\o\g\p\u\_\A\V\X\5\1\2 ]] 2025-03-14T05:42:19.6838129Z + DYNAMO_BENCHMARK_FLAGS=() 2025-03-14T05:42:19.6838710Z + [[ inductor_huggingface == *pr_time_benchmarks* ]] 2025-03-14T05:42:19.6839318Z + [[ inductor_huggingface == *dynamo_eager* ]] 2025-03-14T05:42:19.6839880Z + [[ inductor_huggingface == *aot_eager* ]] 2025-03-14T05:42:19.6840446Z + [[ inductor_huggingface == *aot_inductor* ]] 2025-03-14T05:42:19.6841059Z + [[ inductor_huggingface == *max_autotune_inductor* ]] 2025-03-14T05:42:19.6841702Z + [[ inductor_huggingface == *inductor* ]] 2025-03-14T05:42:19.6842224Z + [[ inductor_huggingface != *perf* ]] 2025-03-14T05:42:19.6842757Z + DYNAMO_BENCHMARK_FLAGS+=(--inductor) 2025-03-14T05:42:19.6843285Z + [[ inductor_huggingface == *dynamic* ]] 2025-03-14T05:42:19.6844164Z + [[ inductor_huggingface == *cpu* ]] 2025-03-14T05:42:19.6844699Z + DYNAMO_BENCHMARK_FLAGS+=(--device cuda) 2025-03-14T05:42:19.6875891Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *libtorch* ]] 2025-03-14T05:42:19.6876382Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *-bazel-* ]] 2025-03-14T05:42:19.6879190Z + cd test 2025-03-14T05:42:19.6880018Z + python -c 'import torch; print(torch.__config__.show())' 2025-03-14T05:42:21.2503794Z PyTorch built with: 2025-03-14T05:42:21.2504092Z - GCC 9.4 2025-03-14T05:42:21.2504337Z - C++ Version: 201703 2025-03-14T05:42:21.2504896Z - Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications 2025-03-14T05:42:21.2505867Z - Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-03-14T05:42:21.2506314Z - OpenMP 201511 (a.k.a. 
OpenMP 4.5) 2025-03-14T05:42:21.2506665Z - LAPACK is enabled (usually provided by MKL) 2025-03-14T05:42:21.2507007Z - NNPACK is enabled 2025-03-14T05:42:21.2507301Z - CPU capability usage: AVX2 2025-03-14T05:42:21.2507596Z - CUDA Runtime 12.6 2025-03-14T05:42:21.2507956Z - NVCC architecture flags: -gencode;arch=compute_86,code=sm_86 2025-03-14T05:42:21.2508353Z - CuDNN 90.5.1 2025-03-14T05:42:21.2508595Z - Magma 2.6.1 2025-03-14T05:42:21.2512830Z - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, COMMIT_SHA=aed0b7a742a2d7b7901790622829cbd2135049a4, CUDA_VERSION=12.6, CUDNN_VERSION=9.5.1, CXX_COMPILER=/opt/cache/bin/c++, CXX_FLAGS= -D_GLIBCXX_USE_CXX11_ABI=1 -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOROCTRACER -DLIBKINETO_NOXPUPTI=ON -DUSE_FBGEMM -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Werror=bool-operation -Wnarrowing -Wno-missing-field-initializers -Wno-unknown-pragmas -Wno-unused-parameter -Wno-strict-overflow -Wno-strict-aliasing -Wno-stringop-overflow -Wsuggest-override -Wno-psabi -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Werror -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, FORCE_FALLBACK_CUDA_MPI=1, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, TORCH_VERSION=2.8.0, USE_CUDA=ON, USE_CUDNN=ON, USE_CUSPARSELT=ON, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_GLOO=ON, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=ON, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF, USE_ROCM_KERNEL_ASSERT=OFF, 2025-03-14T05:42:21.2517370Z 2025-03-14T05:42:21.6000263Z + cd test 2025-03-14T05:42:21.6000668Z + python -c 'import torch; print(torch.__config__.parallel_info())' 2025-03-14T05:42:23.0201907Z ATen/Parallel: 2025-03-14T05:42:23.0202232Z at::get_num_threads() : 8 2025-03-14T05:42:23.0202562Z at::get_num_interop_threads() : 16 2025-03-14T05:42:23.0202884Z OpenMP 201511 (a.k.a. 
OpenMP 4.5) 2025-03-14T05:42:23.0203191Z omp_get_max_threads() : 8 2025-03-14T05:42:23.0203753Z Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications 2025-03-14T05:42:23.0204325Z mkl_get_max_threads() : 8 2025-03-14T05:42:23.0204719Z Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-03-14T05:42:23.0205173Z std::thread::hardware_concurrency() : 16 2025-03-14T05:42:23.0205531Z Environment variables: 2025-03-14T05:42:23.0205830Z OMP_NUM_THREADS : [not set] 2025-03-14T05:42:23.0206121Z MKL_NUM_THREADS : [not set] 2025-03-14T05:42:23.0206417Z ATen parallel backend: OpenMP 2025-03-14T05:42:23.0206606Z 2025-03-14T05:42:23.3185235Z + [[ inductor_huggingface == *numpy_2* ]] 2025-03-14T05:42:23.3185684Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *aarch64* ]] 2025-03-14T05:42:23.3186107Z + [[ inductor_huggingface == *backward* ]] 2025-03-14T05:42:23.3186452Z + [[ inductor_huggingface == *xla* ]] 2025-03-14T05:42:23.3186793Z + [[ inductor_huggingface == *executorch* ]] 2025-03-14T05:42:23.3187171Z + [[ inductor_huggingface == \j\i\t\_\l\e\g\a\c\y ]] 2025-03-14T05:42:23.3187829Z + [[ linux-focal-cuda12.6-py3.10-gcc9-sm86 == *libtorch* ]] 2025-03-14T05:42:23.3188242Z + [[ inductor_huggingface == distributed ]] 2025-03-14T05:42:23.3188626Z + [[ inductor_huggingface == *inductor_distributed* ]] 2025-03-14T05:42:23.3189027Z + [[ inductor_huggingface == *inductor-halide* ]] 2025-03-14T05:42:23.3189535Z + [[ inductor_huggingface == *inductor-triton-cpu* ]] 2025-03-14T05:42:23.3190090Z + [[ inductor_huggingface == *inductor-micro-benchmark* ]] 2025-03-14T05:42:23.3190502Z + [[ inductor_huggingface == *huggingface* ]] 2025-03-14T05:42:23.3190839Z + install_torchvision 2025-03-14T05:42:23.3191109Z + local orig_preload 2025-03-14T05:42:23.3191365Z + local commit 2025-03-14T05:42:23.3191813Z ++ get_pinned_commit vision 2025-03-14T05:42:23.3192123Z ++ cat .github/ci_commit_pins/vision.txt 2025-03-14T05:42:23.3209391Z + commit=d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:23.3209792Z + orig_preload= 2025-03-14T05:42:23.3210036Z + '[' -n '' ']' 2025-03-14T05:42:23.3210600Z + pip_install --no-use-pep517 --user git+https://github.com/pytorch/vision.git@d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:23.3211292Z + pip_install_pkg='python3 -m pip install --progress-bar off' 2025-03-14T05:42:23.3212067Z + python3 -m pip install --progress-bar off --no-use-pep517 --user git+https://github.com/pytorch/vision.git@d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:23.7144110Z Collecting git+https://github.com/pytorch/vision.git@d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:23.7145022Z Cloning https://github.com/pytorch/vision.git (to revision d23a6e1664d20707c11781299611436e1f0c104f) to /tmp/pip-req-build-enc7wxod 2025-03-14T05:42:23.7173872Z Running command git clone --filter=blob:none --quiet https://github.com/pytorch/vision.git /tmp/pip-req-build-enc7wxod 2025-03-14T05:42:25.2359623Z Running command git rev-parse -q --verify 'sha^d23a6e1664d20707c11781299611436e1f0c104f' 2025-03-14T05:42:25.2386020Z Running command git fetch -q https://github.com/pytorch/vision.git d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:26.6398937Z Running command git checkout -q d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:26.9987944Z Resolved https://github.com/pytorch/vision.git to commit d23a6e1664d20707c11781299611436e1f0c104f 2025-03-14T05:42:29.6640508Z Preparing metadata 
(setup.py) ... done 2025-03-14T05:42:29.6673854Z Requirement already satisfied: numpy in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torchvision==0.19.0a0+d23a6e1) (1.22.4) 2025-03-14T05:42:29.6676828Z Requirement already satisfied: torch in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torchvision==0.19.0a0+d23a6e1) (2.8.0a0+gitaed0b7a) 2025-03-14T05:42:29.6681137Z Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torchvision==0.19.0a0+d23a6e1) (11.0.0) 2025-03-14T05:42:29.6747155Z Requirement already satisfied: filelock in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (3.16.1) 2025-03-14T05:42:29.6751190Z Requirement already satisfied: typing-extensions>=4.10.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (4.12.2) 2025-03-14T05:42:29.6754604Z Requirement already satisfied: sympy>=1.13.3 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (1.13.3) 2025-03-14T05:42:29.6757832Z Requirement already satisfied: networkx in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (2.8.8) 2025-03-14T05:42:29.6760456Z Requirement already satisfied: jinja2 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (3.1.6) 2025-03-14T05:42:29.6763452Z Requirement already satisfied: fsspec in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from torch->torchvision==0.19.0a0+d23a6e1) (2024.10.0) 2025-03-14T05:42:29.6778910Z Requirement already satisfied: mpmath<1.4,>=1.1.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from sympy>=1.13.3->torch->torchvision==0.19.0a0+d23a6e1) (1.3.0) 2025-03-14T05:42:29.7253151Z Requirement already satisfied: MarkupSafe>=2.0 in /opt/conda/envs/py_3.10/lib/python3.10/site-packages (from jinja2->torch->torchvision==0.19.0a0+d23a6e1) (3.0.2) 2025-03-14T05:42:29.7318878Z Building wheels for collected packages: torchvision 2025-03-14T05:43:46.3834136Z Building wheel for torchvision (setup.py) ...
done 2025-03-14T05:43:46.3868033Z Created wheel for torchvision: filename=torchvision-0.19.0a0+d23a6e1-cp310-cp310-linux_x86_64.whl size=2076430 sha256=bde9423f4d04d8b3c8554f8879b3bbdd19a188193b0aaa14ca0b3f5576d77918 2025-03-14T05:43:46.3870729Z Stored in directory: /var/lib/jenkins/.cache/pip/wheels/0e/56/35/02931e71eb23fd2b85591c7ec05b733ca7c8b328a2fd151f96 2025-03-14T05:43:46.3907520Z Successfully built torchvision 2025-03-14T05:43:46.9038458Z Installing collected packages: torchvision 2025-03-14T05:43:47.3199872Z Successfully installed torchvision-0.19.0a0+d23a6e1 2025-03-14T05:43:47.4791240Z + '[' -n '' ']' 2025-03-14T05:43:47.4791587Z + id=0 2025-03-14T05:43:47.4791846Z + test_dynamo_benchmark huggingface 0 2025-03-14T05:43:47.4794722Z ++ pwd 2025-03-14T05:43:47.4798658Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-03-14T05:43:47.4799223Z + local suite=huggingface 2025-03-14T05:43:47.4799498Z + shift 2025-03-14T05:43:47.4799730Z + local shard_id=0 2025-03-14T05:43:47.4799969Z + shift 2025-03-14T05:43:47.4800255Z + [[ inductor_huggingface == *perf_compare* ]] 2025-03-14T05:43:47.4800616Z + [[ inductor_huggingface == *perf* ]] 2025-03-14T05:43:47.4800952Z + [[ inductor_huggingface == *cpu* ]] 2025-03-14T05:43:47.4801303Z + [[ inductor_huggingface == *aot_inductor* ]] 2025-03-14T05:43:47.4801846Z + [[ inductor_huggingface == *max_autotune_inductor* ]] 2025-03-14T05:43:47.4802595Z + test_single_dynamo_benchmark inference huggingface 0 --inference --bfloat16 2025-03-14T05:43:47.4803452Z ++ pwd 2025-03-14T05:43:47.4806360Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-03-14T05:43:47.4806839Z + mkdir -p /var/lib/jenkins/workspace/test/test-reports 2025-03-14T05:43:47.4863873Z + local name=inference 2025-03-14T05:43:47.4864787Z + shift 2025-03-14T05:43:47.4865098Z + local suite=huggingface 2025-03-14T05:43:47.4865392Z + shift 2025-03-14T05:43:47.4865619Z + local shard_id=0 2025-03-14T05:43:47.4865866Z + shift 2025-03-14T05:43:47.4866098Z + partition_flags=() 2025-03-14T05:43:47.4866378Z + local partition_flags 2025-03-14T05:43:47.4866689Z + [[ -n 1 ]] 2025-03-14T05:43:47.4866926Z + [[ -n 0 ]] 2025-03-14T05:43:47.4867336Z + partition_flags=(--total-partitions "$NUM_TEST_SHARDS" --partition-id "$shard_id") 2025-03-14T05:43:47.4868142Z + [[ inductor_huggingface == *perf_compare* ]] 2025-03-14T05:43:47.4868545Z + [[ inductor_huggingface == *perf* ]] 2025-03-14T05:43:47.4868886Z + [[ inductor_huggingface == *_avx2* ]] 2025-03-14T05:43:47.4869225Z + [[ inductor_huggingface == *_avx512* ]] 2025-03-14T05:43:47.4870434Z + python benchmarks/dynamo/huggingface.py --ci --accuracy --timing --explain --print-compilation-time --inductor --device cuda --inference --bfloat16 --total-partitions 1 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inference_huggingface.csv 2025-03-14T05:43:51.3496626Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
2025-03-14T05:43:51.3497811Z warnings.warn( 2025-03-14T05:43:51.4099845Z 2025-03-14T05:43:51.4101079Z config.json: 0% 0.00/694 [00:00 will be ignored 2025-03-14T06:09:01.0522917Z Compilation time (from dynamo_timed): 32.082636992999994 2025-03-14T06:09:01.0530987Z pass 2025-03-14T06:09:01.0900280Z TIMING: entire_frame_compile:23.77818 gc:0.00185 _recursive_pre_grad_passes:0.0135 pad_mm_benchmark:0.37912 _recursive_joint_graph_passes:1.57691 _recursive_post_grad_passes:0.6532 async_compile.wait:4.46423 code_gen:11.24933 inductor_compile:18.09643 backend_compile:19.55888 cudagraphify.get_container:0.3207 entire_backward_compile:8.30445 CachingAutotuner.benchmark_all_configs:0.07169 CUDAGraphNode.record:0.96545 total_wall_time:32.08264 2025-03-14T06:09:01.0902335Z STATS: call_* op count: 585 | FakeTensorMode.__torch_dispatch__:40868 | FakeTensor.__torch_dispatch__:6399 | ProxyTorchDispatchMode.__torch_dispatch__:20196 2025-03-14T06:09:01.0903152Z Dynamo produced 2 graphs covering 585 ops with 5 graph breaks (4 unique) 2025-03-14T06:09:07.4575024Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:09:07.4576272Z warnings.warn( 2025-03-14T06:09:07.6869367Z 2025-03-14T06:09:10.6073376Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:09:10.6073838Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:09:10.6074752Z cuda train AllenaiLongformerBase 2025-03-14T06:09:17.9456850Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] Graph break from `Tensor.item()`, consider setting: 2025-03-14T06:09:17.9457862Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] torch._dynamo.config.capture_scalar_outputs = True 2025-03-14T06:09:17.9458676Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] or: 2025-03-14T06:09:17.9459445Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] env TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1 2025-03-14T06:09:17.9460872Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] to include these operations in the captured graph. 
2025-03-14T06:09:17.9461642Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] 2025-03-14T06:09:17.9462375Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] Graph break: from user code at: 2025-03-14T06:09:17.9463605Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:09:17.9464860Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] pred = mod(**cloned_inputs) 2025-03-14T06:09:17.9466092Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1835, in forward 2025-03-14T06:09:17.9467314Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] outputs = self.longformer( 2025-03-14T06:09:17.9468819Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1738, in forward 2025-03-14T06:09:17.9470050Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] encoder_outputs = self.encoder( 2025-03-14T06:09:17.9471279Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1291, in forward 2025-03-14T06:09:17.9472574Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] is_global_attn = is_index_global_attn.flatten().any().item() 2025-03-14T06:09:17.9473374Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] 2025-03-14T06:09:17.9473989Z W0314 06:09:17.944000 13853 site-packages/torch/_dynamo/variables/tensor.py:913] [2/0] 2025-03-14T06:10:23.7713912Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:10:23.7715140Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1318, in torch_dynamo_resume_in_forward_at_1291 2025-03-14T06:10:23.7716059Z layer_outputs = layer_module( 2025-03-14T06:10:23.7716864Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1246, in forward 2025-03-14T06:10:23.7717665Z self_attn_outputs = self.attention( 2025-03-14T06:10:23.7718448Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1182, in forward 2025-03-14T06:10:23.7719242Z self_outputs = self.self( 2025-03-14T06:10:23.7719962Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 571, in forward 2025-03-14T06:10:23.7720776Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-03-14T06:10:23.7722068Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 836, in _sliding_chunks_query_key_matmul 2025-03-14T06:10:23.7723129Z query = self._chunk(query, window_overlap, getattr(self.config, "onnx_export", False)) 2025-03-14T06:10:23.7724060Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 778, in _chunk 2025-03-14T06:10:23.7724943Z return hidden_states.as_strided(size=chunk_size, stride=chunk_stride) 2025-03-14T06:10:23.7725352Z 2025-03-14T06:10:24.3149194Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:10:24.3150540Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 1734, in forward 2025-03-14T06:10:24.3151273Z embedding_output = self.embeddings( 2025-03-14T06:10:24.3152000Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/longformer/modeling_longformer.py", line 470, in forward 2025-03-14T06:10:24.3152741Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:10:24.3152993Z 2025-03-14T06:10:24.3299936Z W0314 06:10:24.329000 13853 site-packages/torch/_logging/_internal.py:1130] [18/0] Profiler function will be ignored 2025-03-14T06:11:12.2268574Z Compilation time (from dynamo_timed): 106.25287261 2025-03-14T06:11:12.2308581Z pass 2025-03-14T06:11:12.2732616Z TIMING: entire_frame_compile:89.08531 gc:0.01056 _recursive_pre_grad_passes:0.04827 _recursive_joint_graph_passes:3.08402 inductor_compile:53.58193 backend_compile:75.5072 _recursive_post_grad_passes:2.46625 async_compile.precompile:0.30616 async_compile.wait:6.81627 code_gen:33.25975 cudagraphify.get_container:0.17326 pad_mm_benchmark:0.42724 CachingAutotuner.benchmark_all_configs:0.19116 entire_backward_compile:17.16756 CUDAGraphNode.record:6.84528 total_wall_time:106.25287 2025-03-14T06:11:12.2734884Z STATS: call_* op count: 2772 | FakeTensorMode.__torch_dispatch__:119650 | FakeTensor.__torch_dispatch__:17385 | ProxyTorchDispatchMode.__torch_dispatch__:59739 2025-03-14T06:11:12.2735734Z Dynamo produced 7 graphs covering 2772 ops with 9 graph breaks (5 unique) 2025-03-14T06:11:21.6673740Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
2025-03-14T06:11:21.6675015Z warnings.warn( 2025-03-14T06:11:22.0054769Z 2025-03-14T06:11:25.9540296Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:11:25.9540709Z loading model: 0it [00:03, ?it/s] 2025-03-14T06:11:25.9541043Z cuda train BartForCausalLM 2025-03-14T06:11:34.4210887Z 2025-03-14T06:11:34.4211468Z class GraphModule(torch.nn.Module): 2025-03-14T06:11:34.4213626Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:11:34.4215747Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:11:34.4216643Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:11:34.4217959Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:11:34.4219617Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:11:34.4220935Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:11:34.4221759Z 2025-03-14T06:11:34.4222100Z # No stacktrace found for following nodes 2025-03-14T06:11:34.4222727Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:11:34.4223333Z 2025-03-14T06:11:34.4224244Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1364 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:11:34.4225591Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:11:34.4226220Z 2025-03-14T06:11:34.4227076Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1375 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:11:34.4228956Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(l_cloned_inputs_input_ids_, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_cloned_inputs_input_ids_ = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:11:34.4230352Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:11:34.4230816Z 2025-03-14T06:11:34.4231478Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:11:34.4232432Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:11:34.4232933Z 2025-03-14T06:11:34.4233580Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:11:34.4234496Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:11:34.4234913Z 2025-03-14T06:11:34.4235646Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:11:34.4236534Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:11:34.4237768Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:11:34.4238872Z 2025-03-14T06:11:34.4239609Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1416 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:11:34.4240657Z positions_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:11:34.4241195Z 2025-03-14T06:11:34.4241903Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1418 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:11:34.4242946Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:11:34.4243487Z 2025-03-14T06:11:34.4244386Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1419 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:11:34.4246709Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:11:34.4248444Z 2025-03-14T06:11:34.4249428Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1421 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:11:34.4250813Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:11:34.4251501Z 2025-03-14T06:11:34.4252196Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1450 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:11:34.4253038Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:11:34.4253502Z 2025-03-14T06:11:34.4254265Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1451 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:11:34.4255168Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 
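The traced GraphModule above ends exactly at BART's layerdrop check (`if dropout_probability < self.layerdrop:`), a data-dependent branch, and the earlier AllenaiLongformerBase warning flags `Tensor.item()` as another graph-break source, recommending `torch._dynamo.config.capture_scalar_outputs = True`. The "Dynamo produced N graphs covering M ops with K graph breaks" summaries come from the harness's `--explain` flag; the same accounting can be reproduced outside the benchmark with `torch._dynamo.explain`. A minimal sketch, assuming a recent PyTorch 2.x build like the one in this job (the toy function below is hypothetical, not from the benchmark):

import torch

def scale_by_total(x):
    s = x.sum().item()  # scalar readback: breaks the graph unless capture_scalar_outputs is set
    return x * s

# Count graphs and graph breaks the same way the --explain summary lines above do.
print(torch._dynamo.explain(scale_by_total)(torch.randn(4, 4)))

# Per the warning above, the scalar readback can instead be captured in the graph.
torch._dynamo.config.capture_scalar_outputs = True
torch._dynamo.reset()
print(torch._dynamo.explain(scale_by_total)(torch.randn(4, 4)))  # the .item() break should disappear

The env-var form shown in the warning (TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1) is equivalent and avoids touching the benchmark code.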
2025-03-14T06:11:34.4255579Z 2025-03-14T06:11:34.4255723Z 2025-03-14T06:11:34.4255853Z class GraphModule(torch.nn.Module): 2025-03-14T06:11:34.4257619Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:11:34.4259420Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:11:34.4260192Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:11:34.4261308Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:11:34.4262481Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:11:34.4263659Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:11:34.4264379Z 2025-03-14T06:11:34.4264636Z # No stacktrace found for following nodes 2025-03-14T06:11:34.4265220Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:11:34.4265759Z 2025-03-14T06:11:34.4266496Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1364 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:11:34.4267474Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:11:34.4268461Z 2025-03-14T06:11:34.4269236Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1375 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:11:34.4271002Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(l_cloned_inputs_input_ids_, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_cloned_inputs_input_ids_ = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:11:34.4272381Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:11:34.4273011Z 2025-03-14T06:11:34.4273671Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:11:34.4274748Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:11:34.4275245Z 2025-03-14T06:11:34.4275886Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:11:34.4276702Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:11:34.4277114Z 
2025-03-14T06:11:34.4277844Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:11:34.4278733Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:11:34.4279947Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:11:34.4281047Z 2025-03-14T06:11:34.4281782Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1416 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:11:34.4282825Z positions_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:11:34.4283355Z 2025-03-14T06:11:34.4284063Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1418 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:11:34.4285099Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:11:34.4285643Z 2025-03-14T06:11:34.4286404Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1419 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:11:34.4288670Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:11:34.4299983Z 2025-03-14T06:11:34.4300971Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1421 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:11:34.4302487Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:11:34.4303190Z 2025-03-14T06:11:34.4303897Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1450 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:11:34.4304757Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:11:34.4305137Z 2025-03-14T06:11:34.4305849Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1451 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:11:34.4306835Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:11:34.4307247Z 2025-03-14T06:11:35.3673471Z 2025-03-14T06:11:35.3673942Z class GraphModule(torch.nn.Module): 2025-03-14T06:11:35.3677017Z def forward(self, L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: 
"f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:11:35.3679982Z l_input_ids_ = L_input_ids_ 2025-03-14T06:11:35.3681175Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:11:35.3683096Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:11:35.3685268Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:11:35.3687407Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:11:35.3688703Z 2025-03-14T06:11:35.3690005Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1364 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:11:35.3691687Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:11:35.3692440Z 2025-03-14T06:11:35.3693814Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1375 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:11:35.3696913Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(l_input_ids_, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_input_ids_ = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:11:35.3699242Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:11:35.3700049Z 2025-03-14T06:11:35.3701209Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:11:35.3702881Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:11:35.3703778Z 2025-03-14T06:11:35.3704878Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:11:35.3706846Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:11:35.3707592Z 2025-03-14T06:11:35.3708866Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:11:35.3710458Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:11:35.3712678Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = 
l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:11:35.3715029Z 2025-03-14T06:11:35.3716299Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1416 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:11:35.3718160Z positions_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:11:35.3719030Z 2025-03-14T06:11:35.3720210Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1418 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:11:35.3722013Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:11:35.3722983Z 2025-03-14T06:11:35.3724283Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1419 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:11:35.3728423Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:11:35.3731553Z 2025-03-14T06:11:35.3733175Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1421 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:11:35.3735591Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:11:35.3736829Z 2025-03-14T06:11:35.3738045Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1450 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:11:35.3739511Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:11:35.3740094Z 2025-03-14T06:11:35.3741337Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1451 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:11:35.3742913Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:11:35.3743634Z 2025-03-14T06:11:39.9505555Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:11:39.9506396Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py", line 135, in forward 2025-03-14T06:11:39.9507093Z return super().forward(positions + self.offset) 2025-03-14T06:11:39.9507346Z 2025-03-14T06:11:46.0344033Z Compilation time (from dynamo_timed): 5.3700119729999995 2025-03-14T06:11:46.0372143Z pass 2025-03-14T06:11:46.1640697Z TIMING: entire_frame_compile:4.16719 gc:0.00449 _recursive_pre_grad_passes:0.00522 _recursive_joint_graph_passes:0.50109 inductor_compile:2.55571 backend_compile:3.35157 _recursive_post_grad_passes:0.09335 async_compile.precompile:0.164 async_compile.wait:0.67633 code_gen:1.71601 cudagraphify.get_container:0.20044 pad_mm_benchmark:0.27912 entire_backward_compile:1.20282 CUDAGraphNode.record:5.41454 total_wall_time:5.37001 2025-03-14T06:11:46.1642673Z STATS: call_* op count: 39 | FakeTensorMode.__torch_dispatch__:3877 | FakeTensor.__torch_dispatch__:620 | ProxyTorchDispatchMode.__torch_dispatch__:1681 2025-03-14T06:11:46.1643714Z Dynamo produced 5 graphs covering 39 ops with 6 graph breaks (5 unique) 2025-03-14T06:11:51.6034443Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:11:51.6035647Z warnings.warn( 2025-03-14T06:11:52.0087008Z 2025-03-14T06:11:58.9504670Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:11:58.9505027Z loading model: 0it [00:06, ?it/s] 2025-03-14T06:11:58.9505380Z cuda train BartForConditionalGeneration 2025-03-14T06:12:15.4704155Z 2025-03-14T06:12:15.4704611Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:15.4706635Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:15.4708751Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:12:15.4709189Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:12:15.4709979Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:15.4711127Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:15.4712308Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:15.4713507Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:15.4714339Z 2025-03-14T06:12:15.4714615Z # No stacktrace found for following nodes 2025-03-14T06:12:15.4715208Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast 
= None 2025-03-14T06:12:15.4715754Z 2025-03-14T06:12:15.4716581Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:104 in shift_tokens_right, code: shifted_input_ids = input_ids.new_zeros(input_ids.shape) 2025-03-14T06:12:15.4717634Z shifted_input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.new_zeros((1, 1024)) 2025-03-14T06:12:15.4718088Z 2025-03-14T06:12:15.4718864Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:105 in shift_tokens_right, code: shifted_input_ids[:, 1:] = input_ids[:, :-1].clone() 2025-03-14T06:12:15.4720400Z getitem: "i64[1, 1023][1024, 1]cuda:0" = l_cloned_inputs_labels_[(slice(None, None, None), slice(None, -1, None))]; l_cloned_inputs_labels_ = None 2025-03-14T06:12:15.4721110Z clone: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:12:15.4721771Z shifted_input_ids[(slice(None, None, None), slice(1, None, None))] = clone; setitem = shifted_input_ids; clone = setitem = None 2025-03-14T06:12:15.4722330Z 2025-03-14T06:12:15.4723100Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:106 in shift_tokens_right, code: shifted_input_ids[:, 0] = decoder_start_token_id 2025-03-14T06:12:15.4724368Z shifted_input_ids[(slice(None, None, None), 0)] = 2; setitem_1 = shifted_input_ids; setitem_1 = None 2025-03-14T06:12:15.4725154Z 2025-03-14T06:12:15.4726017Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:111 in shift_tokens_right, code: shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id) 2025-03-14T06:12:15.4727063Z eq: "b8[1, 1024][1024, 1]cuda:0" = shifted_input_ids == -100 2025-03-14T06:12:15.4727690Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = shifted_input_ids.masked_fill_(eq, 1); shifted_input_ids = eq = masked_fill_ = None 2025-03-14T06:12:15.4728237Z 2025-03-14T06:12:15.4728985Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:15.4730030Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:12:15.4730551Z 2025-03-14T06:12:15.4731336Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:15.4733019Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:15.4734295Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:15.4734750Z 2025-03-14T06:12:15.4735417Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:15.4736364Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:15.4736918Z 2025-03-14T06:12:15.4737548Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 
in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:15.4738371Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:15.4738783Z 2025-03-14T06:12:15.4739515Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:15.4740403Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:15.4741609Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:15.4742706Z 2025-03-14T06:12:15.4743441Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:15.4744559Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:15.4745076Z 2025-03-14T06:12:15.4745786Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:15.4747041Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:15.4747614Z 2025-03-14T06:12:15.4748447Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:15.4750906Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:15.4752632Z 2025-03-14T06:12:15.4753535Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:15.4755006Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:15.4755705Z 2025-03-14T06:12:15.4756453Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:15.4757284Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:15.4757636Z 2025-03-14T06:12:15.4758404Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:15.4759357Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:15.4759765Z 2025-03-14T06:12:15.4759928Z 2025-03-14T06:12:15.4760051Z class 
GraphModule(torch.nn.Module): 2025-03-14T06:12:15.4761950Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:15.4763870Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:12:15.4764299Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:12:15.4765070Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:15.4766186Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:15.4767359Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:15.4768976Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:15.4769711Z 2025-03-14T06:12:15.4769971Z # No stacktrace found for following nodes 2025-03-14T06:12:15.4770553Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:12:15.4771085Z 2025-03-14T06:12:15.4771882Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:104 in shift_tokens_right, code: shifted_input_ids = input_ids.new_zeros(input_ids.shape) 2025-03-14T06:12:15.4773028Z shifted_input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.new_zeros((1, 1024)) 2025-03-14T06:12:15.4773477Z 2025-03-14T06:12:15.4774247Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:105 in shift_tokens_right, code: shifted_input_ids[:, 1:] = input_ids[:, :-1].clone() 2025-03-14T06:12:15.4775397Z getitem: "i64[1, 1023][1024, 1]cuda:0" = l_cloned_inputs_labels_[(slice(None, None, None), slice(None, -1, None))]; l_cloned_inputs_labels_ = None 2025-03-14T06:12:15.4776087Z clone: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:12:15.4776750Z shifted_input_ids[(slice(None, None, None), slice(1, None, None))] = clone; setitem = shifted_input_ids; clone = setitem = None 2025-03-14T06:12:15.4777304Z 2025-03-14T06:12:15.4778064Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:106 in shift_tokens_right, code: shifted_input_ids[:, 0] = decoder_start_token_id 2025-03-14T06:12:15.4779098Z shifted_input_ids[(slice(None, None, None), 0)] = 2; setitem_1 = shifted_input_ids; setitem_1 = None 2025-03-14T06:12:15.4779570Z 2025-03-14T06:12:15.4780413Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:111 in shift_tokens_right, code: shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id) 2025-03-14T06:12:15.4781386Z eq: "b8[1, 1024][1024, 1]cuda:0" = 
shifted_input_ids == -100 2025-03-14T06:12:15.4781999Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = shifted_input_ids.masked_fill_(eq, 1); shifted_input_ids = eq = masked_fill_ = None 2025-03-14T06:12:15.4782543Z 2025-03-14T06:12:15.4783285Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:15.4784329Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:12:15.4784839Z 2025-03-14T06:12:15.4785622Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:15.4787297Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:15.4788582Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:15.4789036Z 2025-03-14T06:12:15.4789690Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:15.4790641Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:15.4791143Z 2025-03-14T06:12:15.4791863Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:15.4792683Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:15.4793095Z 2025-03-14T06:12:15.4793825Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:15.4794761Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:15.4795958Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:15.4797151Z 2025-03-14T06:12:15.4797890Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:15.4798910Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:15.4799414Z 2025-03-14T06:12:15.4800130Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:15.4801152Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:15.4801689Z 2025-03-14T06:12:15.4802446Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = 
self.layernorm_embedding(hidden_states) 2025-03-14T06:12:15.4804722Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:15.4806497Z 2025-03-14T06:12:15.4807395Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:15.4808767Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:15.4809458Z 2025-03-14T06:12:15.4810151Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:15.4810974Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:15.4811320Z 2025-03-14T06:12:15.4812083Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:15.4813033Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:15.4813449Z 2025-03-14T06:12:15.4813576Z 2025-03-14T06:12:15.4813705Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:15.4815681Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:15.4817591Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:12:15.4818013Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:12:15.4818793Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:15.4819986Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:15.4821163Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:15.4822351Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:15.4823078Z 2025-03-14T06:12:15.4823337Z # No stacktrace found for following nodes 2025-03-14T06:12:15.4823925Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); 
_enter_autocast = None 2025-03-14T06:12:15.4824463Z 2025-03-14T06:12:15.4825264Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:104 in shift_tokens_right, code: shifted_input_ids = input_ids.new_zeros(input_ids.shape) 2025-03-14T06:12:15.4826303Z shifted_input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.new_zeros((1, 1024)) 2025-03-14T06:12:15.4826809Z 2025-03-14T06:12:15.4827593Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:105 in shift_tokens_right, code: shifted_input_ids[:, 1:] = input_ids[:, :-1].clone() 2025-03-14T06:12:15.4828745Z getitem: "i64[1, 1023][1024, 1]cuda:0" = l_cloned_inputs_labels_[(slice(None, None, None), slice(None, -1, None))]; l_cloned_inputs_labels_ = None 2025-03-14T06:12:15.4829441Z clone: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:12:15.4830109Z shifted_input_ids[(slice(None, None, None), slice(1, None, None))] = clone; setitem = shifted_input_ids; clone = setitem = None 2025-03-14T06:12:15.4830674Z 2025-03-14T06:12:15.4831449Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:106 in shift_tokens_right, code: shifted_input_ids[:, 0] = decoder_start_token_id 2025-03-14T06:12:15.4832481Z shifted_input_ids[(slice(None, None, None), 0)] = 2; setitem_1 = shifted_input_ids; setitem_1 = None 2025-03-14T06:12:15.4832961Z 2025-03-14T06:12:15.4833803Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:111 in shift_tokens_right, code: shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id) 2025-03-14T06:12:15.4834838Z eq: "b8[1, 1024][1024, 1]cuda:0" = shifted_input_ids == -100 2025-03-14T06:12:15.4835459Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = shifted_input_ids.masked_fill_(eq, 1); shifted_input_ids = eq = masked_fill_ = None 2025-03-14T06:12:15.4836005Z 2025-03-14T06:12:15.4836774Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:15.4837934Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:12:15.4838449Z 2025-03-14T06:12:15.4839229Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:15.4840902Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:15.4842172Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:15.4842708Z 2025-03-14T06:12:15.4843367Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:15.4844312Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:15.4844811Z 2025-03-14T06:12:15.4845438Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:15.4846309Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:15.4846720Z 2025-03-14T06:12:15.4847441Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:15.4848336Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:15.4849535Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:15.4850623Z 2025-03-14T06:12:15.4851354Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:15.4852372Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:15.4852875Z 2025-03-14T06:12:15.4853581Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:15.4854611Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:15.4855149Z 2025-03-14T06:12:15.4855931Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:15.4858227Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:15.4859952Z 2025-03-14T06:12:15.4860848Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:15.4862336Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:15.4863027Z 2025-03-14T06:12:15.4863720Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:15.4864550Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:15.4864897Z 2025-03-14T06:12:15.4865663Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:15.4866742Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 
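The recurring "skipping cudagraphs due to deterministic index put" messages are attributed by the tracebacks above to embedding lookups (`self.word_embeddings(input_ids)` in Longformer, `super().forward(positions + self.offset)` in BART): under deterministic mode their backward goes through an index_put that a CUDA graph cannot capture, so inductor runs those graphs with regular kernel launches instead. Accuracy is unaffected (the runs still print "pass"); only cudagraph replay is lost for those subgraphs. A rough, self-contained sketch of the same effect, under stated assumptions only (a CUDA device, deterministic algorithms enabled as the --accuracy run appears to do, cudagraphs requested via mode="reduce-overhead"); this is not the harness code:

import torch
import torch.nn as nn

# Assumption: the accuracy harness enables deterministic algorithms.
torch.use_deterministic_algorithms(True)

emb = nn.Embedding(50265, 1024).cuda()
compiled = torch.compile(emb, mode="reduce-overhead")  # asks inductor to use CUDA graphs

ids = torch.randint(0, 50265, (1, 1024), device="cuda")
out = compiled(ids)
# Per the traceback above, the deterministic embedding backward lowers to an index_put,
# which cannot be captured in a CUDA graph, so inductor logs the "skipping cudagraphs"
# message and falls back to eager launches for that graph.
out.sum().backward()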
2025-03-14T06:12:15.4867152Z 2025-03-14T06:12:16.4319815Z 2025-03-14T06:12:16.4320548Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:16.4322683Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:16.4324517Z l_labels_ = L_labels_ 2025-03-14T06:12:16.4324820Z l_input_ids_ = L_input_ids_ 2025-03-14T06:12:16.4325552Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:16.4326706Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:16.4327912Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:16.4329126Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:16.4329867Z 2025-03-14T06:12:16.4330696Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:104 in shift_tokens_right, code: shifted_input_ids = input_ids.new_zeros(input_ids.shape) 2025-03-14T06:12:16.4331706Z shifted_input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.new_zeros((1, 1024)) 2025-03-14T06:12:16.4332121Z 2025-03-14T06:12:16.4332905Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:105 in shift_tokens_right, code: shifted_input_ids[:, 1:] = input_ids[:, :-1].clone() 2025-03-14T06:12:16.4333970Z getitem: "i64[1, 1023][1024, 1]cuda:0" = l_labels_[(slice(None, None, None), slice(None, -1, None))]; l_labels_ = None 2025-03-14T06:12:16.4334600Z clone: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:12:16.4335260Z shifted_input_ids[(slice(None, None, None), slice(1, None, None))] = clone; setitem = shifted_input_ids; clone = setitem = None 2025-03-14T06:12:16.4335818Z 2025-03-14T06:12:16.4336892Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:106 in shift_tokens_right, code: shifted_input_ids[:, 0] = decoder_start_token_id 2025-03-14T06:12:16.4338526Z shifted_input_ids[(slice(None, None, None), 0)] = 2; setitem_1 = shifted_input_ids; setitem_1 = None 2025-03-14T06:12:16.4339013Z 2025-03-14T06:12:16.4340454Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:111 in shift_tokens_right, code: shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id) 2025-03-14T06:12:16.4341681Z eq: "b8[1, 1024][1024, 1]cuda:0" = shifted_input_ids == -100 2025-03-14T06:12:16.4342296Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = shifted_input_ids.masked_fill_(eq, 1); shifted_input_ids = eq = masked_fill_ = None 2025-03-14T06:12:16.4343066Z 2025-03-14T06:12:16.4343810Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:16.4344935Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:12:16.4345360Z 2025-03-14T06:12:16.4346286Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:16.4348268Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:16.4349549Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:16.4349996Z 2025-03-14T06:12:16.4350647Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:16.4351594Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:16.4352089Z 2025-03-14T06:12:16.4352716Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:16.4353532Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:16.4353941Z 2025-03-14T06:12:16.4354745Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:16.4355627Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:16.4356824Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:16.4357926Z 2025-03-14T06:12:16.4358663Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:16.4359677Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:16.4360182Z 2025-03-14T06:12:16.4361042Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:16.4362060Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:16.4362601Z 2025-03-14T06:12:16.4363352Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:16.4365734Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, 
l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:16.4367524Z 2025-03-14T06:12:16.4368688Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:16.4370186Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:16.4370879Z 2025-03-14T06:12:16.4371571Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:16.4384229Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:16.4384577Z 2025-03-14T06:12:16.4385363Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:16.4386317Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:16.4386740Z 2025-03-14T06:12:16.4386896Z 2025-03-14T06:12:16.4387016Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:16.4388833Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:16.4390614Z l_labels_ = L_labels_ 2025-03-14T06:12:16.4390895Z l_input_ids_ = L_input_ids_ 2025-03-14T06:12:16.4391588Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:16.4392725Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:16.4393904Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:16.4395209Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:16.4395943Z 2025-03-14T06:12:16.4396777Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:104 in shift_tokens_right, code: shifted_input_ids = input_ids.new_zeros(input_ids.shape) 2025-03-14T06:12:16.4397792Z shifted_input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.new_zeros((1, 1024)) 2025-03-14T06:12:16.4398215Z 2025-03-14T06:12:16.4398982Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:105 in shift_tokens_right, code: 
shifted_input_ids[:, 1:] = input_ids[:, :-1].clone() 2025-03-14T06:12:16.4400313Z getitem: "i64[1, 1023][1024, 1]cuda:0" = l_labels_[(slice(None, None, None), slice(None, -1, None))]; l_labels_ = None 2025-03-14T06:12:16.4400921Z clone: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:12:16.4401582Z shifted_input_ids[(slice(None, None, None), slice(1, None, None))] = clone; setitem = shifted_input_ids; clone = setitem = None 2025-03-14T06:12:16.4402138Z 2025-03-14T06:12:16.4402912Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:106 in shift_tokens_right, code: shifted_input_ids[:, 0] = decoder_start_token_id 2025-03-14T06:12:16.4403934Z shifted_input_ids[(slice(None, None, None), 0)] = 2; setitem_1 = shifted_input_ids; setitem_1 = None 2025-03-14T06:12:16.4404518Z 2025-03-14T06:12:16.4405362Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:111 in shift_tokens_right, code: shifted_input_ids.masked_fill_(shifted_input_ids == -100, pad_token_id) 2025-03-14T06:12:16.4406340Z eq: "b8[1, 1024][1024, 1]cuda:0" = shifted_input_ids == -100 2025-03-14T06:12:16.4406988Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = shifted_input_ids.masked_fill_(eq, 1); shifted_input_ids = eq = masked_fill_ = None 2025-03-14T06:12:16.4407557Z 2025-03-14T06:12:16.4408300Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:16.4409249Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:12:16.4409680Z 2025-03-14T06:12:16.4410469Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:16.4412151Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:16.4413430Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:16.4413884Z 2025-03-14T06:12:16.4414538Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:16.4415474Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:16.4415982Z 2025-03-14T06:12:16.4416655Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:16.4417476Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:16.4417888Z 2025-03-14T06:12:16.4418614Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:16.4419496Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:16.4420698Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, 
l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:16.4421798Z 2025-03-14T06:12:16.4422529Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:16.4423635Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:16.4424141Z 2025-03-14T06:12:16.4424848Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:16.4425873Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:16.4426403Z 2025-03-14T06:12:16.4427216Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:16.4429655Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:16.4431390Z 2025-03-14T06:12:16.4432287Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:16.4433657Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:16.4434432Z 2025-03-14T06:12:16.4435135Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:16.4435966Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:16.4436314Z 2025-03-14T06:12:16.4437130Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:16.4438087Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:16.4438499Z 2025-03-14T06:12:17.0542372Z 2025-03-14T06:12:17.0542838Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:17.0544928Z def forward(self, L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:17.0546652Z 
l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:12:17.0547048Z l_input_ids_ = L_input_ids_ 2025-03-14T06:12:17.0547686Z l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:17.0548642Z l_self_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:17.0549648Z l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:17.0550684Z l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:17.0551336Z 2025-03-14T06:12:17.0552469Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1145 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:12:17.0553440Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:12:17.0553875Z 2025-03-14T06:12:17.0554759Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1152 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:12:17.0556359Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:17.0557766Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:17.0558224Z 2025-03-14T06:12:17.0558880Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:17.0559835Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:17.0560334Z 2025-03-14T06:12:17.0560969Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:17.0561789Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:17.0562212Z 2025-03-14T06:12:17.0562940Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:17.0563830Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:17.0565099Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:17.0566115Z 2025-03-14T06:12:17.0566848Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1155 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:12:17.0568139Z embed_pos_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:12:17.0568661Z 2025-03-14T06:12:17.0569393Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1157 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:12:17.0570423Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:12:17.0570963Z 2025-03-14T06:12:17.0571726Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1158 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:17.0573836Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:17.0575393Z 2025-03-14T06:12:17.0576438Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1159 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:17.0577806Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:17.0578490Z 2025-03-14T06:12:17.0579189Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1191 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:17.0580015Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:17.0580512Z 2025-03-14T06:12:17.0581285Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1192 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:12:17.0582238Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:17.0582647Z 2025-03-14T06:12:17.1953705Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:12:17.1954789Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py", line 135, in forward 2025-03-14T06:12:17.1955486Z return super().forward(positions + self.offset) 2025-03-14T06:12:17.1955744Z 2025-03-14T06:12:18.8975970Z 2025-03-14T06:12:18.8976651Z class GraphModule(torch.nn.Module): 2025-03-14T06:12:18.8979648Z def forward(self, dict_getitem_L_stack0_list_dict_keys_L_stack0_0_: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0", L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:12:18.8983288Z dict_getitem_l_stack0_list_dict_keys_l_stack0_0_ = dict_getitem_L_stack0_list_dict_keys_L_stack0_0_ 2025-03-14T06:12:18.8984212Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:12:18.8985283Z l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:12:18.8986888Z l_self_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:12:18.8988561Z l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:12:18.8990389Z l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:12:18.8991514Z 2025-03-14T06:12:18.8992769Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1364 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:12:18.8994537Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_decoder_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:12:18.8995303Z 2025-03-14T06:12:18.8996638Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1375 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:12:18.8999505Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(l_decoder_input_ids_, l_self_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_decoder_input_ids_ = l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:12:18.9002188Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:12:18.9003001Z 2025-03-14T06:12:18.9004133Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:131 in forward, code: positions = torch.arange( 2025-03-14T06:12:18.9005782Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:12:18.9006658Z 2025-03-14T06:12:18.9007809Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:133 in forward, code: ).expand(bsz, -1) 2025-03-14T06:12:18.9009504Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:12:18.9010233Z 
2025-03-14T06:12:18.9011517Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:135 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:12:18.9013080Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:12:18.9015070Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:12:18.9016888Z 2025-03-14T06:12:18.9018240Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1416 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:12:18.9020101Z positions_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:12:18.9021044Z 2025-03-14T06:12:18.9022293Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1418 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:12:18.9024119Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:12:18.9025076Z 2025-03-14T06:12:18.9026408Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1419 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:12:18.9030230Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:12:18.9033168Z 2025-03-14T06:12:18.9034889Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1421 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:12:18.9037328Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:12:18.9038567Z 2025-03-14T06:12:18.9039787Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1450 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:12:18.9041264Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:12:18.9041866Z 2025-03-14T06:12:18.9043084Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py:1451 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:12:18.9044840Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:12:18.9045563Z 2025-03-14T06:12:39.3296805Z Compilation time (from dynamo_timed): 8.400414362 2025-03-14T06:12:39.3348831Z pass 2025-03-14T06:12:39.5384266Z TIMING: entire_frame_compile:6.48247 gc:0.00906 _recursive_pre_grad_passes:0.00646 _recursive_joint_graph_passes:0.3597 inductor_compile:3.976 backend_compile:4.75319 
_recursive_post_grad_passes:0.17626 async_compile.precompile:0.12035 async_compile.wait:0.76984 code_gen:2.36752 cudagraphify.get_container:0.23431 pad_mm_benchmark:0.03158 entire_backward_compile:1.91794 CUDAGraphNode.record:13.20247 total_wall_time:8.40041 2025-03-14T06:12:39.5386644Z STATS: call_* op count: 93 | FakeTensorMode.__torch_dispatch__:8258 | FakeTensor.__torch_dispatch__:1327 | ProxyTorchDispatchMode.__torch_dispatch__:3855 2025-03-14T06:12:39.5387452Z Dynamo produced 7 graphs covering 93 ops with 8 graph breaks (5 unique) 2025-03-14T06:12:45.0765832Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:12:45.0767030Z warnings.warn( 2025-03-14T06:12:45.2962326Z 2025-03-14T06:12:47.3732905Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:12:47.3733458Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:12:47.3734009Z cuda train BertForMaskedLM 2025-03-14T06:13:20.7650364Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:13:20.7651209Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:13:20.7651943Z pred = mod(**cloned_inputs) 2025-03-14T06:13:20.7652596Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 1360, in forward 2025-03-14T06:13:20.7653247Z outputs = self.bert( 2025-03-14T06:13:20.7653848Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 1006, in forward 2025-03-14T06:13:20.7654504Z embedding_output = self.embeddings( 2025-03-14T06:13:20.7655149Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 232, in forward 2025-03-14T06:13:20.7655827Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:13:20.7656082Z 2025-03-14T06:13:20.9407817Z W0314 06:13:20.939000 15548 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:14:01.4382437Z Compilation time (from dynamo_timed): 67.46135753 2025-03-14T06:14:01.4406238Z pass 2025-03-14T06:14:01.5041731Z TIMING: entire_frame_compile:59.20984 gc:0.0023 _recursive_pre_grad_passes:0.03352 pad_mm_benchmark:0.40359 _recursive_joint_graph_passes:1.76248 _recursive_post_grad_passes:0.62989 async_compile.wait:2.6656 code_gen:17.93094 inductor_compile:32.48826 backend_compile:46.26632 cudagraphify.get_container:0.38536 entire_backward_compile:8.25152 CUDAGraphNode.record:1.69169 total_wall_time:67.46136 2025-03-14T06:14:01.5043667Z STATS: call_* op count: 1401 | FakeTensorMode.__torch_dispatch__:62920 | FakeTensor.__torch_dispatch__:14277 | ProxyTorchDispatchMode.__torch_dispatch__:28580 2025-03-14T06:14:01.5044504Z Dynamo produced 2 graphs covering 1401 ops with 5 graph breaks (4 unique) 2025-03-14T06:14:09.4140909Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
2025-03-14T06:14:09.4142189Z warnings.warn( 2025-03-14T06:14:09.9371126Z 2025-03-14T06:14:11.7980660Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:14:11.7981594Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:14:11.7981959Z cuda train BertForQuestionAnswering 2025-03-14T06:14:44.0100313Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:14:44.0101786Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:14:44.0102900Z pred = mod(**cloned_inputs) 2025-03-14T06:14:44.0103875Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 1846, in forward 2025-03-14T06:14:44.0104679Z outputs = self.bert( 2025-03-14T06:14:44.0105292Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 1006, in forward 2025-03-14T06:14:44.0106401Z embedding_output = self.embeddings( 2025-03-14T06:14:44.0107069Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bert/modeling_bert.py", line 232, in forward 2025-03-14T06:14:44.0107749Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:14:44.0108001Z 2025-03-14T06:14:44.1854458Z W0314 06:14:44.184000 16066 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:15:23.3283233Z Compilation time (from dynamo_timed): 65.28162853399999 2025-03-14T06:15:23.3307895Z pass 2025-03-14T06:15:23.3971180Z TIMING: entire_frame_compile:57.65898 gc:0.00369 _recursive_pre_grad_passes:0.0351 pad_mm_benchmark:0.37723 _recursive_joint_graph_passes:1.71356 _recursive_post_grad_passes:0.91392 async_compile.wait:0.89395 code_gen:15.79392 inductor_compile:30.28378 backend_compile:44.76105 cudagraphify.get_container:0.3867 entire_backward_compile:7.62265 CUDAGraphNode.record:1.70561 total_wall_time:65.28163 2025-03-14T06:15:23.3973284Z STATS: call_* op count: 1393 | FakeTensorMode.__torch_dispatch__:62425 | FakeTensor.__torch_dispatch__:14120 | ProxyTorchDispatchMode.__torch_dispatch__:28376 2025-03-14T06:15:23.3974207Z Dynamo produced 2 graphs covering 1393 ops with 5 graph breaks (4 unique) 2025-03-14T06:15:31.2815406Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
2025-03-14T06:15:31.2816591Z warnings.warn( 2025-03-14T06:15:31.5335622Z 2025-03-14T06:15:59.8405720Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:15:59.8406088Z loading model: 0it [00:28, ?it/s] 2025-03-14T06:15:59.8406428Z cuda train BlenderbotForCausalLM 2025-03-14T06:15:59.8413402Z Traceback (most recent call last): 2025-03-14T06:15:59.8414079Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 1913, in validate_model 2025-03-14T06:15:59.8414644Z self.model_iter_fn(model, example_inputs) 2025-03-14T06:15:59.8415294Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in forward_and_backward_pass 2025-03-14T06:15:59.8415905Z pred = mod(**cloned_inputs) 2025-03-14T06:15:59.8416523Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:15:59.8417163Z return self._call_impl(*args, **kwargs) 2025-03-14T06:15:59.8417764Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:15:59.8418360Z return forward_call(*args, **kwargs) 2025-03-14T06:15:59.8419068Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot/modeling_blenderbot.py", line 1531, in forward 2025-03-14T06:15:59.8419781Z outputs = self.model.decoder( 2025-03-14T06:15:59.8420396Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:15:59.8421507Z return self._call_impl(*args, **kwargs) 2025-03-14T06:15:59.8422116Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:15:59.8422710Z return forward_call(*args, **kwargs) 2025-03-14T06:15:59.8423461Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot/modeling_blenderbot.py", line 997, in forward 2025-03-14T06:15:59.8424162Z layer_outputs = decoder_layer( 2025-03-14T06:15:59.8424773Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:15:59.8425404Z return self._call_impl(*args, **kwargs) 2025-03-14T06:15:59.8426179Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:15:59.8426776Z return forward_call(*args, **kwargs) 2025-03-14T06:15:59.8427482Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot/modeling_blenderbot.py", line 397, in forward 2025-03-14T06:15:59.8428286Z hidden_states, self_attn_weights, present_key_value = self.self_attn( 2025-03-14T06:15:59.8429020Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:15:59.8429658Z return self._call_impl(*args, **kwargs) 2025-03-14T06:15:59.8430253Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:15:59.8430846Z return forward_call(*args, **kwargs) 2025-03-14T06:15:59.8431543Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot/modeling_blenderbot.py", line 152, in forward 2025-03-14T06:15:59.8432309Z query_states = self.q_proj(hidden_states) * self.scaling 2025-03-14T06:15:59.8432993Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 
2025-03-14T06:15:59.8433635Z return self._call_impl(*args, **kwargs) 2025-03-14T06:15:59.8434299Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:15:59.8434895Z return forward_call(*args, **kwargs) 2025-03-14T06:15:59.8435463Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/linear.py", line 125, in forward 2025-03-14T06:15:59.8436075Z return F.linear(input, self.weight, self.bias) 2025-03-14T06:15:59.8438069Z torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 14.00 MiB. GPU 0 has a total capacity of 21.98 GiB of which 6.44 MiB is free. Process 72171 has 21.96 GiB memory in use. Of the allocated memory 21.61 GiB is allocated by PyTorch, and 27.77 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables) 2025-03-14T06:15:59.8439884Z 2025-03-14T06:15:59.8440109Z The above exception was the direct cause of the following exception: 2025-03-14T06:15:59.8440425Z 2025-03-14T06:15:59.8440554Z Traceback (most recent call last): 2025-03-14T06:15:59.8441037Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 3991, in run 2025-03-14T06:15:59.8441529Z ) = runner.load_model( 2025-03-14T06:15:59.8442029Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 461, in load_model 2025-03-14T06:15:59.8442592Z self.validate_model(model, example_inputs) 2025-03-14T06:15:59.8443149Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 1915, in validate_model 2025-03-14T06:15:59.8443722Z raise RuntimeError("Eager run failed") from e 2025-03-14T06:15:59.8444083Z RuntimeError: Eager run failed 2025-03-14T06:15:59.8444279Z 2025-03-14T06:15:59.8444387Z eager_fail_to_run 2025-03-14T06:16:04.1488958Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:16:04.1490138Z warnings.warn( 2025-03-14T06:16:04.3814020Z 2025-03-14T06:16:05.8849131Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:16:05.8857190Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:16:05.8857572Z cuda train BlenderbotSmallForCausalLM 2025-03-14T06:16:05.9060166Z WARNING:common:fp64 golden ref were not generated for BlenderbotSmallForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:16:07.5542602Z 2025-03-14T06:16:07.5542970Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:07.5545320Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:07.5547178Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:16:07.5547962Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:07.5549090Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:07.5550286Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:07.5551484Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:07.5552222Z 2025-03-14T06:16:07.5552501Z # No stacktrace found for following nodes 2025-03-14T06:16:07.5553089Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:16:07.5553634Z 2025-03-14T06:16:07.5554571Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:919 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:07.5555706Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:16:07.5556216Z 2025-03-14T06:16:07.5557099Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:929 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:07.5558844Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:07.5560085Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:07.5560521Z 2025-03-14T06:16:07.5561457Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:16:07.5562988Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:16:07.5563864Z 2025-03-14T06:16:07.5564635Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
2025-03-14T06:16:07.5565655Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:16:07.5566094Z 2025-03-14T06:16:07.5566891Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:16:07.5568047Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:16:07.5568663Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:16:07.5569133Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:16:07.5569701Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:16:07.5570197Z 2025-03-14T06:16:07.5570849Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:16:07.5571676Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:16:07.5572073Z 2025-03-14T06:16:07.5572909Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:16:07.5574086Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:16:07.5574893Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:16:07.5575420Z 2025-03-14T06:16:07.5576175Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:07.5577221Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:07.5577731Z 2025-03-14T06:16:07.5578506Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:07.5580203Z positions_1: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:07.5581321Z 2025-03-14T06:16:07.5582175Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:946 in forward, code: inputs_embeds = self.layernorm_embedding(inputs_embeds) 2025-03-14T06:16:07.5584507Z inputs_embeds_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(inputs_embeds, (512,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); inputs_embeds = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:07.5586254Z 2025-03-14T06:16:07.5587185Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:947 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:16:07.5588308Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds_1 + positions_1; inputs_embeds_1 = positions_1 = None 2025-03-14T06:16:07.5588841Z 2025-03-14T06:16:07.5589822Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:949 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:07.5591252Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:16:07.5591992Z 2025-03-14T06:16:07.5592781Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:977 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:07.5593706Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:07.5594054Z 2025-03-14T06:16:07.5594933Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:978 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:16:07.5595940Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:16:07.5596362Z 2025-03-14T06:16:07.5596487Z 2025-03-14T06:16:07.5596620Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:07.5598350Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:07.5600112Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:16:07.5600890Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:07.5602012Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:07.5603181Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:07.5604371Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:07.5605100Z 2025-03-14T06:16:07.5605382Z # No stacktrace found for following nodes 2025-03-14T06:16:07.5606004Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:16:07.5606543Z 2025-03-14T06:16:07.5607370Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:919 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:07.5608476Z input_ids: 
"i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:16:07.5608981Z 2025-03-14T06:16:07.5609862Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:929 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:07.5611687Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:07.5612921Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:07.5613363Z 2025-03-14T06:16:07.5614184Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:16:07.5615294Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:16:07.5615936Z 2025-03-14T06:16:07.5616692Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:16:07.5617657Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:16:07.5618102Z 2025-03-14T06:16:07.5618901Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:16:07.5619796Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:16:07.5620189Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:16:07.5620658Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:16:07.5621221Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:16:07.5621675Z 2025-03-14T06:16:07.5622332Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:16:07.5623158Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:16:07.5623556Z 2025-03-14T06:16:07.5624390Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:16:07.5625560Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:16:07.5626365Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:16:07.5626892Z 2025-03-14T06:16:07.5627653Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:07.5628700Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device 
= device(type='cuda', index=0)) 2025-03-14T06:16:07.5629210Z 2025-03-14T06:16:07.5629984Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:07.5631671Z positions_1: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:07.5632798Z 2025-03-14T06:16:07.5633771Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:946 in forward, code: inputs_embeds = self.layernorm_embedding(inputs_embeds) 2025-03-14T06:16:07.5636214Z inputs_embeds_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(inputs_embeds, (512,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); inputs_embeds = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:07.5637909Z 2025-03-14T06:16:07.5638796Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:947 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:16:07.5639918Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds_1 + positions_1; inputs_embeds_1 = positions_1 = None 2025-03-14T06:16:07.5640448Z 2025-03-14T06:16:07.5641434Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:949 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:07.5642864Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:16:07.5643525Z 2025-03-14T06:16:07.5644310Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:977 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:07.5645234Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:07.5645587Z 2025-03-14T06:16:07.5646392Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:978 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:16:07.5647394Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:16:07.5647817Z 2025-03-14T06:16:08.5045291Z 2025-03-14T06:16:08.5045998Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:08.5047755Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", 
L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:08.5049477Z l_input_ids_ = L_input_ids_ 2025-03-14T06:16:08.5050235Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:08.5051380Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:08.5052576Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:08.5053779Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:08.5054521Z 2025-03-14T06:16:08.5055772Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:919 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:08.5056815Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:16:08.5057237Z 2025-03-14T06:16:08.5058118Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:929 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:08.5059869Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:08.5061264Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:08.5061701Z 2025-03-14T06:16:08.5062526Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:16:08.5063634Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:16:08.5064139Z 2025-03-14T06:16:08.5064879Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:16:08.5065849Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:16:08.5066317Z 2025-03-14T06:16:08.5067133Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:16:08.5068315Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:16:08.5068713Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:16:08.5069181Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:16:08.5069731Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:16:08.5070181Z 2025-03-14T06:16:08.5070838Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:16:08.5071655Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:16:08.5072055Z 2025-03-14T06:16:08.5072886Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:16:08.5074051Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:16:08.5074919Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:16:08.5075440Z 2025-03-14T06:16:08.5076182Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:08.5077279Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:08.5077784Z 2025-03-14T06:16:08.5078687Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:08.5080377Z positions_1: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:08.5081498Z 2025-03-14T06:16:08.5082341Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:946 in forward, code: inputs_embeds = self.layernorm_embedding(inputs_embeds) 2025-03-14T06:16:08.5084697Z inputs_embeds_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(inputs_embeds, (512,), l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); inputs_embeds = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:08.5086510Z 2025-03-14T06:16:08.5087306Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:947 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:16:08.5088413Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds_1 + positions_1; inputs_embeds_1 = positions_1 = None 2025-03-14T06:16:08.5088938Z 2025-03-14T06:16:08.5089908Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:949 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:08.5091336Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:16:08.5091996Z 2025-03-14T06:16:08.5092776Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:977 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:08.5093689Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:08.5094035Z 2025-03-14T06:16:08.5094827Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:978 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:16:08.5096167Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:16:08.5096596Z 2025-03-14T06:16:14.0796614Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:16:14.0797563Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 90, in forward 2025-03-14T06:16:14.0798315Z return super().forward(positions) 2025-03-14T06:16:14.0798534Z 2025-03-14T06:16:18.4360388Z Compilation time (from dynamo_timed): 6.575205064 2025-03-14T06:16:18.4372938Z pass 2025-03-14T06:16:18.4645884Z TIMING: entire_frame_compile:5.09827 gc:0.0058 _recursive_pre_grad_passes:0.00591 _recursive_joint_graph_passes:0.43507 inductor_compile:3.48019 backend_compile:4.06554 _recursive_post_grad_passes:0.11136 async_compile.precompile:0.22913 async_compile.wait:1.30055 code_gen:2.52015 cudagraphify.get_container:0.17066 pad_mm_benchmark:0.20119 entire_backward_compile:1.47694 CUDAGraphNode.record:3.94869 total_wall_time:6.57521 2025-03-14T06:16:18.4649607Z STATS: call_* op count: 58 | FakeTensorMode.__torch_dispatch__:4471 | FakeTensor.__torch_dispatch__:672 | ProxyTorchDispatchMode.__torch_dispatch__:1914 2025-03-14T06:16:18.4650420Z Dynamo produced 6 graphs covering 58 ops with 6 graph breaks (5 unique) 2025-03-14T06:16:23.8106890Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:16:23.8108087Z warnings.warn( 2025-03-14T06:16:24.1643298Z 2025-03-14T06:16:26.0946698Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:16:26.0947255Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:16:26.0947874Z cuda train BlenderbotSmallForConditionalGeneration 2025-03-14T06:16:26.1664761Z WARNING:common:fp64 golden ref were not generated for BlenderbotSmallForConditionalGeneration. 
Setting accuracy check to cosine 2025-03-14T06:16:28.1768548Z 2025-03-14T06:16:28.1769246Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:28.1771369Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:28.1773487Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:16:28.1774011Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:16:28.1774521Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:16:28.1775296Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:28.1776428Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:28.1777607Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:28.1778797Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:28.1779527Z 2025-03-14T06:16:28.1779796Z # No stacktrace found for following nodes 2025-03-14T06:16:28.1780380Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:16:28.1780925Z 2025-03-14T06:16:28.1781781Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:28.1783037Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:16:28.1783627Z 2025-03-14T06:16:28.1784522Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:28.1786275Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:28.1787521Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:28.1787963Z 2025-03-14T06:16:28.1789079Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:28.1790136Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 
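The "Setting accuracy check to cosine" fallback above is taken when no fp64 golden reference could be generated for the model; eager and compiled outputs are then compared by direction rather than by exact value. A minimal sketch of what such a check can look like (the function name and the 0.99 threshold are illustrative, not the benchmark harness's actual API):

    import torch

    def cosine_accuracy_ok(ref: torch.Tensor, res: torch.Tensor, threshold: float = 0.99) -> bool:
        # Compare the eager ("ref") and compiled ("res") outputs by cosine similarity
        # over the flattened values, which tolerates small elementwise drift.
        ref = ref.detach().flatten().float()
        res = res.detach().flatten().float()
        cos = torch.nn.functional.cosine_similarity(ref, res, dim=0)
        return bool(cos >= threshold)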
2025-03-14T06:16:28.1790650Z 2025-03-14T06:16:28.1791421Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:28.1793111Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:28.1794486Z 2025-03-14T06:16:28.1795295Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:28.1796384Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:28.1796888Z 2025-03-14T06:16:28.1797746Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:28.1800258Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:28.1802006Z 2025-03-14T06:16:28.1802998Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:28.1804434Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:28.1805102Z 2025-03-14T06:16:28.1805896Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:28.1806813Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:28.1807160Z 2025-03-14T06:16:28.1808020Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:28.1809064Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:28.1809475Z 2025-03-14T06:16:28.1809633Z 2025-03-14T06:16:28.1809759Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:28.1811863Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", 
L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:28.1813946Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:16:28.1814429Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:16:28.1814936Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:16:28.1815708Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:28.1816830Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:28.1818122Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:28.1819326Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:28.1820058Z 2025-03-14T06:16:28.1820323Z # No stacktrace found for following nodes 2025-03-14T06:16:28.1820907Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:16:28.1821449Z 2025-03-14T06:16:28.1822278Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:28.1823443Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:16:28.1823949Z 2025-03-14T06:16:28.1824834Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:28.1826579Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:28.1827813Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:28.1828255Z 2025-03-14T06:16:28.1829012Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:28.1830066Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:28.1830575Z 2025-03-14T06:16:28.1831354Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:28.1833087Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 
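The `class GraphModule` listings running through this log are Dynamo's own dumps of the traced code, with `# File: ...` provenance comments pointing back to the original model source. A hedged sketch of how listings like these can be reproduced locally (the toy function is ours, assuming a recent torch._logging API):

    import torch
    import torch._logging

    # Roughly equivalent to running with TORCH_LOGS="graph_code": Dynamo then
    # prints each captured GraphModule with per-statement source annotations.
    torch._logging.set_logs(graph_code=True)

    def toy(x):
        return torch.nn.functional.dropout(x + 1.0, p=0.1, training=True)

    compiled = torch.compile(toy)
    compiled(torch.randn(1, 128, 512))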
2025-03-14T06:16:28.1834200Z 2025-03-14T06:16:28.1835070Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:28.1836166Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:28.1836683Z 2025-03-14T06:16:28.1847132Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:28.1849508Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:28.1851282Z 2025-03-14T06:16:28.1852329Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:28.1853769Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:28.1854443Z 2025-03-14T06:16:28.1855224Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:28.1856138Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:28.1856487Z 2025-03-14T06:16:28.1857343Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:28.1858391Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:28.1858806Z 2025-03-14T06:16:28.1858938Z 2025-03-14T06:16:28.1859062Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:28.1861069Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:28.1863162Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:16:28.1863642Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:16:28.1864139Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:16:28.1864911Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = 
L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:28.1866045Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:28.1867224Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:28.1868732Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:28.1869476Z 2025-03-14T06:16:28.1869742Z # No stacktrace found for following nodes 2025-03-14T06:16:28.1870512Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:16:28.1871063Z 2025-03-14T06:16:28.1871902Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:28.1873023Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:16:28.1873529Z 2025-03-14T06:16:28.1874482Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:28.1876345Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:28.1877588Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:28.1878027Z 2025-03-14T06:16:28.1878780Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:28.1879819Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:28.1880323Z 2025-03-14T06:16:28.1881092Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:28.1882768Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:28.1883874Z 2025-03-14T06:16:28.1884675Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:28.1885757Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:28.1886259Z 2025-03-14T06:16:28.1887111Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: 
hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:28.1889440Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:28.1891122Z 2025-03-14T06:16:28.1892099Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:28.1893535Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:28.1894201Z 2025-03-14T06:16:28.1895074Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:28.1895992Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:28.1896337Z 2025-03-14T06:16:28.1897192Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:28.1898230Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:28.1898638Z 2025-03-14T06:16:29.1349363Z 2025-03-14T06:16:29.1349989Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:29.1351959Z def forward(self, L_labels_: "i64[1, 128][128, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:29.1353883Z l_labels_ = L_labels_ 2025-03-14T06:16:29.1354212Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:16:29.1354692Z l_input_ids_ = L_input_ids_ 2025-03-14T06:16:29.1355402Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:29.1356553Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:29.1357747Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:29.1358946Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:29.1359679Z 2025-03-14T06:16:29.1360819Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:29.1361954Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:16:29.1362405Z 2025-03-14T06:16:29.1363308Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:29.1365060Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:29.1366305Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:29.1366731Z 2025-03-14T06:16:29.1367487Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:29.1368847Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:29.1369354Z 2025-03-14T06:16:29.1370558Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:29.1372247Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:29.1373367Z 2025-03-14T06:16:29.1374168Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:29.1375392Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:29.1375905Z 2025-03-14T06:16:29.1376752Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:29.1379103Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:29.1380803Z 2025-03-14T06:16:29.1381793Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:29.1383284Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = 
torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:29.1383948Z 2025-03-14T06:16:29.1384728Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:29.1385642Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:29.1385989Z 2025-03-14T06:16:29.1386840Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:29.1387881Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:29.1388287Z 2025-03-14T06:16:29.1388428Z 2025-03-14T06:16:29.1388559Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:29.1390631Z def forward(self, L_labels_: "i64[1, 128][128, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:29.1392478Z l_labels_ = L_labels_ 2025-03-14T06:16:29.1392854Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:16:29.1393200Z l_input_ids_ = L_input_ids_ 2025-03-14T06:16:29.1393909Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:29.1395213Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:29.1396401Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:29.1397600Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:29.1398334Z 2025-03-14T06:16:29.1399159Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:29.1400257Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:16:29.1400680Z 2025-03-14T06:16:29.1401553Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:29.1403302Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:29.1404537Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = 
embedding * 1.0; embedding = None 2025-03-14T06:16:29.1404980Z 2025-03-14T06:16:29.1405732Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:29.1406773Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:29.1407279Z 2025-03-14T06:16:29.1408051Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:29.1409722Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:29.1410841Z 2025-03-14T06:16:29.1411640Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:29.1412733Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:29.1413237Z 2025-03-14T06:16:29.1414078Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:29.1416420Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:29.1418133Z 2025-03-14T06:16:29.1419191Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:29.1420631Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:29.1421295Z 2025-03-14T06:16:29.1422082Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:29.1423123Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:29.1423471Z 2025-03-14T06:16:29.1424321Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:29.1425367Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:29.1425768Z 2025-03-14T06:16:29.2258066Z 2025-03-14T06:16:29.2259391Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:29.2262812Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", 
L_self_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_self_modules_encoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:29.2264373Z l_input_ids_ = L_input_ids_ 2025-03-14T06:16:29.2265006Z l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:29.2265970Z l_self_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:29.2266980Z l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:29.2268353Z l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:29.2269002Z 2025-03-14T06:16:29.2269850Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:714 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:29.2270884Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:16:29.2271303Z 2025-03-14T06:16:29.2272191Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:721 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:29.2273849Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:29.2275083Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:29.2275521Z 2025-03-14T06:16:29.2276280Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:29.2277329Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:29.2277840Z 2025-03-14T06:16:29.2278991Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:29.2280584Z embed_pos: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:29.2281609Z 2025-03-14T06:16:29.2282414Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:725 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:16:29.2283627Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:16:29.2284129Z 2025-03-14T06:16:29.2284982Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:726 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:16:29.2287146Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (512,), l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:29.2288665Z 2025-03-14T06:16:29.2289654Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:727 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:29.2291101Z hidden_states_2: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:16:29.2291771Z 2025-03-14T06:16:29.2292602Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:750 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:29.2293529Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:29.2293871Z 2025-03-14T06:16:29.2294727Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:751 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:16:29.2295769Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:16:29.2296181Z 2025-03-14T06:16:29.3838722Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:16:29.3839827Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 90, in forward 2025-03-14T06:16:29.3840584Z return super().forward(positions) 2025-03-14T06:16:29.3840811Z 2025-03-14T06:16:31.5591748Z 2025-03-14T06:16:31.5592176Z class GraphModule(torch.nn.Module): 2025-03-14T06:16:31.5594008Z def forward(self, dict_getitem_L_stack0_list_dict_keys_L_stack0_0_: "f32[1, 128, 512][65536, 512, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 512][512, 1]cuda:0", L_self_modules_decoder_modules_embed_positions_parameters_weight_: "f32[512, 512][512, 1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[512][1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[512][1]cuda:0"): 2025-03-14T06:16:31.5596406Z dict_getitem_l_stack0_list_dict_keys_l_stack0_0_ = dict_getitem_L_stack0_list_dict_keys_L_stack0_0_ 2025-03-14T06:16:31.5596955Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:16:31.5597610Z l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:16:31.5598558Z l_self_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:16:31.5599561Z l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:16:31.5600714Z l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:16:31.5601356Z 2025-03-14T06:16:31.5602214Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:919 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:16:31.5603293Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_decoder_input_ids_.view(-1, 128); l_decoder_input_ids_ = None 2025-03-14T06:16:31.5603761Z 2025-03-14T06:16:31.5604641Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:929 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:16:31.5606324Z embedding: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:16:31.5607488Z inputs_embeds: "f32[1, 128, 512][65536, 512, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:16:31.5607927Z 2025-03-14T06:16:31.5608752Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:16:31.5609856Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:16:31.5610363Z 2025-03-14T06:16:31.5611113Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
2025-03-14T06:16:31.5612075Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:16:31.5612517Z 2025-03-14T06:16:31.5613372Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:16:31.5614265Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:16:31.5614658Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:16:31.5615129Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:16:31.5615686Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:16:31.5616137Z 2025-03-14T06:16:31.5616789Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:16:31.5617618Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:16:31.5618015Z 2025-03-14T06:16:31.5618931Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:16:31.5620103Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:16:31.5620907Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:16:31.5621430Z 2025-03-14T06:16:31.5622185Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:87 in forward, code: positions = torch.arange( 2025-03-14T06:16:31.5623345Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:16:31.5623873Z 2025-03-14T06:16:31.5624651Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:90 in forward, code: return super().forward(positions) 2025-03-14T06:16:31.5626252Z positions_1: "f32[128, 512][512, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:16:31.5627294Z 2025-03-14T06:16:31.5628151Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:946 in forward, code: inputs_embeds = self.layernorm_embedding(inputs_embeds) 2025-03-14T06:16:31.5630328Z inputs_embeds_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.layer_norm(inputs_embeds, (512,), l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); inputs_embeds = l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:16:31.5631858Z 2025-03-14T06:16:31.5632661Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:947 in forward, 
code: hidden_states = inputs_embeds + positions 2025-03-14T06:16:31.5633778Z hidden_states: "f32[1, 128, 512][65536, 512, 1]cuda:0" = inputs_embeds_1 + positions_1; inputs_embeds_1 = positions_1 = None 2025-03-14T06:16:31.5634352Z 2025-03-14T06:16:31.5635341Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:949 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:16:31.5636779Z hidden_states_1: "f32[1, 128, 512][65536, 512, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:16:31.5637448Z 2025-03-14T06:16:31.5638232Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:977 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:16:31.5639152Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:16:31.5639501Z 2025-03-14T06:16:31.5640298Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py:978 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:16:31.5641300Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:16:31.5641724Z 2025-03-14T06:16:47.4105856Z Compilation time (from dynamo_timed): 9.047933538999999 2025-03-14T06:16:47.4125549Z pass 2025-03-14T06:16:47.4479713Z TIMING: entire_frame_compile:7.02142 gc:0.00944 _recursive_pre_grad_passes:0.00701 _recursive_joint_graph_passes:0.40858 inductor_compile:4.49395 backend_compile:5.42853 async_compile.precompile:0.03602 async_compile.wait:0.92003 cudagraphify.get_container:0.19801 pad_mm_benchmark:0.04233 _recursive_post_grad_passes:0.21422 code_gen:2.70394 entire_backward_compile:2.02651 CUDAGraphNode.record:8.9954 total_wall_time:9.04793 2025-03-14T06:16:47.4481710Z STATS: call_* op count: 121 | FakeTensorMode.__torch_dispatch__:9099 | FakeTensor.__torch_dispatch__:1409 | ProxyTorchDispatchMode.__torch_dispatch__:4270 2025-03-14T06:16:47.4482528Z Dynamo produced 7 graphs covering 121 ops with 8 graph breaks (5 unique) 2025-03-14T06:16:53.0327837Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:16:53.0329670Z warnings.warn( 2025-03-14T06:16:53.4431340Z 2025-03-14T06:16:55.5688061Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:16:55.5688496Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:16:55.5688833Z cuda train CamemBert 2025-03-14T06:17:28.4428866Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:17:28.4430214Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:17:28.4430919Z pred = mod(**cloned_inputs) 2025-03-14T06:17:28.4431608Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/camembert/modeling_camembert.py", line 979, in forward 2025-03-14T06:17:28.4432340Z outputs = self.roberta( 2025-03-14T06:17:28.4432995Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/camembert/modeling_camembert.py", line 881, in forward 2025-03-14T06:17:28.4433697Z embedding_output = self.embeddings( 2025-03-14T06:17:28.4434475Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/camembert/modeling_camembert.py", line 139, in forward 2025-03-14T06:17:28.4435200Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:17:28.4435456Z 2025-03-14T06:17:28.6180342Z W0314 06:17:28.617000 17275 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:18:09.3822677Z Compilation time (from dynamo_timed): 67.20062972299999 2025-03-14T06:18:09.3845391Z pass 2025-03-14T06:18:09.4500777Z TIMING: entire_frame_compile:59.36313 gc:0.00459 _recursive_pre_grad_passes:0.03347 pad_mm_benchmark:0.40946 _recursive_joint_graph_passes:1.77714 _recursive_post_grad_passes:0.624 async_compile.wait:2.06884 code_gen:17.47003 inductor_compile:32.06462 backend_compile:46.35876 cudagraphify.get_container:0.37763 entire_backward_compile:7.8375 CUDAGraphNode.record:1.71125 total_wall_time:67.20063 2025-03-14T06:18:09.4503364Z STATS: call_* op count: 1409 | FakeTensorMode.__torch_dispatch__:63039 | FakeTensor.__torch_dispatch__:14295 | ProxyTorchDispatchMode.__torch_dispatch__:28628 2025-03-14T06:18:09.4504191Z Dynamo produced 2 graphs covering 1409 ops with 5 graph breaks (4 unique) 2025-03-14T06:18:17.4400484Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:18:17.4401830Z warnings.warn( 2025-03-14T06:18:17.7114155Z 2025-03-14T06:18:20.2170431Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:18:20.2171043Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:18:20.2171608Z cuda train DebertaForMaskedLM 2025-03-14T06:18:54.7788781Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:18:54.7790025Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:18:54.7791006Z pred = mod(**cloned_inputs) 2025-03-14T06:18:54.7791890Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 1062, in forward 2025-03-14T06:18:54.7792569Z outputs = self.deberta( 2025-03-14T06:18:54.7793209Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 966, in forward 2025-03-14T06:18:54.7793985Z embedding_output = self.embeddings( 2025-03-14T06:18:54.7794736Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 780, in forward 2025-03-14T06:18:54.7795715Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:18:54.7795976Z 2025-03-14T06:18:54.9454964Z W0314 06:18:54.944000 17724 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:19:29.0632396Z Compilation time (from dynamo_timed): 62.283889677999994 2025-03-14T06:19:29.0656360Z pass 2025-03-14T06:19:29.1240572Z TIMING: entire_frame_compile:54.37268 gc:0.00292 _recursive_pre_grad_passes:0.03372 pad_mm_benchmark:0.92698 _recursive_joint_graph_passes:2.12184 _recursive_post_grad_passes:1.02754 async_compile.wait:3.3172 code_gen:16.92876 inductor_compile:30.86509 backend_compile:42.22191 cudagraphify.get_container:0.4163 entire_backward_compile:7.91121 CUDAGraphNode.record:1.5684 total_wall_time:62.28389 2025-03-14T06:19:29.1242484Z STATS: call_* op count: 1650 | FakeTensorMode.__torch_dispatch__:64644 | FakeTensor.__torch_dispatch__:13084 | ProxyTorchDispatchMode.__torch_dispatch__:29596 2025-03-14T06:19:29.1243344Z Dynamo produced 2 graphs covering 1650 ops with 5 graph breaks (4 unique) 2025-03-14T06:19:54.0612505Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:19:54.0613700Z warnings.warn( 2025-03-14T06:19:54.6676133Z 2025-03-14T06:19:56.7901810Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:19:56.7902192Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:19:56.7902547Z cuda train DebertaForQuestionAnswering 2025-03-14T06:20:29.9423421Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:20:29.9424422Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:20:29.9425233Z pred = mod(**cloned_inputs) 2025-03-14T06:20:29.9425978Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 1388, in forward 2025-03-14T06:20:29.9426737Z outputs = self.deberta( 2025-03-14T06:20:29.9427474Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 966, in forward 2025-03-14T06:20:29.9428234Z embedding_output = self.embeddings( 2025-03-14T06:20:29.9428989Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta/modeling_deberta.py", line 780, in forward 2025-03-14T06:20:29.9429770Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:20:29.9430024Z 2025-03-14T06:20:30.0971973Z W0314 06:20:30.096000 18325 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:21:02.1455091Z Compilation time (from dynamo_timed): 59.34827082 2025-03-14T06:21:02.1479102Z pass 2025-03-14T06:21:02.2160360Z TIMING: entire_frame_compile:51.74505 gc:0.00459 _recursive_pre_grad_passes:0.0342 pad_mm_benchmark:0.70739 _recursive_joint_graph_passes:1.8815 _recursive_post_grad_passes:1.04728 async_compile.wait:1.24943 code_gen:14.49581 inductor_compile:28.35428 backend_compile:39.64287 cudagraphify.get_container:0.42415 entire_backward_compile:7.60322 CUDAGraphNode.record:1.55367 total_wall_time:59.34827 2025-03-14T06:21:02.2162284Z STATS: call_* op count: 1642 | FakeTensorMode.__torch_dispatch__:64107 | ProxyTorchDispatchMode.__torch_dispatch__:29383 | FakeTensor.__torch_dispatch__:12915 2025-03-14T06:21:02.2163141Z Dynamo produced 2 graphs covering 1642 ops with 5 graph breaks (4 unique) 2025-03-14T06:21:09.9301083Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:21:09.9304088Z warnings.warn( 2025-03-14T06:21:10.2678165Z 2025-03-14T06:21:22.7979416Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:21:22.7979914Z loading model: 0it [00:12, ?it/s] 2025-03-14T06:21:22.7980328Z cuda train DebertaV2ForMaskedLM 2025-03-14T06:21:22.8216899Z Compilation time (from dynamo_timed): 0 2025-03-14T06:21:22.8217391Z pass_due_to_skip 2025-03-14T06:21:23.0442162Z TIMING: total_wall_time:0 2025-03-14T06:21:23.0442479Z STATS: call_* op count: 0 2025-03-14T06:21:23.0442881Z Dynamo produced 0 graphs covering 0 ops with 0 graph breaks (0 unique) 2025-03-14T06:21:27.3234376Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
2025-03-14T06:21:27.3235695Z warnings.warn( 2025-03-14T06:21:27.5597247Z 2025-03-14T06:21:38.0732112Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:21:38.0732557Z loading model: 0it [00:10, ?it/s] 2025-03-14T06:21:38.0732923Z cuda train DebertaV2ForQuestionAnswering 2025-03-14T06:21:48.7434322Z ERROR:common: 2025-03-14T06:21:48.7436671Z Traceback (most recent call last): 2025-03-14T06:21:48.7437464Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 2154, in check_accuracy 2025-03-14T06:21:48.7438026Z correct_result = self.run_n_iterations( 2025-03-14T06:21:48.7438581Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 1951, in run_n_iterations 2025-03-14T06:21:48.7439171Z model_iter_fn(mod, inputs, collect_outputs=False) 2025-03-14T06:21:48.7439822Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in forward_and_backward_pass 2025-03-14T06:21:48.7440426Z pred = mod(**cloned_inputs) 2025-03-14T06:21:48.7441046Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7441692Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7442291Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7442890Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7443585Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1486, in forward 2025-03-14T06:21:48.7444274Z outputs = self.deberta( 2025-03-14T06:21:48.7444875Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7445514Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7446109Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7446700Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7447392Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1070, in forward 2025-03-14T06:21:48.7448079Z encoder_outputs = self.encoder( 2025-03-14T06:21:48.7449024Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7449665Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7450256Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7450845Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7451531Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 514, in forward 2025-03-14T06:21:48.7452217Z output_states = layer_module( 2025-03-14T06:21:48.7452827Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7453606Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7454195Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7454783Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7455473Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 362, in forward 2025-03-14T06:21:48.7456162Z attention_output = 
self.attention( 2025-03-14T06:21:48.7456814Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7457469Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7458057Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7458644Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7459327Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 293, in forward 2025-03-14T06:21:48.7460011Z self_output = self.self( 2025-03-14T06:21:48.7460607Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl 2025-03-14T06:21:48.7461243Z return self._call_impl(*args, **kwargs) 2025-03-14T06:21:48.7461836Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl 2025-03-14T06:21:48.7462434Z return forward_call(*args, **kwargs) 2025-03-14T06:21:48.7463124Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 733, in forward 2025-03-14T06:21:48.7463929Z attention_probs = XSoftmax.apply(attention_scores, attention_mask, -1) 2025-03-14T06:21:48.7464610Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/function.py", line 575, in apply 2025-03-14T06:21:48.7465257Z return super().apply(*args, **kwargs) # type: ignore[misc] 2025-03-14T06:21:48.7466008Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 113, in forward 2025-03-14T06:21:48.7466742Z output = torch.softmax(output, self.dim) 2025-03-14T06:21:48.7469179Z torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 24.00 MiB. GPU 0 has a total capacity of 21.98 GiB of which 10.44 MiB is free. Process 74708 has 21.96 GiB memory in use. Of the allocated memory 21.29 GiB is allocated by PyTorch, and 318.25 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables) 2025-03-14T06:21:48.7672875Z Compilation time (from dynamo_timed): 0 2025-03-14T06:21:48.8006974Z eager_1st_run_OOM 2025-03-14T06:21:48.8007526Z TIMING: total_wall_time:0 2025-03-14T06:21:48.8008090Z STATS: call_* op count: 0 2025-03-14T06:21:48.8008872Z Dynamo produced 0 graphs covering 0 ops with 0 graph breaks (0 unique) 2025-03-14T06:21:53.1013037Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:21:53.1014269Z warnings.warn( 2025-03-14T06:21:53.3576534Z 2025-03-14T06:21:54.9058303Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:21:54.9058697Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:21:54.9059043Z cuda train DistilBertForMaskedLM 2025-03-14T06:22:13.0386909Z skipping cudagraphs due to deterministic index put. 
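Note on the DebertaV2ForQuestionAnswering failure above: it is recorded as eager_1st_run_OOM, i.e. the eager baseline itself exhausts the ~22 GiB card before any compilation happens, so this is a capacity problem rather than a compiler regression. The allocator option suggested in the error text has to be in the environment before the first CUDA allocation of the process; a sketch of how a worker would opt in:

import os

# Documented caching-allocator knob from the OOM message above; must be set
# before the first CUDA allocation in this process.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "expandable_segments:True")

import torch

x = torch.empty(1024, 1024, device="cuda")  # allocator now uses expandable segments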
Found from : 2025-03-14T06:22:13.0387779Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:22:13.0388837Z pred = mod(**cloned_inputs) 2025-03-14T06:22:13.0389532Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 905, in forward 2025-03-14T06:22:13.0390240Z dlbrt_output = self.distilbert( 2025-03-14T06:22:13.0390951Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 814, in forward 2025-03-14T06:22:13.0391785Z embeddings = self.embeddings(input_ids, inputs_embeds) # (bs, seq_length, dim) 2025-03-14T06:22:13.0392622Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 141, in forward 2025-03-14T06:22:13.0393437Z input_embeds = self.word_embeddings(input_ids) # (bs, max_seq_length, dim) 2025-03-14T06:22:13.0393769Z 2025-03-14T06:22:13.1882517Z W0314 06:22:13.187000 18777 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:22:34.6345529Z Compilation time (from dynamo_timed): 36.323178868999996 2025-03-14T06:22:34.6358155Z pass 2025-03-14T06:22:34.6758321Z TIMING: entire_frame_compile:32.00102 gc:0.00335 _recursive_pre_grad_passes:0.02046 pad_mm_benchmark:0.29878 _recursive_joint_graph_passes:1.0475 _recursive_post_grad_passes:0.52917 async_compile.wait:2.596 code_gen:9.68729 inductor_compile:17.41846 backend_compile:24.95042 cudagraphify.get_container:0.2789 entire_backward_compile:4.32216 CUDAGraphNode.record:0.95074 total_wall_time:36.32318 2025-03-14T06:22:34.6760215Z STATS: call_* op count: 752 | FakeTensorMode.__torch_dispatch__:33139 | FakeTensor.__torch_dispatch__:7436 | ProxyTorchDispatchMode.__torch_dispatch__:15096 2025-03-14T06:22:34.6761034Z Dynamo produced 2 graphs covering 752 ops with 5 graph breaks (4 unique) 2025-03-14T06:22:41.1584508Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:22:41.1585731Z warnings.warn( 2025-03-14T06:22:44.2248244Z 2025-03-14T06:22:45.6214215Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:22:45.6214592Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:22:45.6214952Z cuda train DistilBertForQuestionAnswering 2025-03-14T06:23:03.1309701Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:23:03.1311043Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:23:03.1312184Z pred = mod(**cloned_inputs) 2025-03-14T06:23:03.1313281Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1124, in forward 2025-03-14T06:23:03.1314258Z distilbert_output = self.distilbert( 2025-03-14T06:23:03.1315017Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 814, in forward 2025-03-14T06:23:03.1315856Z embeddings = self.embeddings(input_ids, inputs_embeds) # (bs, seq_length, dim) 2025-03-14T06:23:03.1317171Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 141, in forward 2025-03-14T06:23:03.1317988Z input_embeds = self.word_embeddings(input_ids) # (bs, max_seq_length, dim) 2025-03-14T06:23:03.1318325Z 2025-03-14T06:23:03.2646676Z W0314 06:23:03.263000 19339 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:23:23.3608230Z Compilation time (from dynamo_timed): 34.455829927 2025-03-14T06:23:23.3620899Z pass 2025-03-14T06:23:23.4016958Z TIMING: entire_frame_compile:30.57944 gc:0.00337 _recursive_pre_grad_passes:0.02028 pad_mm_benchmark:0.29392 _recursive_joint_graph_passes:1.02244 _recursive_post_grad_passes:0.53855 async_compile.wait:1.33691 code_gen:8.06259 inductor_compile:15.71455 backend_compile:23.63965 cudagraphify.get_container:0.27403 entire_backward_compile:3.87639 CUDAGraphNode.record:0.94293 total_wall_time:34.45583 2025-03-14T06:23:23.4019317Z STATS: call_* op count: 745 | FakeTensorMode.__torch_dispatch__:32684 | FakeTensor.__torch_dispatch__:7291 | ProxyTorchDispatchMode.__torch_dispatch__:14914 2025-03-14T06:23:23.4020143Z Dynamo produced 2 graphs covering 745 ops with 5 graph breaks (4 unique) 2025-03-14T06:23:29.9213357Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:23:29.9214549Z warnings.warn( 2025-03-14T06:23:30.1926095Z 2025-03-14T06:23:32.2509351Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:23:32.2509826Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:23:32.2510169Z cuda train DistillGPT2 2025-03-14T06:23:50.0613270Z skipping cudagraphs due to deterministic index put. 
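Note on the per-model summary lines such as "Dynamo produced 2 graphs covering 745 ops with 5 graph breaks (4 unique)": outside the harness, the same break accounting can be pulled for a single function with torch._dynamo.explain, which traces with Dynamo but does not run Inductor. A hedged sketch (the exact fields on the returned ExplainOutput vary slightly between releases):

import torch

def fn(x):
    y = torch.relu(x)
    print("side effect")      # builtin print is a classic graph-break trigger
    return y.sum()

explanation = torch._dynamo.explain(fn)(torch.randn(8))
print(explanation)            # reports graph count, op count, and break reasons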
Found from : 2025-03-14T06:23:50.0614185Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:23:50.0614896Z pred = mod(**cloned_inputs) 2025-03-14T06:23:50.0615542Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1074, in forward 2025-03-14T06:23:50.0616213Z transformer_outputs = self.transformer( 2025-03-14T06:23:50.0616873Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 837, in forward 2025-03-14T06:23:50.0617528Z inputs_embeds = self.wte(input_ids) 2025-03-14T06:23:50.0617752Z 2025-03-14T06:23:50.1839333Z W0314 06:23:50.182000 19750 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:24:06.5232085Z Compilation time (from dynamo_timed): 30.074135633999997 2025-03-14T06:24:06.5243415Z pass 2025-03-14T06:24:06.5604372Z TIMING: entire_frame_compile:25.70676 gc:0.0032 _recursive_pre_grad_passes:0.0191 pad_mm_benchmark:0.32062 _recursive_joint_graph_passes:0.94158 _recursive_post_grad_passes:0.37184 async_compile.wait:2.58092 code_gen:8.92943 inductor_compile:15.10291 backend_compile:20.20869 cudagraphify.get_container:0.24964 entire_backward_compile:4.36737 CUDAGraphNode.record:0.82425 total_wall_time:30.07414 2025-03-14T06:24:06.5606267Z STATS: call_* op count: 725 | FakeTensorMode.__torch_dispatch__:27325 | FakeTensor.__torch_dispatch__:6021 | ProxyTorchDispatchMode.__torch_dispatch__:12140 2025-03-14T06:24:12.9371268Z Dynamo produced 2 graphs covering 725 ops with 5 graph breaks (4 unique) 2025-03-14T06:24:12.9372781Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:24:12.9373959Z warnings.warn( 2025-03-14T06:24:16.0303186Z 2025-03-14T06:24:16.0310972Z loading model: 0it [00:00, ?it/s]If you want to use `ElectraForCausalLM` as a standalone, add `is_decoder=True.` 2025-03-14T06:24:16.8788979Z 2025-03-14T06:24:16.8789607Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:24:16.8789997Z cuda train ElectraForCausalLM 2025-03-14T06:24:48.3154860Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:24:48.3155721Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:24:48.3156426Z pred = mod(**cloned_inputs) 2025-03-14T06:24:48.3157100Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 1617, in forward 2025-03-14T06:24:48.3158079Z outputs = self.electra( 2025-03-14T06:24:48.3158715Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 902, in forward 2025-03-14T06:24:48.3159391Z hidden_states = self.embeddings( 2025-03-14T06:24:48.3160051Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 203, in forward 2025-03-14T06:24:48.3160752Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:24:48.3161009Z 2025-03-14T06:24:49.1152949Z Compilation time (from dynamo_timed): 29.196066773 2025-03-14T06:24:49.1177738Z pass 2025-03-14T06:24:49.1376093Z TIMING: entire_frame_compile:21.24861 gc:0.00125 _recursive_pre_grad_passes:0.00799 pad_mm_benchmark:0.37919 _recursive_joint_graph_passes:1.5743 _recursive_post_grad_passes:0.51189 async_compile.wait:0.19946 code_gen:7.02575 inductor_compile:13.70966 backend_compile:15.43999 cudagraphify.get_container:0.35498 entire_backward_compile:7.94746 CUDAGraphNode.record:0.45066 total_wall_time:29.19607 2025-03-14T06:24:49.1379864Z STATS: call_* op count: 377 | FakeTensorMode.__torch_dispatch__:39287 | FakeTensor.__torch_dispatch__:5402 | ProxyTorchDispatchMode.__torch_dispatch__:18002 2025-03-14T06:24:49.1380789Z Dynamo produced 1 graphs covering 377 ops with 4 graph breaks (4 unique) 2025-03-14T06:24:55.5717113Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:24:55.5718348Z warnings.warn( 2025-03-14T06:24:56.0070816Z 2025-03-14T06:24:56.8113337Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:24:56.8113714Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:24:56.8114074Z cuda train ElectraForQuestionAnswering 2025-03-14T06:25:27.6548596Z skipping cudagraphs due to deterministic index put. 
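Note on the huggingface_hub FutureWarning that repeats before each model: it only says that resume_download is deprecated and that force_download=True is the way to request a fresh download; it has no bearing on the benchmark results. If the noise matters, a plain stdlib warnings filter (nothing benchmark-specific) silences just that notice:

import warnings

# Suppress only this deprecation notice from huggingface_hub; leave other warnings visible.
warnings.filterwarnings(
    "ignore",
    message=".*resume_download.*",
    category=FutureWarning,
    module="huggingface_hub.*",
)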
Found from : 2025-03-14T06:25:27.6550745Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:25:27.6552148Z pred = mod(**cloned_inputs) 2025-03-14T06:25:27.6552875Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 1385, in forward 2025-03-14T06:25:27.6553668Z discriminator_hidden_states = self.electra( 2025-03-14T06:25:27.6554552Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 902, in forward 2025-03-14T06:25:27.6555233Z hidden_states = self.embeddings( 2025-03-14T06:25:27.6555903Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/electra/modeling_electra.py", line 203, in forward 2025-03-14T06:25:27.6556629Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:25:27.6556882Z 2025-03-14T06:25:27.8185951Z W0314 06:25:27.817000 20731 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:26:09.3727339Z Compilation time (from dynamo_timed): 67.834359045 2025-03-14T06:26:09.3748398Z pass 2025-03-14T06:26:09.3903753Z TIMING: entire_frame_compile:60.12798 gc:0.00504 _recursive_pre_grad_passes:0.03352 pad_mm_benchmark:0.38166 _recursive_joint_graph_passes:1.72829 _recursive_post_grad_passes:0.64326 async_compile.wait:3.00743 code_gen:18.28664 inductor_compile:32.84655 backend_compile:47.18408 cudagraphify.get_container:0.34765 entire_backward_compile:7.70638 CUDAGraphNode.record:1.69056 total_wall_time:67.83436 2025-03-14T06:26:09.3905976Z STATS: call_* op count: 1404 | FakeTensorMode.__torch_dispatch__:62984 | FakeTensor.__torch_dispatch__:14263 | ProxyTorchDispatchMode.__torch_dispatch__:28589 2025-03-14T06:26:09.3907064Z Dynamo produced 2 graphs covering 1404 ops with 5 graph breaks (4 unique) 2025-03-14T06:26:17.3165752Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:26:17.5757332Z warnings.warn( 2025-03-14T06:26:17.5757528Z 2025-03-14T06:26:19.9576398Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:26:19.9576756Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:26:19.9577110Z cuda train GPT2ForSequenceClassification 2025-03-14T06:26:50.8447011Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:26:50.8448276Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:26:50.8449346Z pred = mod(**cloned_inputs) 2025-03-14T06:26:50.8450320Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 1426, in forward 2025-03-14T06:26:50.8451322Z transformer_outputs = self.transformer( 2025-03-14T06:26:50.8452307Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/gpt2/modeling_gpt2.py", line 837, in forward 2025-03-14T06:26:50.8453275Z inputs_embeds = self.wte(input_ids) 2025-03-14T06:26:50.8453594Z 2025-03-14T06:26:50.9318912Z W0314 06:26:50.930000 21253 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:27:22.0291071Z Compilation time (from dynamo_timed): 54.686320515 2025-03-14T06:27:22.0312367Z pass 2025-03-14T06:27:22.1121490Z TIMING: entire_frame_compile:47.8234 gc:0.00335 _recursive_pre_grad_passes:0.03137 pad_mm_benchmark:0.34141 _recursive_joint_graph_passes:1.62389 _recursive_post_grad_passes:0.69919 async_compile.wait:3.11526 code_gen:15.38069 inductor_compile:26.75935 backend_compile:37.64786 cudagraphify.get_container:0.34582 entire_backward_compile:6.86292 CUDAGraphNode.record:1.36336 total_wall_time:54.68632 2025-03-14T06:27:22.1123450Z STATS: call_* op count: 1399 | FakeTensorMode.__torch_dispatch__:52287 | FakeTensor.__torch_dispatch__:11646 | ProxyTorchDispatchMode.__torch_dispatch__:23504 2025-03-14T06:27:22.1124287Z Dynamo produced 2 graphs covering 1399 ops with 5 graph breaks (4 unique) 2025-03-14T06:27:29.4136887Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:27:29.4138101Z warnings.warn( 2025-03-14T06:27:29.6763065Z 2025-03-14T06:27:31.0776049Z loading model: 0it [00:00, ?it/s]WARNING:common:Model GoogleFnet supports float32 only 2025-03-14T06:27:31.3616451Z 2025-03-14T06:27:31.3616847Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:27:31.3617291Z WARNING:common:Model GoogleFnet supports float32 only 2025-03-14T06:27:31.3627376Z cuda train GoogleFnet 2025-03-14T06:27:32.7277219Z WARNING:common:Model GoogleFnet supports float32 only 2025-03-14T06:27:49.7286354Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:27:49.7287232Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:27:49.7287947Z pred = mod(**cloned_inputs) 2025-03-14T06:27:49.7288588Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/fnet/modeling_fnet.py", line 749, in forward 2025-03-14T06:27:49.7289232Z outputs = self.fnet( 2025-03-14T06:27:49.7289834Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/fnet/modeling_fnet.py", line 581, in forward 2025-03-14T06:27:49.7290494Z embedding_output = self.embeddings( 2025-03-14T06:27:49.7291316Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/fnet/modeling_fnet.py", line 148, in forward 2025-03-14T06:27:49.7291996Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:27:49.7292246Z 2025-03-14T06:27:49.8744363Z W0314 06:27:49.873000 21827 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:28:11.9283578Z Compilation time (from dynamo_timed): 36.158149349 2025-03-14T06:28:11.9296150Z pass 2025-03-14T06:28:11.9730894Z TIMING: entire_frame_compile:31.72812 gc:0.00339 _recursive_pre_grad_passes:0.0211 pad_mm_benchmark:0.04772 _recursive_joint_graph_passes:0.54072 _recursive_post_grad_passes:0.29624 async_compile.wait:2.32857 code_gen:10.05921 inductor_compile:18.16716 backend_compile:24.50948 cudagraphify.get_container:0.27852 entire_backward_compile:4.43003 CUDAGraphNode.record:0.93457 total_wall_time:36.15815 2025-03-14T06:28:11.9732820Z STATS: call_* op count: 791 | FakeTensorMode.__torch_dispatch__:28408 | FakeTensor.__torch_dispatch__:7541 | ProxyTorchDispatchMode.__torch_dispatch__:13166 2025-03-14T06:28:11.9733643Z Dynamo produced 2 graphs covering 791 ops with 5 graph breaks (4 unique) 2025-03-14T06:28:18.5115705Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:28:18.5116908Z warnings.warn( 2025-03-14T06:28:18.8207822Z 2025-03-14T06:28:20.9770671Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:28:20.9771044Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:28:20.9771389Z cuda train LayoutLMForMaskedLM 2025-03-14T06:28:54.7896361Z skipping cudagraphs due to deterministic index put. 
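Note on the TIMING lines: they break total_wall_time down by compiler phase (entire_frame_compile, inductor_compile, code_gen, backend_compile, ...) using the dynamo_timed instrumentation. For a local run the same aggregation is available through torch._dynamo.utils.compile_times(); the helper is internal and its location has moved between releases, so treat the import path as an assumption:

import torch
from torch._dynamo.utils import compile_times  # internal helper; path may vary by release

model = torch.nn.Linear(64, 64)
compiled = torch.compile(model)
compiled(torch.randn(4, 64))      # first call triggers compilation

print(compile_times())            # per-phase wall-clock totals, like the TIMING lines above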
Found from : 2025-03-14T06:28:54.7897233Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:28:54.7897977Z pred = mod(**cloned_inputs) 2025-03-14T06:28:54.7898653Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 938, in forward 2025-03-14T06:28:54.7899354Z outputs = self.layoutlm( 2025-03-14T06:28:54.7900013Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 821, in forward 2025-03-14T06:28:54.7900708Z embedding_output = self.embeddings( 2025-03-14T06:28:54.7901393Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 99, in forward 2025-03-14T06:28:54.7902105Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:28:54.7902363Z 2025-03-14T06:28:54.9662101Z W0314 06:28:54.965000 22266 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:29:37.0153635Z Compilation time (from dynamo_timed): 69.414224612 2025-03-14T06:29:37.0178358Z pass 2025-03-14T06:29:37.0833142Z TIMING: entire_frame_compile:61.06973 gc:0.0033 _recursive_pre_grad_passes:0.03449 pad_mm_benchmark:0.20889 _recursive_joint_graph_passes:1.57039 _recursive_post_grad_passes:0.89972 async_compile.wait:3.43275 code_gen:19.03837 inductor_compile:33.7918 backend_compile:47.7244 cudagraphify.get_container:0.39019 entire_backward_compile:8.34449 CUDAGraphNode.record:1.70146 total_wall_time:69.41422 2025-03-14T06:29:37.0835145Z STATS: call_* op count: 1447 | FakeTensorMode.__torch_dispatch__:64297 | FakeTensor.__torch_dispatch__:14542 | ProxyTorchDispatchMode.__torch_dispatch__:29184 2025-03-14T06:29:37.0835979Z Dynamo produced 2 graphs covering 1447 ops with 5 graph breaks (4 unique) 2025-03-14T06:29:44.9233282Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:29:44.9236251Z warnings.warn( 2025-03-14T06:29:45.1722580Z 2025-03-14T06:29:47.0762246Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:29:47.0762668Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:29:47.0763033Z cuda train LayoutLMForSequenceClassification 2025-03-14T06:30:19.7973505Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:30:19.7974696Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:30:19.7975673Z pred = mod(**cloned_inputs) 2025-03-14T06:30:19.7976606Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 1060, in forward 2025-03-14T06:30:19.7977348Z outputs = self.layoutlm( 2025-03-14T06:30:19.7978093Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 821, in forward 2025-03-14T06:30:19.7978780Z embedding_output = self.embeddings( 2025-03-14T06:30:19.7979468Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/layoutlm/modeling_layoutlm.py", line 99, in forward 2025-03-14T06:30:19.7980179Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:30:19.7980429Z 2025-03-14T06:30:19.9792430Z W0314 06:30:19.978000 22722 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:31:01.0689227Z Compilation time (from dynamo_timed): 67.696035279 2025-03-14T06:31:01.0709890Z pass 2025-03-14T06:31:01.1390457Z TIMING: entire_frame_compile:59.82176 gc:0.00425 _recursive_pre_grad_passes:0.03418 pad_mm_benchmark:0.37885 _recursive_joint_graph_passes:1.72935 _recursive_post_grad_passes:0.63486 async_compile.wait:1.85739 code_gen:17.34282 inductor_compile:32.05917 backend_compile:46.57986 cudagraphify.get_container:0.37679 entire_backward_compile:7.87427 CUDAGraphNode.record:1.69929 total_wall_time:67.69604 2025-03-14T06:31:01.1392409Z STATS: call_* op count: 1440 | FakeTensorMode.__torch_dispatch__:63856 | FakeTensor.__torch_dispatch__:14429 | ProxyTorchDispatchMode.__torch_dispatch__:28985 2025-03-14T06:31:01.1393236Z Dynamo produced 2 graphs covering 1440 ops with 5 graph breaks (4 unique) 2025-03-14T06:31:09.0433690Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:31:09.0434978Z warnings.warn( 2025-03-14T06:31:09.5687212Z 2025-03-14T06:31:20.0200033Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:31:20.0200405Z loading model: 0it [00:10, ?it/s] 2025-03-14T06:31:20.0200788Z cuda train M2M100ForConditionalGeneration 2025-03-14T06:31:20.1216704Z WARNING:common:fp64 golden ref were not generated for M2M100ForConditionalGeneration. Setting accuracy check to cosine 2025-03-14T06:32:30.3719695Z skipping cudagraphs due to deterministic index put. 
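Note on the M2M100ForConditionalGeneration warning "fp64 golden ref were not generated ... Setting accuracy check to cosine": when an fp64 reference cannot be produced, the harness compares compiled and eager outputs by cosine similarity instead of elementwise tolerance, so the two runs only need to agree in direction. An illustrative check in that spirit (not the harness's exact implementation or thresholds):

import torch
import torch.nn.functional as F

def cosine_close(actual: torch.Tensor, expected: torch.Tensor, threshold: float = 0.99) -> bool:
    # Flatten both outputs and compare direction, ignoring small elementwise noise.
    cos = F.cosine_similarity(actual.flatten().float(), expected.flatten().float(), dim=0)
    return bool(cos > threshold)

ref = torch.randn(4, 128)
print(cosine_close(ref + 1e-4 * torch.randn_like(ref), ref))  # True: outputs agree in direction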
Found from : 2025-03-14T06:32:30.3721110Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:32:30.3721833Z pred = mod(**cloned_inputs) 2025-03-14T06:32:30.3722483Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1275, in forward 2025-03-14T06:32:30.3723142Z outputs = self.model( 2025-03-14T06:32:30.3723760Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 1162, in forward 2025-03-14T06:32:30.3724416Z encoder_outputs = self.encoder( 2025-03-14T06:32:30.3725058Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/m2m_100/modeling_m2m_100.py", line 763, in forward 2025-03-14T06:32:30.3725989Z inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:32:30.3726331Z 2025-03-14T06:32:32.0753801Z Compilation time (from dynamo_timed): 67.153471388 2025-03-14T06:32:32.0787490Z pass 2025-03-14T06:32:32.1756940Z TIMING: entire_frame_compile:50.6678 gc:0.00237 _recursive_pre_grad_passes:0.01898 pad_mm_benchmark:0.47245 _recursive_joint_graph_passes:3.13451 _recursive_post_grad_passes:2.60887 async_compile.wait:0.87396 code_gen:15.82188 inductor_compile:32.36013 backend_compile:38.49839 cudagraphify.get_container:0.74374 entire_backward_compile:16.48567 CUDAGraphNode.record:0.90213 total_wall_time:67.15347 2025-03-14T06:32:32.1758848Z STATS: call_* op count: 1296 | FakeTensorMode.__torch_dispatch__:98277 | FakeTensor.__torch_dispatch__:11833 | ProxyTorchDispatchMode.__torch_dispatch__:47271 2025-03-14T06:32:32.1759676Z Dynamo produced 1 graphs covering 1296 ops with 4 graph breaks (4 unique) 2025-03-14T06:32:40.2022031Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:32:40.2023213Z warnings.warn( 2025-03-14T06:32:40.4312558Z 2025-03-14T06:32:44.4048790Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:32:44.4049156Z loading model: 0it [00:03, ?it/s] 2025-03-14T06:32:44.4049498Z cuda train MBartForCausalLM 2025-03-14T06:32:44.4357280Z WARNING:common:fp64 golden ref were not generated for MBartForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:32:46.4145533Z 2025-03-14T06:32:46.4146387Z class GraphModule(torch.nn.Module): 2025-03-14T06:32:46.4150200Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:32:46.4152836Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:32:46.4153764Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:32:46.4155107Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:32:46.4156278Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:32:46.4157465Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:32:46.4158190Z 2025-03-14T06:32:46.4158796Z # No stacktrace found for following nodes 2025-03-14T06:32:46.4159385Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:32:46.4159917Z 2025-03-14T06:32:46.4160683Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1224 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:32:46.4161770Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:32:46.4162281Z 2025-03-14T06:32:46.4163067Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1235 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:32:46.4164899Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:32:46.4166166Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:32:46.4166614Z 2025-03-14T06:32:46.4167421Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:32:46.4168862Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4169398Z 2025-03-14T06:32:46.4170176Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:32:46.4171150Z 
mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4171585Z 2025-03-14T06:32:46.4172374Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:32:46.4173261Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:32:46.4173654Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:32:46.4174130Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:32:46.4174706Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:32:46.4175158Z 2025-03-14T06:32:46.4175809Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:32:46.4176639Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:32:46.4177037Z 2025-03-14T06:32:46.4177866Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:32:46.4179056Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:32:46.4179903Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:32:46.4180464Z 2025-03-14T06:32:46.4181281Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:32:46.4182233Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4182737Z 2025-03-14T06:32:46.4183365Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:32:46.4184191Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:32:46.4184612Z 2025-03-14T06:32:46.4185344Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:32:46.4187084Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:32:46.4188321Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:32:46.4189433Z 2025-03-14T06:32:46.4190238Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1259 in forward, code: hidden_states = inputs_embeds + positions.to(inputs_embeds.device) 2025-03-14T06:32:46.4191369Z to_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:32:46.4192070Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = 
inputs_embeds + to_1; inputs_embeds = to_1 = None 2025-03-14T06:32:46.4192560Z 2025-03-14T06:32:46.4193331Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1260 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:32:46.4195687Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:32:46.4197402Z 2025-03-14T06:32:46.4198300Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1262 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:32:46.4199680Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:32:46.4200364Z 2025-03-14T06:32:46.4201073Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1290 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:32:46.4201941Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:32:46.4202287Z 2025-03-14T06:32:46.4203000Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1291 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:32:46.4203914Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:32:46.4204337Z 2025-03-14T06:32:46.4204466Z 2025-03-14T06:32:46.4204590Z class GraphModule(torch.nn.Module): 2025-03-14T06:32:46.4206437Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:32:46.4208231Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:32:46.4209002Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:32:46.4210198Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:32:46.4211373Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:32:46.4212560Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:32:46.4213285Z 2025-03-14T06:32:46.4213547Z # No stacktrace found for following nodes 
2025-03-14T06:32:46.4214146Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:32:46.4214684Z 2025-03-14T06:32:46.4215427Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1224 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:32:46.4216465Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:32:46.4225901Z 2025-03-14T06:32:46.4226912Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1235 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:32:46.4228622Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:32:46.4229907Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:32:46.4230369Z 2025-03-14T06:32:46.4231249Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:32:46.4232378Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4232902Z 2025-03-14T06:32:46.4233655Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:32:46.4234731Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4235177Z 2025-03-14T06:32:46.4235976Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:32:46.4236877Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:32:46.4237277Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:32:46.4237898Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:32:46.4238483Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:32:46.4238946Z 2025-03-14T06:32:46.4239600Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:32:46.4240431Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:32:46.4240834Z 2025-03-14T06:32:46.4241666Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:32:46.4242952Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:32:46.4243813Z causal_4d_mask: "f32[1, 1, 1024, 
1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:32:46.4244369Z 2025-03-14T06:32:46.4245035Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:32:46.4245992Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:32:46.4246505Z 2025-03-14T06:32:46.4247141Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:32:46.4247988Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:32:46.4248408Z 2025-03-14T06:32:46.4249154Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:32:46.4250051Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:32:46.4251326Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:32:46.4252439Z 2025-03-14T06:32:46.4253250Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1259 in forward, code: hidden_states = inputs_embeds + positions.to(inputs_embeds.device) 2025-03-14T06:32:46.4254341Z to_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:32:46.4255048Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to_1; inputs_embeds = to_1 = None 2025-03-14T06:32:46.4255543Z 2025-03-14T06:32:46.4256314Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1260 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:32:46.4258589Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:32:46.4260311Z 2025-03-14T06:32:46.4261359Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1262 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:32:46.4262735Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:32:46.4263425Z 2025-03-14T06:32:46.4264124Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1290 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:32:46.4264958Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:32:46.4265388Z 
2025-03-14T06:32:46.4266106Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1291 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:32:46.4267033Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:32:46.4267456Z 2025-03-14T06:32:47.3722013Z 2025-03-14T06:32:47.3722733Z class GraphModule(torch.nn.Module): 2025-03-14T06:32:47.3725653Z def forward(self, L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:32:47.3728738Z l_input_ids_ = L_input_ids_ 2025-03-14T06:32:47.3729979Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:32:47.3731877Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:32:47.3733975Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:32:47.3736122Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:32:47.3737410Z 2025-03-14T06:32:47.3738710Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1224 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:32:47.3740412Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:32:47.3741201Z 2025-03-14T06:32:47.3742611Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1235 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:32:47.3745632Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:32:47.3747915Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:32:47.3748713Z 2025-03-14T06:32:47.3750139Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:32:47.3752603Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:32:47.3753545Z 2025-03-14T06:32:47.3754995Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
2025-03-14T06:32:47.3756706Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:32:47.3757466Z 2025-03-14T06:32:47.3758897Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:32:47.3760754Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:32:47.3761440Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:32:47.3762261Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:32:47.3763281Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:32:47.3764092Z 2025-03-14T06:32:47.3765219Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:32:47.3766683Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:32:47.3767353Z 2025-03-14T06:32:47.3769105Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:32:47.3771210Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:32:47.3772719Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:32:47.3773686Z 2025-03-14T06:32:47.3774840Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:32:47.3776495Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:32:47.3777376Z 2025-03-14T06:32:47.3778483Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:32:47.3779955Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:32:47.3780676Z 2025-03-14T06:32:47.3782083Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:32:47.3783657Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:32:47.3785825Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:32:47.3787827Z 2025-03-14T06:32:47.3789251Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1259 in forward, code: hidden_states = inputs_embeds + positions.to(inputs_embeds.device) 2025-03-14T06:32:47.3791185Z to_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:32:47.3792626Z hidden_states: "f32[1, 1024, 
1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to_1; inputs_embeds = to_1 = None 2025-03-14T06:32:47.3793531Z 2025-03-14T06:32:47.3794965Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1260 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:32:47.3799069Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:32:47.3802483Z 2025-03-14T06:32:47.3804136Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1262 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:32:47.3806637Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:32:47.3807865Z 2025-03-14T06:32:47.3809107Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1290 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:32:47.3810577Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:32:47.3811237Z 2025-03-14T06:32:47.3812500Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1291 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:32:47.3814137Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:32:47.3814865Z 2025-03-14T06:32:48.0091848Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:32:48.0092644Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py", line 135, in forward 2025-03-14T06:32:48.0093335Z return super().forward(positions + self.offset) 2025-03-14T06:32:48.0093597Z 2025-03-14T06:32:57.5256439Z Compilation time (from dynamo_timed): 4.742593223 2025-03-14T06:32:57.5274430Z pass 2025-03-14T06:32:57.5927612Z TIMING: entire_frame_compile:3.76297 gc:0.00576 _recursive_pre_grad_passes:0.0059 _recursive_joint_graph_passes:0.2334 inductor_compile:1.93961 backend_compile:2.8591 _recursive_post_grad_passes:0.08563 async_compile.precompile:0.1161 async_compile.wait:0.44835 code_gen:1.19408 cudagraphify.get_container:0.18263 pad_mm_benchmark:0.01435 entire_backward_compile:0.97962 CUDAGraphNode.record:5.49897 total_wall_time:4.74259 2025-03-14T06:32:57.5929635Z STATS: call_* op count: 60 | FakeTensorMode.__torch_dispatch__:3871 | FakeTensor.__torch_dispatch__:555 | ProxyTorchDispatchMode.__torch_dispatch__:1594 2025-03-14T06:32:57.5930454Z Dynamo produced 6 graphs covering 60 ops with 6 graph breaks (5 unique) 2025-03-14T06:33:02.9301737Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
2025-03-14T06:33:02.9302888Z warnings.warn( 2025-03-14T06:33:03.2513601Z 2025-03-14T06:33:10.2336080Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:33:10.2336596Z loading model: 0it [00:06, ?it/s] 2025-03-14T06:33:10.2337093Z cuda train MBartForConditionalGeneration 2025-03-14T06:33:10.3281593Z WARNING:common:fp64 golden ref were not generated for MBartForConditionalGeneration. Setting accuracy check to cosine 2025-03-14T06:33:13.9952346Z 2025-03-14T06:33:13.9953158Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:13.9955659Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:13.9958124Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:33:13.9958567Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:33:13.9959430Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:13.9960576Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:13.9961763Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:13.9962965Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:13.9963704Z 2025-03-14T06:33:13.9963974Z # No stacktrace found for following nodes 2025-03-14T06:33:13.9964558Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:33:13.9965096Z 2025-03-14T06:33:13.9965874Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:88 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:33:13.9966915Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:33:13.9967432Z 2025-03-14T06:33:13.9968587Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:93 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:33:13.9969584Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:33:13.9970162Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:33:13.9970658Z 2025-03-14T06:33:13.9971531Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:95 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:33:13.9972526Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:33:13.9972933Z sum_1: 
"i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:33:13.9973320Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:33:13.9973738Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:33:13.9974116Z 2025-03-14T06:33:13.9974979Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:96 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:33:13.9976078Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:33:13.9976888Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:33:13.9977706Z 2025-03-14T06:33:13.9978682Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:97 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:33:13.9979757Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:33:13.9980351Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:33:13.9981029Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:33:13.9981736Z 2025-03-14T06:33:13.9982504Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:98 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:33:13.9983742Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:33:13.9984427Z 2025-03-14T06:33:13.9985164Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:13.9986199Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:33:13.9986715Z 2025-03-14T06:33:13.9987503Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:13.9989199Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:13.9990471Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:13.9990922Z 2025-03-14T06:33:13.9991586Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:13.9992536Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:13.9993036Z 2025-03-14T06:33:13.9993680Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:13.9994572Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, 
-1); arange = None 2025-03-14T06:33:13.9994991Z 2025-03-14T06:33:13.9995725Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:13.9996620Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:13.9997814Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:13.9998909Z 2025-03-14T06:33:13.9999711Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:14.0000856Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:14.0001583Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 2025-03-14T06:33:14.0002314Z 2025-03-14T06:33:14.0003270Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:14.0005556Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:14.0007429Z 2025-03-14T06:33:14.0008344Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:14.0009724Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:14.0010413Z 2025-03-14T06:33:14.0011118Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:14.0011959Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:14.0012311Z 2025-03-14T06:33:14.0013089Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:14.0014048Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:14.0014456Z 2025-03-14T06:33:14.0014640Z 2025-03-14T06:33:14.0014772Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:14.0016653Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", 
L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:14.0018575Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:33:14.0019046Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:33:14.0019816Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:14.0020930Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:14.0022102Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:14.0023286Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:14.0024016Z 2025-03-14T06:33:14.0024278Z # No stacktrace found for following nodes 2025-03-14T06:33:14.0024947Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:33:14.0025484Z 2025-03-14T06:33:14.0026221Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:88 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:33:14.0027253Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:33:14.0027769Z 2025-03-14T06:33:14.0028620Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:93 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:33:14.0029721Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:33:14.0030288Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:33:14.0030791Z 2025-03-14T06:33:14.0031654Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:95 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:33:14.0032645Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:33:14.0033050Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:33:14.0033433Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:33:14.0033851Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:33:14.0034234Z 2025-03-14T06:33:14.0035171Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:96 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:33:14.0036267Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:33:14.0036844Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:33:14.0037252Z 2025-03-14T06:33:14.0038054Z # 
File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:97 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:33:14.0039122Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:33:14.0039712Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:33:14.0040401Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:33:14.0040978Z 2025-03-14T06:33:14.0041746Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:98 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:33:14.0042977Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:33:14.0043661Z 2025-03-14T06:33:14.0044400Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:14.0045436Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:33:14.0045956Z 2025-03-14T06:33:14.0046742Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:14.0048514Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:14.0049835Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:14.0050289Z 2025-03-14T06:33:14.0050956Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:14.0051985Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:14.0052483Z 2025-03-14T06:33:14.0053120Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:14.0053943Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:33:14.0054354Z 2025-03-14T06:33:14.0055082Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:14.0055973Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:14.0057170Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:14.0058262Z 2025-03-14T06:33:14.0059074Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:14.0060132Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:14.0060802Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 2025-03-14T06:33:14.0061281Z 2025-03-14T06:33:14.0062046Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:14.0064322Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:14.0066042Z 2025-03-14T06:33:14.0066944Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:14.0068611Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:14.0069306Z 2025-03-14T06:33:14.0070006Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:14.0070839Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:14.0071355Z 2025-03-14T06:33:14.0072130Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:14.0073087Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:14.0073497Z 2025-03-14T06:33:14.0073622Z 2025-03-14T06:33:14.0073754Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:14.0075731Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:14.0077764Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:33:14.0078188Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:33:14.0078983Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:14.0080122Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = 
L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:14.0081292Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:14.0082483Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:14.0083208Z 2025-03-14T06:33:14.0083471Z # No stacktrace found for following nodes 2025-03-14T06:33:14.0084051Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:33:14.0084586Z 2025-03-14T06:33:14.0085328Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:88 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:33:14.0086364Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:33:14.0086884Z 2025-03-14T06:33:14.0087735Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:93 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:33:14.0088726Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:33:14.0089290Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:33:14.0089786Z 2025-03-14T06:33:14.0090643Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:95 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:33:14.0091629Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:33:14.0092037Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:33:14.0092429Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:33:14.0092850Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:33:14.0093229Z 2025-03-14T06:33:14.0094181Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:96 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:33:14.0095276Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:33:14.0095853Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:33:14.0096259Z 2025-03-14T06:33:14.0097061Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:97 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:33:14.0098127Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:33:14.0098805Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:33:14.0099539Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:33:14.0100120Z 2025-03-14T06:33:14.0100886Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:98 in 
shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:33:14.0102115Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:33:14.0102796Z 2025-03-14T06:33:14.0103537Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:14.0104576Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:33:14.0105093Z 2025-03-14T06:33:14.0105887Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:14.0107565Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:14.0108848Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:14.0109351Z 2025-03-14T06:33:14.0110027Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:14.0110982Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:14.0111483Z 2025-03-14T06:33:14.0112115Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:14.0122057Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:33:14.0122498Z 2025-03-14T06:33:14.0123258Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:14.0124164Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:14.0125495Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:14.0126603Z 2025-03-14T06:33:14.0127407Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:14.0128485Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:14.0129163Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 2025-03-14T06:33:14.0129645Z 2025-03-14T06:33:14.0130410Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:14.0132800Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = 
torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:14.0134521Z 2025-03-14T06:33:14.0135433Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:14.0136815Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:14.0137512Z 2025-03-14T06:33:14.0138224Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:14.0139108Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:14.0139458Z 2025-03-14T06:33:14.0140237Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:14.0141209Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:14.0141622Z 2025-03-14T06:33:14.9602572Z 2025-03-14T06:33:14.9603067Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:14.9605272Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:14.9607181Z l_labels_ = L_labels_ 2025-03-14T06:33:14.9607497Z l_input_ids_ = L_input_ids_ 2025-03-14T06:33:14.9608222Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:14.9609390Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:14.9610613Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:14.9612247Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:14.9613007Z 2025-03-14T06:33:14.9613783Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:88 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:33:14.9614759Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.clone(); l_labels_ = None 2025-03-14T06:33:14.9615200Z 2025-03-14T06:33:14.9616079Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:93 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:33:14.9617243Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:33:14.9617822Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:33:14.9618328Z 2025-03-14T06:33:14.9619202Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:95 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:33:14.9620207Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:33:14.9620620Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:33:14.9621007Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:33:14.9621435Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:33:14.9621827Z 2025-03-14T06:33:14.9622698Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:96 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:33:14.9623812Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:33:14.9624398Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:33:14.9624813Z 2025-03-14T06:33:14.9625763Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:97 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:33:14.9626856Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:33:14.9627461Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:33:14.9628165Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:33:14.9628751Z 2025-03-14T06:33:14.9629529Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:98 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:33:14.9630774Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:33:14.9631468Z 2025-03-14T06:33:14.9632252Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:14.9633211Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:33:14.9633654Z 2025-03-14T06:33:14.9634562Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:14.9636377Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = 
l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:14.9637681Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:14.9638146Z 2025-03-14T06:33:14.9638826Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:14.9639927Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:14.9640432Z 2025-03-14T06:33:14.9641080Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:14.9641918Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:33:14.9642337Z 2025-03-14T06:33:14.9643082Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:14.9643988Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:14.9645201Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:14.9646318Z 2025-03-14T06:33:14.9647140Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:14.9648217Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:14.9648901Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 2025-03-14T06:33:14.9649390Z 2025-03-14T06:33:14.9650169Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:14.9652491Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:14.9654256Z 2025-03-14T06:33:14.9655170Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:14.9656560Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:14.9657257Z 2025-03-14T06:33:14.9657972Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 
2025-03-14T06:33:14.9658811Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:14.9659158Z 2025-03-14T06:33:14.9660026Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:14.9660996Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:14.9661409Z 2025-03-14T06:33:14.9661533Z 2025-03-14T06:33:14.9661664Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:14.9663499Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:14.9665391Z l_labels_ = L_labels_ 2025-03-14T06:33:14.9665694Z l_input_ids_ = L_input_ids_ 2025-03-14T06:33:14.9666420Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:14.9667579Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:14.9669299Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:14.9670522Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:14.9671263Z 2025-03-14T06:33:14.9672018Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:88 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:33:14.9672976Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.clone(); l_labels_ = None 2025-03-14T06:33:14.9673418Z 2025-03-14T06:33:14.9674320Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:93 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:33:14.9675311Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:33:14.9675883Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:33:14.9676390Z 2025-03-14T06:33:14.9677263Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:95 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:33:14.9678259Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:33:14.9678672Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:33:14.9679058Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:33:14.9679486Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:33:14.9679869Z 2025-03-14T06:33:14.9680724Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:96 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:33:14.9681836Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:33:14.9682618Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:33:14.9683032Z 2025-03-14T06:33:14.9683843Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:97 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:33:14.9684913Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:33:14.9685509Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:33:14.9686193Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:33:14.9686895Z 2025-03-14T06:33:14.9687664Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:98 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:33:14.9688904Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:33:14.9689642Z 2025-03-14T06:33:14.9690385Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:14.9691332Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:33:14.9691765Z 2025-03-14T06:33:14.9692556Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:14.9694255Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:14.9695542Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:14.9696001Z 2025-03-14T06:33:14.9696664Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:14.9697615Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:14.9698118Z 2025-03-14T06:33:14.9698765Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:14.9699651Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:33:14.9700069Z 2025-03-14T06:33:14.9700819Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:14.9701717Z add: "i64[1, 1024][1024, 1]cuda:0" 
= positions + 2; positions = None 2025-03-14T06:33:14.9702923Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:14.9704026Z 2025-03-14T06:33:14.9704847Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:14.9706011Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:14.9706698Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 2025-03-14T06:33:14.9707189Z 2025-03-14T06:33:14.9707962Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:14.9710263Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:14.9712091Z 2025-03-14T06:33:14.9712995Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:14.9714465Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:14.9715156Z 2025-03-14T06:33:14.9715858Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:14.9716697Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:14.9717042Z 2025-03-14T06:33:14.9717825Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:14.9718788Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:14.9719198Z 2025-03-14T06:33:15.7033302Z 2025-03-14T06:33:15.7034054Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:15.7036678Z def forward(self, L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:15.7039285Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:33:15.7039802Z 
l_input_ids_ = L_input_ids_ 2025-03-14T06:33:15.7040689Z l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:15.7042068Z l_self_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:15.7043550Z l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:15.7045024Z l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:15.7045986Z 2025-03-14T06:33:15.7047129Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1010 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:15.7048539Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:33:15.7049616Z 2025-03-14T06:33:15.7050754Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1017 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:15.7052859Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:15.7054226Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:15.7054885Z 2025-03-14T06:33:15.7055557Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:15.7056516Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:15.7057012Z 2025-03-14T06:33:15.7057650Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:15.7058475Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:33:15.7058888Z 2025-03-14T06:33:15.7059621Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:15.7060518Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:15.7061638Z embed_pos: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:15.7062638Z 2025-03-14T06:33:15.7063440Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1021 in forward, code: hidden_states = inputs_embeds + embed_pos.to(inputs_embeds.device) 2025-03-14T06:33:15.7064501Z to: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:33:15.7065169Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = inputs_embeds + to; inputs_embeds = to = None 
2025-03-14T06:33:15.7065650Z 2025-03-14T06:33:15.7066416Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1022 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:15.7068893Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:15.7070446Z 2025-03-14T06:33:15.7071348Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1023 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:15.7072725Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:15.7073415Z 2025-03-14T06:33:15.7074251Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1050 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:15.7075175Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:15.7075520Z 2025-03-14T06:33:15.7076290Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1051 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:33:15.7077251Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:33:15.7077659Z 2025-03-14T06:33:15.9362398Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:33:15.9363221Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py", line 135, in forward 2025-03-14T06:33:15.9364326Z return super().forward(positions + self.offset) 2025-03-14T06:33:15.9364579Z 2025-03-14T06:33:17.8690714Z 2025-03-14T06:33:17.8691389Z class GraphModule(torch.nn.Module): 2025-03-14T06:33:17.8693809Z def forward(self, dict_getitem_L_stack0_list_dict_keys_L_stack0_0_: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0", L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:33:17.8695958Z dict_getitem_l_stack0_list_dict_keys_l_stack0_0_ = dict_getitem_L_stack0_list_dict_keys_L_stack0_0_ 2025-03-14T06:33:17.8696509Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:33:17.8697189Z l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:33:17.8698148Z l_self_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:33:17.8699143Z l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:33:17.8700151Z l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:33:17.8700793Z 2025-03-14T06:33:17.8701557Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1224 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:33:17.8702571Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_decoder_input_ids_.view(-1, 1024); l_decoder_input_ids_ = None 2025-03-14T06:33:17.8703048Z 2025-03-14T06:33:17.8703844Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1235 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:33:17.8705434Z embedding: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:33:17.8706617Z inputs_embeds: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:33:17.8707068Z 2025-03-14T06:33:17.8707880Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:33:17.8709005Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:33:17.8709527Z 2025-03-14T06:33:17.8710597Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
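The "skipping cudagraphs due to deterministic index put" records in this log are inductor declining to capture a CUDA graph when the compiled graph contains an index_put that has to run deterministically; the accompanying stack traces point at embedding lookups (here the positional embedding's positions + self.offset), whose backward accumulates gradients into the embedding table via index_put/scatter. A rough, hedged way to provoke the same message outside this benchmark harness is sketched below; it assumes deterministic algorithms are enabled the way the accuracy runs appear to enable them, and the exact trigger and wording may differ across PyTorch versions:

    import torch

    # With deterministic algorithms on, the embedding backward lowers to a
    # deterministic index_put, which the cudagraphs integration refuses to capture.
    torch.use_deterministic_algorithms(True)

    emb = torch.nn.Embedding(1026, 1024).cuda()

    @torch.compile(mode="reduce-overhead")  # "reduce-overhead" requests cudagraphs
    def step(ids):
        return emb(ids).sum()

    ids = torch.randint(0, 1026, (1, 1024), device="cuda")
    step(ids).backward()  # may emit a "skipping cudagraphs ..." line like the ones in this log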
2025-03-14T06:33:17.8711570Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:33:17.8712013Z 2025-03-14T06:33:17.8712804Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:33:17.8713692Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:33:17.8714357Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:33:17.8714841Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:33:17.8715419Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:33:17.8715877Z 2025-03-14T06:33:17.8716524Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:33:17.8717349Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:33:17.8717750Z 2025-03-14T06:33:17.8718574Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:33:17.8719769Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:33:17.8720676Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:33:17.8721225Z 2025-03-14T06:33:17.8721881Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:119 in forward, code: positions = torch.arange( 2025-03-14T06:33:17.8722828Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:33:17.8723332Z 2025-03-14T06:33:17.8723960Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:121 in forward, code: ).expand(bsz, -1) 2025-03-14T06:33:17.8724789Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:33:17.8725208Z 2025-03-14T06:33:17.8725940Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:123 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:33:17.8726840Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:33:17.8727965Z positions_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_self_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_self_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:33:17.8728988Z 2025-03-14T06:33:17.8729788Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1259 in forward, code: hidden_states = inputs_embeds + positions.to(inputs_embeds.device) 2025-03-14T06:33:17.8730872Z to_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:33:17.8731567Z hidden_states: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = 
inputs_embeds + to_1; inputs_embeds = to_1 = None 2025-03-14T06:33:17.8732158Z 2025-03-14T06:33:17.8732922Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1260 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:33:17.8735020Z hidden_states_1: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:33:17.8736637Z 2025-03-14T06:33:17.8737534Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1262 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:33:17.8738908Z hidden_states_2: "f32[1, 1024, 1024][1048576, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:33:17.8739591Z 2025-03-14T06:33:17.8740340Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1290 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:33:17.8741167Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:33:17.8741512Z 2025-03-14T06:33:17.8742222Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mbart/modeling_mbart.py:1291 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:33:17.8743135Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:33:17.8743551Z 2025-03-14T06:33:39.0815926Z Compilation time (from dynamo_timed): 8.918287111 2025-03-14T06:33:39.0846482Z pass 2025-03-14T06:33:39.1888994Z TIMING: entire_frame_compile:6.78203 gc:0.01034 _recursive_pre_grad_passes:0.00751 _recursive_joint_graph_passes:0.41388 inductor_compile:4.23013 backend_compile:5.14998 _recursive_post_grad_passes:0.20407 async_compile.precompile:0.12711 async_compile.wait:0.54218 code_gen:2.43084 cudagraphify.get_container:0.29018 pad_mm_benchmark:0.04218 entire_backward_compile:2.13626 CUDAGraphNode.record:13.68507 total_wall_time:8.91829 2025-03-14T06:33:39.1890983Z STATS: call_* op count: 136 | FakeTensorMode.__torch_dispatch__:9410 | FakeTensor.__torch_dispatch__:1421 | ProxyTorchDispatchMode.__torch_dispatch__:4383 2025-03-14T06:33:44.6964219Z Dynamo produced 8 graphs covering 136 ops with 8 graph breaks (5 unique) 2025-03-14T06:33:44.6967340Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:33:44.6969039Z warnings.warn( 2025-03-14T06:33:44.9452207Z 2025-03-14T06:33:49.1576367Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:33:49.1576754Z loading model: 0it [00:04, ?it/s] 2025-03-14T06:33:49.1577130Z cuda train MT5ForConditionalGeneration 2025-03-14T06:34:39.0382340Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:34:39.0383607Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:34:39.0384780Z pred = mod(**cloned_inputs) 2025-03-14T06:34:39.0385777Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mt5/modeling_mt5.py", line 1722, in forward 2025-03-14T06:34:39.0386862Z encoder_outputs = self.encoder( 2025-03-14T06:34:39.0387853Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mt5/modeling_mt5.py", line 989, in forward 2025-03-14T06:34:39.0388993Z inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:34:39.0389263Z 2025-03-14T06:34:39.2038600Z W0314 06:34:39.202000 24306 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:35:21.0172414Z Compilation time (from dynamo_timed): 84.410059052 2025-03-14T06:35:21.0210024Z pass 2025-03-14T06:35:21.1658027Z TIMING: entire_frame_compile:71.30032 gc:0.00541 _recursive_pre_grad_passes:0.04223 pad_mm_benchmark:0.33773 _recursive_joint_graph_passes:2.4568 _recursive_post_grad_passes:1.25082 async_compile.wait:3.21008 code_gen:21.15834 inductor_compile:39.40371 backend_compile:55.38039 cudagraphify.get_container:0.5974 entire_backward_compile:13.10973 CUDAGraphNode.record:2.13342 total_wall_time:84.41006 2025-03-14T06:35:21.1660489Z STATS: call_* op count: 2145 | FakeTensorMode.__torch_dispatch__:98979 | ProxyTorchDispatchMode.__torch_dispatch__:46407 | FakeTensor.__torch_dispatch__:17593 2025-03-14T06:35:21.1661345Z Dynamo produced 2 graphs covering 2145 ops with 5 graph breaks (4 unique) 2025-03-14T06:35:29.7166206Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:35:29.7167412Z warnings.warn( 2025-03-14T06:35:31.0745360Z 2025-03-14T06:35:31.0784784Z loading model: 0it [00:00, ?it/s]If you want to use `MegatronBertForCausalLM` as a standalone, add `is_decoder=True.` 2025-03-14T06:35:35.8221165Z 2025-03-14T06:35:35.8221606Z loading model: 0it [00:04, ?it/s] 2025-03-14T06:35:35.8222161Z cuda train MegatronBertForCausalLM 2025-03-14T06:36:39.0319813Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:36:39.0320717Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:36:39.0321428Z pred = mod(**cloned_inputs) 2025-03-14T06:36:39.0322143Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1199, in forward 2025-03-14T06:36:39.0322864Z outputs = self.bert( 2025-03-14T06:36:39.0323537Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 967, in forward 2025-03-14T06:36:39.0324259Z embedding_output = self.embeddings( 2025-03-14T06:36:39.0324976Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 179, in forward 2025-03-14T06:36:39.0325732Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:36:39.0325990Z 2025-03-14T06:36:39.2618285Z W0314 06:36:39.260000 24886 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:38:04.4884473Z Compilation time (from dynamo_timed): 134.336057117 2025-03-14T06:38:04.4928575Z pass 2025-03-14T06:38:04.6932994Z TIMING: entire_frame_compile:119.68305 gc:0.00477 _recursive_pre_grad_passes:0.06412 pad_mm_benchmark:0.33738 _recursive_joint_graph_passes:2.87835 _recursive_post_grad_passes:1.22126 async_compile.wait:5.1285 code_gen:38.33075 inductor_compile:65.9404 backend_compile:93.65766 cudagraphify.get_container:0.67485 entire_backward_compile:14.65301 CUDAGraphNode.record:3.46392 total_wall_time:134.33606 2025-03-14T06:38:04.6935372Z STATS: call_* op count: 2712 | FakeTensorMode.__torch_dispatch__:122400 | FakeTensor.__torch_dispatch__:27746 | ProxyTorchDispatchMode.__torch_dispatch__:55810 2025-03-14T06:38:04.6936232Z Dynamo produced 2 graphs covering 2712 ops with 5 graph breaks (4 unique) 2025-03-14T06:38:15.4011261Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:38:15.4012524Z warnings.warn( 2025-03-14T06:38:16.0359867Z 2025-03-14T06:38:20.4840679Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:38:20.4841046Z loading model: 0it [00:04, ?it/s] 2025-03-14T06:38:20.4841413Z cuda train MegatronBertForQuestionAnswering 2025-03-14T06:39:22.9734986Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:39:22.9737621Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:39:22.9738844Z pred = mod(**cloned_inputs) 2025-03-14T06:39:22.9739572Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 1792, in forward 2025-03-14T06:39:22.9740284Z outputs = self.bert( 2025-03-14T06:39:22.9740967Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 967, in forward 2025-03-14T06:39:22.9741690Z embedding_output = self.embeddings( 2025-03-14T06:39:22.9742406Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/megatron_bert/modeling_megatron_bert.py", line 179, in forward 2025-03-14T06:39:22.9743150Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:39:22.9743398Z 2025-03-14T06:39:23.2028264Z W0314 06:39:23.201000 25510 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:40:43.8835625Z Compilation time (from dynamo_timed): 129.513308439 2025-03-14T06:40:43.8883033Z pass 2025-03-14T06:40:44.0929007Z TIMING: entire_frame_compile:115.04987 gc:0.0043 _recursive_pre_grad_passes:0.06408 pad_mm_benchmark:0.5478 _recursive_joint_graph_passes:3.07026 _recursive_post_grad_passes:1.21489 async_compile.wait:0.93522 code_gen:33.5645 inductor_compile:61.02794 backend_compile:88.95607 cudagraphify.get_container:0.66571 entire_backward_compile:14.46344 CUDAGraphNode.record:3.45987 total_wall_time:129.51331 2025-03-14T06:40:44.0930915Z STATS: call_* op count: 2700 | FakeTensorMode.__torch_dispatch__:121783 | FakeTensor.__torch_dispatch__:27581 | ProxyTorchDispatchMode.__torch_dispatch__:55562 2025-03-14T06:40:44.0931746Z Dynamo produced 2 graphs covering 2700 ops with 5 graph breaks (4 unique) 2025-03-14T06:40:54.7361159Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:40:54.7362357Z warnings.warn( 2025-03-14T06:40:56.4419891Z 2025-03-14T06:40:57.9489388Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:40:57.9489756Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:40:57.9490125Z cuda train MobileBertForMaskedLM 2025-03-14T06:42:50.3717947Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:42:50.3720395Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:42:50.3721179Z pred = mod(**cloned_inputs) 2025-03-14T06:42:50.3721889Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1089, in forward 2025-03-14T06:42:50.3722599Z outputs = self.mobilebert( 2025-03-14T06:42:50.3723279Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 895, in forward 2025-03-14T06:42:50.3724015Z embedding_output = self.embeddings( 2025-03-14T06:42:50.3724718Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 218, in forward 2025-03-14T06:42:50.3725708Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:42:50.3725966Z 2025-03-14T06:42:52.7952389Z Compilation time (from dynamo_timed): 108.13130130300001 2025-03-14T06:42:52.8119056Z pass 2025-03-14T06:42:52.8602265Z TIMING: entire_frame_compile:81.84741 gc:0.00285 _recursive_pre_grad_passes:0.02689 pad_mm_benchmark:0.92819 _recursive_joint_graph_passes:5.7293 _recursive_post_grad_passes:2.92064 async_compile.wait:1.12314 code_gen:22.79513 inductor_compile:46.3061 backend_compile:57.24689 cudagraphify.get_container:0.89701 entire_backward_compile:26.28389 CUDAGraphNode.record:1.34992 total_wall_time:108.1313 2025-03-14T06:42:52.8604161Z STATS: call_* op count: 1449 | FakeTensorMode.__torch_dispatch__:155600 | FakeTensor.__torch_dispatch__:17974 | ProxyTorchDispatchMode.__torch_dispatch__:74439 2025-03-14T06:42:52.8605287Z Dynamo produced 1 graphs covering 1449 ops with 3 graph breaks (3 unique) 2025-03-14T06:43:02.6071846Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:43:02.6073057Z warnings.warn( 2025-03-14T06:43:02.8602308Z 2025-03-14T06:43:04.1900338Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:43:04.1901001Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:43:04.1901667Z cuda train MobileBertForQuestionAnswering 2025-03-14T06:44:55.0709721Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:44:55.0710574Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:44:55.0711327Z pred = mod(**cloned_inputs) 2025-03-14T06:44:55.0712029Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 1390, in forward 2025-03-14T06:44:55.0712754Z outputs = self.mobilebert( 2025-03-14T06:44:55.0713432Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 895, in forward 2025-03-14T06:44:55.0714144Z embedding_output = self.embeddings( 2025-03-14T06:44:55.0714907Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/mobilebert/modeling_mobilebert.py", line 218, in forward 2025-03-14T06:44:55.0715638Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:44:55.0715895Z 2025-03-14T06:44:57.4896680Z Compilation time (from dynamo_timed): 106.583438658 2025-03-14T06:44:57.5065677Z pass 2025-03-14T06:44:57.5405390Z TIMING: entire_frame_compile:80.8009 gc:0.00225 _recursive_pre_grad_passes:0.02458 pad_mm_benchmark:0.93206 _recursive_joint_graph_passes:5.72098 _recursive_post_grad_passes:2.95079 async_compile.wait:0.12122 code_gen:21.60633 inductor_compile:45.07853 backend_compile:56.37928 cudagraphify.get_container:0.89543 entire_backward_compile:25.78254 CUDAGraphNode.record:1.35231 total_wall_time:106.58344 2025-03-14T06:44:57.5407428Z STATS: call_* op count: 1453 | FakeTensorMode.__torch_dispatch__:155415 | ProxyTorchDispatchMode.__torch_dispatch__:74387 | FakeTensor.__torch_dispatch__:17960 2025-03-14T06:44:57.5408275Z Dynamo produced 1 graphs covering 1453 ops with 3 graph breaks (3 unique) 2025-03-14T06:45:07.2911317Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:45:07.2912532Z warnings.warn( 2025-03-14T06:45:07.5568745Z 2025-03-14T06:45:10.1484095Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:45:10.1484457Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:45:10.1484800Z cuda train OPTForCausalLM 2025-03-14T06:45:10.1693171Z WARNING:common:fp64 golden ref were not generated for OPTForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:45:12.7270569Z 2025-03-14T06:45:12.7271111Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:12.7272292Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 2048][2048, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50272, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[2050, 768][768, 1]cuda:0"): 2025-03-14T06:45:12.7273499Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:12.7274356Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:12.7275974Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:12.7276702Z 2025-03-14T06:45:12.7276984Z # No stacktrace found for following nodes 2025-03-14T06:45:12.7277576Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:12.7278117Z 2025-03-14T06:45:12.7278876Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:824 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:12.7279906Z input_ids: "i64[1, 2048][2048, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 2048); l_cloned_inputs_input_ids_ = None 2025-03-14T06:45:12.7280437Z 2025-03-14T06:45:12.7281147Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:831 in forward, code: inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:45:12.7282816Z inputs_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:12.7283926Z 2025-03-14T06:45:12.7284774Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:850 in forward, code: attention_mask = torch.ones(batch_size, mask_seq_length, device=inputs_embeds.device) 2025-03-14T06:45:12.7285879Z attention_mask: "f32[1, 2048][2048, 1]cuda:0" = torch.ones(1, 2048, device = device(type='cuda', index=0)) 2025-03-14T06:45:12.7286364Z 2025-03-14T06:45:12.7287182Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:12.7288317Z mask: "f32[2048, 2048][2048, 1]cuda:0" = torch.full((2048, 2048), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:12.7288848Z 2025-03-14T06:45:12.7289601Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:45:12.7290581Z mask_cond: "i64[2048][1]cuda:0" = torch.arange(2048, device = device(type='cuda', index=0)) 2025-03-14T06:45:12.7291032Z 2025-03-14T06:45:12.7291834Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 
2025-03-14T06:45:12.7292794Z add: "i64[2048][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:12.7293195Z view_1: "i64[2048, 1][1, 1]cuda:0" = add.view(2048, 1); add = None 2025-03-14T06:45:12.7293686Z lt: "b8[2048, 2048][2048, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:12.7294454Z masked_fill_: "f32[2048, 2048][2048, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:12.7294929Z 2025-03-14T06:45:12.7295591Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:12.7296426Z mask_1: "f32[2048, 2048][2048, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:12.7296837Z 2025-03-14T06:45:12.7297671Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:12.7298956Z getitem: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:12.7299781Z causal_4d_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = getitem.expand(1, 1, 2048, 2048); getitem = None 2025-03-14T06:45:12.7300302Z 2025-03-14T06:45:12.7301121Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:182 in _expand_mask, code: expanded_mask = mask[:, None, None, :].expand(bsz, 1, tgt_len, src_len).to(dtype) 2025-03-14T06:45:12.7302269Z getitem_1: "f32[1, 1, 1, 2048][2048, 2048, 2048, 1]cuda:0" = attention_mask[(slice(None, None, None), None, None, slice(None, None, None))] 2025-03-14T06:45:12.7302999Z expand_1: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = getitem_1.expand(1, 1, 2048, 2048); getitem_1 = None 2025-03-14T06:45:12.7303659Z expanded_mask: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = expand_1.to(torch.float32); expand_1 = None 2025-03-14T06:45:12.7304131Z 2025-03-14T06:45:12.7304820Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:184 in _expand_mask, code: inverted_mask = 1.0 - expanded_mask 2025-03-14T06:45:12.7305785Z inverted_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = 1.0 - expanded_mask; expanded_mask = None 2025-03-14T06:45:12.7306284Z 2025-03-14T06:45:12.7307138Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:186 in _expand_mask, code: return inverted_mask.masked_fill(inverted_mask.to(torch.bool), torch.finfo(dtype).min) 2025-03-14T06:45:12.7308194Z to_2: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.to(torch.bool) 2025-03-14T06:45:12.7308944Z masked_fill: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.masked_fill(to_2, -3.4028234663852886e+38); inverted_mask = to_2 = None 2025-03-14T06:45:12.7309558Z 2025-03-14T06:45:12.7310405Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:132 in to_4d, code: expanded_attn_mask = self._expand_mask(attention_mask_2d, dtype, tgt_len=input_shape[-1]).to( 2025-03-14T06:45:12.7311600Z expanded_attn_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = masked_fill.to(device(type='cuda', index=0)); masked_fill = None 2025-03-14T06:45:12.7312182Z 2025-03-14T06:45:12.7313093Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:137 in to_4d, code: expanded_attn_mask = causal_4d_mask.masked_fill(expanded_attn_mask.bool(), torch.finfo(dtype).min) 2025-03-14T06:45:12.7314227Z bool_1: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = expanded_attn_mask.bool(); expanded_attn_mask = None 2025-03-14T06:45:12.7315253Z expanded_attn_mask_1: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = causal_4d_mask.masked_fill(bool_1, -3.4028234663852886e+38); causal_4d_mask = bool_1 = expanded_attn_mask_1 = None 2025-03-14T06:45:12.7315984Z 2025-03-14T06:45:12.7316770Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:101 in forward, code: attention_mask = attention_mask.long() 2025-03-14T06:45:12.7317700Z attention_mask_1: "i64[1, 2048][2048, 1]cuda:0" = attention_mask.long(); attention_mask = None 2025-03-14T06:45:12.7318165Z 2025-03-14T06:45:12.7319060Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:104 in forward, code: positions = (torch.cumsum(attention_mask, dim=1).type_as(attention_mask) * attention_mask).long() - 1 2025-03-14T06:45:12.7320137Z cumsum: "i64[1, 2048][2048, 1]cuda:0" = torch.cumsum(attention_mask_1, dim = 1) 2025-03-14T06:45:12.7320771Z type_as: "i64[1, 2048][2048, 1]cuda:0" = cumsum.type_as(attention_mask_1); cumsum = None 2025-03-14T06:45:12.7321362Z mul: "i64[1, 2048][2048, 1]cuda:0" = type_as * attention_mask_1; type_as = attention_mask_1 = None 2025-03-14T06:45:12.7321888Z long_1: "i64[1, 2048][2048, 1]cuda:0" = mul.long(); mul = None 2025-03-14T06:45:12.7322345Z positions: "i64[1, 2048][2048, 1]cuda:0" = long_1 - 1; long_1 = None 2025-03-14T06:45:12.7322768Z 2025-03-14T06:45:12.7323512Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:107 in forward, code: positions = positions[:, past_key_values_length:] 2025-03-14T06:45:12.7324544Z positions_1: "i64[1, 2048][2048, 1]cuda:0" = positions[(slice(None, None, None), slice(0, None, None))]; positions = None 2025-03-14T06:45:12.7325072Z 2025-03-14T06:45:12.7325796Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:109 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:12.7326688Z add_1: "i64[1, 2048][2048, 1]cuda:0" = positions_1 + 2; positions_1 = None 2025-03-14T06:45:12.7327911Z pos_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:12.7329011Z 2025-03-14T06:45:12.7329716Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:865 in forward, code: hidden_states = inputs_embeds + pos_embeds 2025-03-14T06:45:12.7330954Z hidden_states: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = inputs_embeds + pos_embeds; inputs_embeds = pos_embeds = hidden_states = None 2025-03-14T06:45:12.7331530Z 2025-03-14T06:45:12.7332209Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:894 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:12.7333014Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:12.7333367Z 2025-03-14T06:45:12.7334063Z # 
File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:895 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:12.7334955Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:12.7335381Z 2025-03-14T06:45:12.7335543Z 2025-03-14T06:45:12.7335668Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:12.7336771Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 2048][2048, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50272, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[2050, 768][768, 1]cuda:0"): 2025-03-14T06:45:12.7337933Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:12.7338802Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:12.7339927Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:12.7340645Z 2025-03-14T06:45:12.7340906Z # No stacktrace found for following nodes 2025-03-14T06:45:12.7341488Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:12.7342030Z 2025-03-14T06:45:12.7342804Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:824 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:12.7343926Z input_ids: "i64[1, 2048][2048, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 2048); l_cloned_inputs_input_ids_ = None 2025-03-14T06:45:12.7344448Z 2025-03-14T06:45:12.7345161Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:831 in forward, code: inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:45:12.7346765Z inputs_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:12.7347870Z 2025-03-14T06:45:12.7348718Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:850 in forward, code: attention_mask = torch.ones(batch_size, mask_seq_length, device=inputs_embeds.device) 2025-03-14T06:45:12.7349825Z attention_mask: "f32[1, 2048][2048, 1]cuda:0" = torch.ones(1, 2048, device = device(type='cuda', index=0)) 2025-03-14T06:45:12.7350302Z 2025-03-14T06:45:12.7351124Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:12.7352247Z mask: "f32[2048, 2048][2048, 1]cuda:0" = torch.full((2048, 2048), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:12.7352817Z 2025-03-14T06:45:12.7353586Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:45:12.7354623Z mask_cond: "i64[2048][1]cuda:0" = torch.arange(2048, device = device(type='cuda', index=0)) 
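The _make_causal_mask sequence traced here (and repeated in each OPT graph) builds the additive causal attention mask: a [tgt_len, tgt_len] matrix filled with the most negative float32 value, the diagonal and everything below it zeroed via masked_fill_, and the result broadcast to [bsz, 1, tgt_len, tgt_len]. Condensed into a standalone helper that mirrors the traced ops (a sketch, not the transformers source):

    import torch

    def make_causal_mask(tgt_len: int, dtype=torch.float32, device="cpu") -> torch.Tensor:
        # Start from "everything masked", then unmask the positions a query may attend to.
        mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device)
        cond = torch.arange(tgt_len, device=device)
        # cond < (cond + 1).view(-1, 1) is True on and below the diagonal.
        mask.masked_fill_(cond < (cond + 1).view(tgt_len, 1), 0)
        return mask.to(dtype)[None, None, :, :]  # [1, 1, tgt_len, tgt_len], added to attention scores

For example, make_causal_mask(4) is 0 on and below the diagonal and roughly -3.4028e+38 above it, matching the constants baked into the dump.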
2025-03-14T06:45:12.7355075Z 2025-03-14T06:45:12.7355881Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:12.7356781Z add: "i64[2048][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:12.7357183Z view_1: "i64[2048, 1][1, 1]cuda:0" = add.view(2048, 1); add = None 2025-03-14T06:45:12.7357672Z lt: "b8[2048, 2048][2048, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:12.7358260Z masked_fill_: "f32[2048, 2048][2048, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:12.7358730Z 2025-03-14T06:45:12.7359388Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:12.7360231Z mask_1: "f32[2048, 2048][2048, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:12.7360637Z 2025-03-14T06:45:12.7361562Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:12.7362764Z getitem: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:12.7363570Z causal_4d_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = getitem.expand(1, 1, 2048, 2048); getitem = None 2025-03-14T06:45:12.7364079Z 2025-03-14T06:45:12.7364891Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:182 in _expand_mask, code: expanded_mask = mask[:, None, None, :].expand(bsz, 1, tgt_len, src_len).to(dtype) 2025-03-14T06:45:12.7366110Z getitem_1: "f32[1, 1, 1, 2048][2048, 2048, 2048, 1]cuda:0" = attention_mask[(slice(None, None, None), None, None, slice(None, None, None))] 2025-03-14T06:45:12.7366842Z expand_1: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = getitem_1.expand(1, 1, 2048, 2048); getitem_1 = None 2025-03-14T06:45:12.7367491Z expanded_mask: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = expand_1.to(torch.float32); expand_1 = None 2025-03-14T06:45:12.7368224Z 2025-03-14T06:45:12.7368911Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:184 in _expand_mask, code: inverted_mask = 1.0 - expanded_mask 2025-03-14T06:45:12.7369872Z inverted_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = 1.0 - expanded_mask; expanded_mask = None 2025-03-14T06:45:12.7370371Z 2025-03-14T06:45:12.7371226Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:186 in _expand_mask, code: return inverted_mask.masked_fill(inverted_mask.to(torch.bool), torch.finfo(dtype).min) 2025-03-14T06:45:12.7372294Z to_2: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.to(torch.bool) 2025-03-14T06:45:12.7373051Z masked_fill: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.masked_fill(to_2, -3.4028234663852886e+38); inverted_mask = to_2 = None 2025-03-14T06:45:12.7373663Z 2025-03-14T06:45:12.7374506Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:132 in to_4d, code: expanded_attn_mask = self._expand_mask(attention_mask_2d, dtype, tgt_len=input_shape[-1]).to( 
2025-03-14T06:45:12.7375705Z expanded_attn_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = masked_fill.to(device(type='cuda', index=0)); masked_fill = None 2025-03-14T06:45:12.7376289Z 2025-03-14T06:45:12.7377155Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:137 in to_4d, code: expanded_attn_mask = causal_4d_mask.masked_fill(expanded_attn_mask.bool(), torch.finfo(dtype).min) 2025-03-14T06:45:12.7378301Z bool_1: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = expanded_attn_mask.bool(); expanded_attn_mask = None 2025-03-14T06:45:12.7379252Z expanded_attn_mask_1: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = causal_4d_mask.masked_fill(bool_1, -3.4028234663852886e+38); causal_4d_mask = bool_1 = expanded_attn_mask_1 = None 2025-03-14T06:45:12.7379978Z 2025-03-14T06:45:12.7380673Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:101 in forward, code: attention_mask = attention_mask.long() 2025-03-14T06:45:12.7381600Z attention_mask_1: "i64[1, 2048][2048, 1]cuda:0" = attention_mask.long(); attention_mask = None 2025-03-14T06:45:12.7382064Z 2025-03-14T06:45:12.7382957Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:104 in forward, code: positions = (torch.cumsum(attention_mask, dim=1).type_as(attention_mask) * attention_mask).long() - 1 2025-03-14T06:45:12.7384171Z cumsum: "i64[1, 2048][2048, 1]cuda:0" = torch.cumsum(attention_mask_1, dim = 1) 2025-03-14T06:45:12.7384732Z type_as: "i64[1, 2048][2048, 1]cuda:0" = cumsum.type_as(attention_mask_1); cumsum = None 2025-03-14T06:45:12.7385327Z mul: "i64[1, 2048][2048, 1]cuda:0" = type_as * attention_mask_1; type_as = attention_mask_1 = None 2025-03-14T06:45:12.7385851Z long_1: "i64[1, 2048][2048, 1]cuda:0" = mul.long(); mul = None 2025-03-14T06:45:12.7386304Z positions: "i64[1, 2048][2048, 1]cuda:0" = long_1 - 1; long_1 = None 2025-03-14T06:45:12.7386686Z 2025-03-14T06:45:12.7387419Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:107 in forward, code: positions = positions[:, past_key_values_length:] 2025-03-14T06:45:12.7388570Z positions_1: "i64[1, 2048][2048, 1]cuda:0" = positions[(slice(None, None, None), slice(0, None, None))]; positions = None 2025-03-14T06:45:12.7389095Z 2025-03-14T06:45:12.7389825Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:109 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:12.7390710Z add_1: "i64[1, 2048][2048, 1]cuda:0" = positions_1 + 2; positions_1 = None 2025-03-14T06:45:12.7391935Z pos_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:12.7393087Z 2025-03-14T06:45:12.7393788Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:865 in forward, code: hidden_states = inputs_embeds + pos_embeds 2025-03-14T06:45:12.7394903Z hidden_states: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = inputs_embeds + pos_embeds; inputs_embeds = pos_embeds = hidden_states = None 2025-03-14T06:45:12.7395472Z 2025-03-14T06:45:12.7396151Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:894 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:12.7396961Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:12.7397309Z 2025-03-14T06:45:12.7397999Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:895 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:12.7398895Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:12.7399323Z 2025-03-14T06:45:13.6904402Z 2025-03-14T06:45:13.6905330Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:13.6906528Z def forward(self, L_input_ids_: "i64[1, 2048][2048, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50272, 768][768, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[2050, 768][768, 1]cuda:0"): 2025-03-14T06:45:13.6907606Z l_input_ids_ = L_input_ids_ 2025-03-14T06:45:13.6908332Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:13.6909477Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:13.6910211Z 2025-03-14T06:45:13.6910960Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:824 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:13.6911904Z input_ids: "i64[1, 2048][2048, 1]cuda:0" = l_input_ids_.view(-1, 2048); l_input_ids_ = None 2025-03-14T06:45:13.6912337Z 2025-03-14T06:45:13.6913389Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:831 in forward, code: inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:45:13.6915151Z inputs_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:13.6916276Z 2025-03-14T06:45:13.6917129Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:850 in forward, code: attention_mask = torch.ones(batch_size, mask_seq_length, device=inputs_embeds.device) 2025-03-14T06:45:13.6918379Z attention_mask: "f32[1, 2048][2048, 1]cuda:0" = torch.ones(1, 2048, device = device(type='cuda', index=0)) 2025-03-14T06:45:13.6918860Z 2025-03-14T06:45:13.6919692Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:13.6921072Z mask: "f32[2048, 2048][2048, 1]cuda:0" = torch.full((2048, 2048), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:13.6921779Z 2025-03-14T06:45:13.6922609Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:45:13.6923629Z mask_cond: "i64[2048][1]cuda:0" = torch.arange(2048, device = device(type='cuda', index=0)) 
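Each of the OPT decoder graphs (the two above and the one continuing below) also derives position ids from the attention mask rather than a bare arange: positions = (cumsum(attention_mask, dim=1) * attention_mask).long() - 1, sliced past the cached length and then shifted by the offset of 2 before the embedding lookup. With an all-ones mask this reduces to 0..seq_len-1, while padded slots stay pinned at -1 + offset. A small sketch of the arithmetic, mirroring the traced ops (function name is illustrative):

    import torch

    def opt_position_ids(attention_mask: torch.Tensor, past_length: int = 0, offset: int = 2):
        # attention_mask: [bsz, seq_len] of 0/1; padded tokens end up at index (-1 + offset).
        positions = (torch.cumsum(attention_mask, dim=1).type_as(attention_mask) * attention_mask).long() - 1
        positions = positions[:, past_length:]
        return positions + offset

    mask = torch.tensor([[1, 1, 1, 0, 0]])
    print(opt_position_ids(mask))  # tensor([[2, 3, 4, 1, 1]]) -- padded slots all map to row 1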
2025-03-14T06:45:13.6924086Z 2025-03-14T06:45:13.6924891Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:13.6925791Z add: "i64[2048][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:13.6926195Z view_1: "i64[2048, 1][1, 1]cuda:0" = add.view(2048, 1); add = None 2025-03-14T06:45:13.6926687Z lt: "b8[2048, 2048][2048, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:13.6927271Z masked_fill_: "f32[2048, 2048][2048, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:13.6927738Z 2025-03-14T06:45:13.6928396Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:13.6929238Z mask_1: "f32[2048, 2048][2048, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:13.6929645Z 2025-03-14T06:45:13.6930498Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:13.6931937Z getitem: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:13.6932753Z causal_4d_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = getitem.expand(1, 1, 2048, 2048); getitem = None 2025-03-14T06:45:13.6933269Z 2025-03-14T06:45:13.6934081Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:182 in _expand_mask, code: expanded_mask = mask[:, None, None, :].expand(bsz, 1, tgt_len, src_len).to(dtype) 2025-03-14T06:45:13.6935224Z getitem_1: "f32[1, 1, 1, 2048][2048, 2048, 2048, 1]cuda:0" = attention_mask[(slice(None, None, None), None, None, slice(None, None, None))] 2025-03-14T06:45:13.6936059Z expand_1: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = getitem_1.expand(1, 1, 2048, 2048); getitem_1 = None 2025-03-14T06:45:13.6936723Z expanded_mask: "f32[1, 1, 2048, 2048][2048, 2048, 0, 1]cuda:0" = expand_1.to(torch.float32); expand_1 = None 2025-03-14T06:45:13.6937195Z 2025-03-14T06:45:13.6937878Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:184 in _expand_mask, code: inverted_mask = 1.0 - expanded_mask 2025-03-14T06:45:13.6938841Z inverted_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = 1.0 - expanded_mask; expanded_mask = None 2025-03-14T06:45:13.6939348Z 2025-03-14T06:45:13.6940209Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:186 in _expand_mask, code: return inverted_mask.masked_fill(inverted_mask.to(torch.bool), torch.finfo(dtype).min) 2025-03-14T06:45:13.6941388Z to_2: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.to(torch.bool) 2025-03-14T06:45:13.6942161Z masked_fill: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = inverted_mask.masked_fill(to_2, -3.4028234663852886e+38); inverted_mask = to_2 = None 2025-03-14T06:45:13.6943008Z 2025-03-14T06:45:13.6943856Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:132 in to_4d, code: expanded_attn_mask = self._expand_mask(attention_mask_2d, dtype, tgt_len=input_shape[-1]).to( 
2025-03-14T06:45:13.6945050Z expanded_attn_mask: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = masked_fill.to(device(type='cuda', index=0)); masked_fill = None 2025-03-14T06:45:13.6945631Z 2025-03-14T06:45:13.6946496Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:137 in to_4d, code: expanded_attn_mask = causal_4d_mask.masked_fill(expanded_attn_mask.bool(), torch.finfo(dtype).min) 2025-03-14T06:45:13.6947633Z bool_1: "b8[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = expanded_attn_mask.bool(); expanded_attn_mask = None 2025-03-14T06:45:13.6948573Z expanded_attn_mask_1: "f32[1, 1, 2048, 2048][4194304, 4194304, 2048, 1]cuda:0" = causal_4d_mask.masked_fill(bool_1, -3.4028234663852886e+38); causal_4d_mask = bool_1 = expanded_attn_mask_1 = None 2025-03-14T06:45:13.6949293Z 2025-03-14T06:45:13.6949984Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:101 in forward, code: attention_mask = attention_mask.long() 2025-03-14T06:45:13.6950909Z attention_mask_1: "i64[1, 2048][2048, 1]cuda:0" = attention_mask.long(); attention_mask = None 2025-03-14T06:45:13.6951372Z 2025-03-14T06:45:13.6952263Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:104 in forward, code: positions = (torch.cumsum(attention_mask, dim=1).type_as(attention_mask) * attention_mask).long() - 1 2025-03-14T06:45:13.6953391Z cumsum: "i64[1, 2048][2048, 1]cuda:0" = torch.cumsum(attention_mask_1, dim = 1) 2025-03-14T06:45:13.6953949Z type_as: "i64[1, 2048][2048, 1]cuda:0" = cumsum.type_as(attention_mask_1); cumsum = None 2025-03-14T06:45:13.6954658Z mul: "i64[1, 2048][2048, 1]cuda:0" = type_as * attention_mask_1; type_as = attention_mask_1 = None 2025-03-14T06:45:13.6955184Z long_1: "i64[1, 2048][2048, 1]cuda:0" = mul.long(); mul = None 2025-03-14T06:45:13.6955643Z positions: "i64[1, 2048][2048, 1]cuda:0" = long_1 - 1; long_1 = None 2025-03-14T06:45:13.6956037Z 2025-03-14T06:45:13.6956771Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:107 in forward, code: positions = positions[:, past_key_values_length:] 2025-03-14T06:45:13.6957814Z positions_1: "i64[1, 2048][2048, 1]cuda:0" = positions[(slice(None, None, None), slice(0, None, None))]; positions = None 2025-03-14T06:45:13.6958341Z 2025-03-14T06:45:13.6959175Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:109 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:13.6960066Z add_1: "i64[1, 2048][2048, 1]cuda:0" = positions_1 + 2; positions_1 = None 2025-03-14T06:45:13.6961298Z pos_embeds: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:13.6962406Z 2025-03-14T06:45:13.6963254Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:865 in forward, code: hidden_states = inputs_embeds + pos_embeds 2025-03-14T06:45:13.6964312Z hidden_states: "f32[1, 2048, 768][1572864, 768, 1]cuda:0" = inputs_embeds + pos_embeds; inputs_embeds = pos_embeds = hidden_states = None 2025-03-14T06:45:13.6964883Z 2025-03-14T06:45:13.6965558Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:894 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:13.6966366Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:13.6966715Z 2025-03-14T06:45:13.6967410Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py:895 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:13.6968604Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:13.6969032Z 2025-03-14T06:45:19.4476564Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:45:19.4477366Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/opt/modeling_opt.py", line 109, in forward 2025-03-14T06:45:19.4478086Z return super().forward(positions + self.offset) 2025-03-14T06:45:19.4478343Z 2025-03-14T06:45:25.7958895Z Compilation time (from dynamo_timed): 6.728956649000001 2025-03-14T06:45:25.7972604Z pass 2025-03-14T06:45:25.8220031Z TIMING: entire_frame_compile:5.21235 gc:0.00805 _recursive_pre_grad_passes:0.00731 _recursive_joint_graph_passes:0.25579 inductor_compile:3.68281 backend_compile:4.1227 _recursive_post_grad_passes:0.11866 async_compile.precompile:0.4083 async_compile.wait:1.15013 code_gen:2.67449 cudagraphify.get_container:0.1689 pad_mm_benchmark:0.01649 entire_backward_compile:1.51661 CUDAGraphNode.record:5.69651 total_wall_time:6.72896 2025-03-14T06:45:25.8224604Z STATS: call_* op count: 82 | FakeTensorMode.__torch_dispatch__:4836 | ProxyTorchDispatchMode.__torch_dispatch__:2049 | FakeTensor.__torch_dispatch__:703 2025-03-14T06:45:25.8226374Z Dynamo produced 6 graphs covering 82 ops with 6 graph breaks (5 unique) 2025-03-14T06:45:31.1709629Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:45:31.1710845Z warnings.warn( 2025-03-14T06:45:31.3936021Z 2025-03-14T06:45:33.3556765Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:45:33.3557132Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:45:33.3557477Z cuda train PLBartForCausalLM 2025-03-14T06:45:33.3726854Z WARNING:common:fp64 golden ref were not generated for PLBartForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:45:35.0518169Z 2025-03-14T06:45:35.0518826Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:35.0521687Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:35.0523706Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:35.0524499Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:35.0525648Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:35.0527040Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:35.0528263Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:35.0529016Z 2025-03-14T06:45:35.0529289Z # No stacktrace found for following nodes 2025-03-14T06:45:35.0529888Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:35.0530442Z 2025-03-14T06:45:35.0531240Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:968 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:35.0532261Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:45:35.0532745Z 2025-03-14T06:45:35.0533574Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:979 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:45:35.0535422Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(l_cloned_inputs_input_ids_, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_cloned_inputs_input_ids_ = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:35.0536833Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:35.0537318Z 2025-03-14T06:45:35.0538153Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:35.0539350Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0539887Z 2025-03-14T06:45:35.0549178Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
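        # Descriptive note on the ops below: _make_causal_mask builds the additive causal mask.
        # A [1024, 1024] buffer filled with torch.finfo(torch.float32).min is zeroed wherever
        # mask_cond < (mask_cond + 1).view(1024, 1), i.e. on and below the diagonal, so only
        # future positions keep the large negative value that softmax will suppress; the result
        # is then broadcast to [1, 1, 1024, 1024] via the [None, None, :, :] indexing and expand.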
2025-03-14T06:45:35.0550399Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0550852Z 2025-03-14T06:45:35.0551658Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:35.0552553Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:35.0552962Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:45:35.0553446Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:35.0554147Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:35.0554715Z 2025-03-14T06:45:35.0555372Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:35.0556198Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:35.0556608Z 2025-03-14T06:45:35.0557441Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:35.0558720Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:35.0559580Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:45:35.0560144Z 2025-03-14T06:45:35.0560823Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:35.0561788Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0562299Z 2025-03-14T06:45:35.0562952Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:35.0563802Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:45:35.0564223Z 2025-03-14T06:45:35.0564977Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:35.0565884Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:35.0567098Z positions_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:35.0568540Z 2025-03-14T06:45:35.0569298Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1020 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:45:35.0570361Z positions_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:45:35.0570880Z 2025-03-14T06:45:35.0571618Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1022 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:45:35.0572658Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:45:35.0573186Z 2025-03-14T06:45:35.0573961Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1023 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:35.0576382Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:35.0578088Z 2025-03-14T06:45:35.0579010Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1025 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:35.0580389Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:35.0581065Z 2025-03-14T06:45:35.0581779Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1054 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:35.0582737Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:35.0583087Z 2025-03-14T06:45:35.0583822Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1055 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:35.0584755Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:35.0585174Z 2025-03-14T06:45:35.0585331Z 2025-03-14T06:45:35.0585456Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:35.0587187Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:35.0588961Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:35.0589733Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:35.0590847Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:35.0592018Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:35.0593201Z 
l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:35.0593931Z 2025-03-14T06:45:35.0594183Z # No stacktrace found for following nodes 2025-03-14T06:45:35.0594834Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:35.0595376Z 2025-03-14T06:45:35.0596125Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:968 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:35.0597116Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:45:35.0597583Z 2025-03-14T06:45:35.0598367Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:979 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:45:35.0600135Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(l_cloned_inputs_input_ids_, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_cloned_inputs_input_ids_ = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:35.0601677Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:35.0602159Z 2025-03-14T06:45:35.0602973Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:35.0604104Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0604635Z 2025-03-14T06:45:35.0605380Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:45:35.0606438Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0606878Z 2025-03-14T06:45:35.0607685Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:35.0608580Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:35.0609005Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:45:35.0609515Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:35.0610093Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:35.0610560Z 2025-03-14T06:45:35.0611213Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:35.0612046Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:35.0612454Z 2025-03-14T06:45:35.0613283Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, 
tgt_len + past_key_values_length) 2025-03-14T06:45:35.0614476Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:35.0615335Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:45:35.0615893Z 2025-03-14T06:45:35.0616572Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:35.0617546Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:35.0618052Z 2025-03-14T06:45:35.0618697Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:35.0619591Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:45:35.0620009Z 2025-03-14T06:45:35.0620755Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:35.0621662Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:35.0622966Z positions_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:35.0624066Z 2025-03-14T06:45:35.0624823Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1020 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:45:35.0625873Z positions_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:45:35.0626391Z 2025-03-14T06:45:35.0627120Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1022 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:45:35.0628268Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:45:35.0628803Z 2025-03-14T06:45:35.0629636Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1023 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:35.0631903Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:35.0633598Z 2025-03-14T06:45:35.0634570Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1025 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:35.0635961Z 
hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:35.0636636Z 2025-03-14T06:45:35.0637340Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1054 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:35.0638189Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:35.0638539Z 2025-03-14T06:45:35.0639277Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1055 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:35.0640210Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:35.0640630Z 2025-03-14T06:45:36.0061113Z 2025-03-14T06:45:36.0061564Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:36.0063444Z def forward(self, L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:36.0065158Z l_input_ids_ = L_input_ids_ 2025-03-14T06:45:36.0065880Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:36.0067043Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:36.0068879Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:36.0070152Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:36.0070893Z 2025-03-14T06:45:36.0071670Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:968 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:36.0072631Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:45:36.0073184Z 2025-03-14T06:45:36.0073980Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:979 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:45:36.0075770Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(l_input_ids_, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_input_ids_ = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:36.0077070Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:36.0077544Z 2025-03-14T06:45:36.0078374Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, 
code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:36.0079530Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:36.0080086Z 2025-03-14T06:45:36.0080846Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:45:36.0081821Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:45:36.0082269Z 2025-03-14T06:45:36.0083073Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:36.0083979Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:36.0084384Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:45:36.0084882Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:36.0085466Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:36.0085931Z 2025-03-14T06:45:36.0086594Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:36.0087431Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:36.0087841Z 2025-03-14T06:45:36.0088680Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:36.0089884Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:36.0090754Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:45:36.0091318Z 2025-03-14T06:45:36.0092089Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:36.0093061Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:36.0093576Z 2025-03-14T06:45:36.0094230Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:36.0095083Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:45:36.0095507Z 2025-03-14T06:45:36.0096346Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:36.0097259Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:36.0098480Z positions_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = 
l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:36.0099584Z 2025-03-14T06:45:36.0100337Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1020 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:45:36.0101391Z positions_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:45:36.0101913Z 2025-03-14T06:45:36.0102643Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1022 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:45:36.0103691Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:45:36.0104220Z 2025-03-14T06:45:36.0105001Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1023 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:36.0107288Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:36.0109010Z 2025-03-14T06:45:36.0109988Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1025 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:36.0111361Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:36.0112037Z 2025-03-14T06:45:36.0112754Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1054 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:36.0113599Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:36.0113951Z 2025-03-14T06:45:36.0114756Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1055 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:36.0115770Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:36.0116194Z 2025-03-14T06:45:40.9821387Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:45:40.9822211Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py", line 106, in forward 2025-03-14T06:45:40.9822925Z return super().forward(positions + self.offset) 2025-03-14T06:45:40.9823174Z 2025-03-14T06:45:44.4947895Z Compilation time (from dynamo_timed): 5.9914223479999995 2025-03-14T06:45:44.4956581Z pass 2025-03-14T06:45:44.5286215Z TIMING: entire_frame_compile:4.6018 gc:0.0057 _recursive_pre_grad_passes:0.00596 _recursive_joint_graph_passes:0.51038 inductor_compile:2.78657 backend_compile:3.54759 async_compile.precompile:0.11276 async_compile.wait:0.71786 cudagraphify.get_container:0.17826 _recursive_post_grad_passes:0.1132 code_gen:1.82765 pad_mm_benchmark:0.26508 entire_backward_compile:1.38962 CUDAGraphNode.record:3.12952 total_wall_time:5.99142 2025-03-14T06:45:44.5288400Z STATS: call_* op count: 60 | FakeTensorMode.__torch_dispatch__:4410 | FakeTensor.__torch_dispatch__:675 | ProxyTorchDispatchMode.__torch_dispatch__:1886 2025-03-14T06:45:44.5289209Z Dynamo produced 6 graphs covering 60 ops with 6 graph breaks (5 unique) 2025-03-14T06:45:49.9113682Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:45:49.9114944Z warnings.warn( 2025-03-14T06:45:50.1601518Z 2025-03-14T06:45:53.5873303Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:45:53.5873667Z loading model: 0it [00:03, ?it/s] 2025-03-14T06:45:53.5874032Z cuda train PLBartForConditionalGeneration 2025-03-14T06:45:53.6501489Z WARNING:common:fp64 golden ref were not generated for PLBartForConditionalGeneration. 
Setting accuracy check to cosine 2025-03-14T06:45:55.9422716Z 2025-03-14T06:45:55.9423277Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:55.9425223Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:55.9427162Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:45:55.9427623Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:55.9428405Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:55.9429518Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:55.9430686Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:55.9431868Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:55.9432592Z 2025-03-14T06:45:55.9432867Z # No stacktrace found for following nodes 2025-03-14T06:45:55.9433446Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:55.9433982Z 2025-03-14T06:45:55.9435363Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:71 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:45:55.9436431Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:45:55.9436951Z 2025-03-14T06:45:55.9437820Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:76 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:45:55.9438820Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:45:55.9439565Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:45:55.9440066Z 2025-03-14T06:45:55.9440948Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:78 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:45:55.9441958Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:45:55.9442364Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:45:55.9442746Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:45:55.9443162Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:45:55.9443539Z 2025-03-14T06:45:55.9444413Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:79 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:45:55.9445521Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:45:55.9446097Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:45:55.9446507Z 2025-03-14T06:45:55.9447320Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:80 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:45:55.9448408Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:45:55.9448998Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:45:55.9449676Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:45:55.9450264Z 2025-03-14T06:45:55.9451042Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:81 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:45:55.9452285Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:45:55.9452970Z 2025-03-14T06:45:55.9453729Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:55.9454787Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:45:55.9455299Z 2025-03-14T06:45:55.9456095Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:55.9457862Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:55.9459137Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:55.9459606Z 2025-03-14T06:45:55.9460291Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:55.9461251Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:55.9461828Z 2025-03-14T06:45:55.9462468Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:55.9463310Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:55.9463727Z 2025-03-14T06:45:55.9464472Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 
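        # Descriptive note on the ops below: the positional embedding table has 1026 rows for a
        # maximum of 1024 positions, because PLBart's learned positional embedding adds its offset
        # of 2 to every position index before the lookup (the same two-row reservation BART uses).
        # The "skipping cudagraphs due to deterministic index put" message later in this log points
        # at this same source line, most likely because the embedding backward lowers to a
        # deterministic index_put that the cudagraph path does not capture.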
2025-03-14T06:45:55.9465402Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:55.9466599Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:55.9467681Z 2025-03-14T06:45:55.9468677Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:55.9469699Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:55.9470188Z 2025-03-14T06:45:55.9470909Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:45:55.9471931Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:55.9472449Z 2025-03-14T06:45:55.9473240Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:55.9475610Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:55.9477316Z 2025-03-14T06:45:55.9478222Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:55.9479587Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:55.9480264Z 2025-03-14T06:45:55.9480971Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:55.9481959Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:55.9482310Z 2025-03-14T06:45:55.9483091Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:55.9484055Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:55.9484464Z 2025-03-14T06:45:55.9484630Z 2025-03-14T06:45:55.9484758Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:55.9486636Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: 
"f32[1026, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:55.9488661Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:45:55.9489091Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:55.9489868Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:55.9490987Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:55.9492163Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:55.9493350Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:55.9494079Z 2025-03-14T06:45:55.9494344Z # No stacktrace found for following nodes 2025-03-14T06:45:55.9494928Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:55.9495512Z 2025-03-14T06:45:55.9496269Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:71 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:45:55.9497316Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:45:55.9497838Z 2025-03-14T06:45:55.9498704Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:76 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:45:55.9499704Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:45:55.9500275Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:45:55.9500771Z 2025-03-14T06:45:55.9501649Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:78 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:45:55.9502654Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:45:55.9503066Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:45:55.9503451Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:45:55.9503864Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:45:55.9504243Z 2025-03-14T06:45:55.9505201Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:79 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:45:55.9506354Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:45:55.9506931Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:45:55.9507337Z 2025-03-14T06:45:55.9508150Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:80 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:45:55.9509328Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:45:55.9509926Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:45:55.9510619Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:45:55.9511203Z 2025-03-14T06:45:55.9511979Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:81 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:45:55.9513220Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:45:55.9513903Z 2025-03-14T06:45:55.9514742Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:55.9515805Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:45:55.9516329Z 2025-03-14T06:45:55.9517125Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:55.9518798Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:55.9520068Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:55.9520542Z 2025-03-14T06:45:55.9521220Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:55.9522191Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:55.9522695Z 2025-03-14T06:45:55.9523340Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:55.9524177Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:55.9524592Z 2025-03-14T06:45:55.9525358Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:55.9526267Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:55.9527533Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:55.9528612Z 2025-03-14T06:45:55.9529361Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:55.9530385Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:55.9539942Z 2025-03-14T06:45:55.9540720Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:45:55.9541886Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:55.9542418Z 2025-03-14T06:45:55.9543207Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:55.9545498Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:55.9547240Z 2025-03-14T06:45:55.9548147Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:55.9549536Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:55.9550217Z 2025-03-14T06:45:55.9550928Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:55.9551773Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:55.9552114Z 2025-03-14T06:45:55.9552895Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:55.9553868Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:55.9554358Z 2025-03-14T06:45:55.9554492Z 2025-03-14T06:45:55.9554616Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:55.9556536Z def forward(self, L_cloned_inputs_labels_: "i64[1, 1024][1024, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:55.9558432Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:45:55.9558864Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:45:55.9559636Z 
l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:55.9560852Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:55.9562030Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:55.9563221Z l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:55.9563950Z 2025-03-14T06:45:55.9564216Z # No stacktrace found for following nodes 2025-03-14T06:45:55.9564798Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:45:55.9565412Z 2025-03-14T06:45:55.9566174Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:71 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:45:55.9567231Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_labels_.clone(); l_cloned_inputs_labels_ = None 2025-03-14T06:45:55.9567748Z 2025-03-14T06:45:55.9568882Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:76 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:45:55.9569881Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:45:55.9570450Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:45:55.9570955Z 2025-03-14T06:45:55.9571830Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:78 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:45:55.9572845Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:45:55.9573248Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:45:55.9573630Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:45:55.9574048Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:45:55.9574425Z 2025-03-14T06:45:55.9575298Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:79 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:45:55.9576408Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:45:55.9576985Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:45:55.9577396Z 2025-03-14T06:45:55.9578219Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:80 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:45:55.9579301Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:45:55.9579895Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:45:55.9580574Z prev_output_tokens[(slice(None, None, 
None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:45:55.9581153Z 2025-03-14T06:45:55.9581933Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:81 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:45:55.9583373Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:45:55.9584065Z 2025-03-14T06:45:55.9584824Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:55.9585879Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 1024); l_cloned_inputs_input_ids_ = None 2025-03-14T06:45:55.9586395Z 2025-03-14T06:45:55.9587195Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:55.9589000Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:55.9590272Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:55.9590742Z 2025-03-14T06:45:55.9591421Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:55.9592381Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:55.9592882Z 2025-03-14T06:45:55.9593527Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:55.9594423Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:55.9594839Z 2025-03-14T06:45:55.9595650Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:55.9596557Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:55.9597736Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:55.9598811Z 2025-03-14T06:45:55.9599559Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:55.9600583Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:55.9601077Z 2025-03-14T06:45:55.9601802Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = 
inputs_embeds + embed_pos 2025-03-14T06:45:55.9602833Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:55.9603357Z 2025-03-14T06:45:55.9604129Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:55.9606530Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:55.9608245Z 2025-03-14T06:45:55.9609150Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:55.9610525Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:55.9611201Z 2025-03-14T06:45:55.9611993Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:55.9612831Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:55.9613171Z 2025-03-14T06:45:55.9613959Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:55.9614924Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:55.9615338Z 2025-03-14T06:45:56.9080601Z 2025-03-14T06:45:56.9081034Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:56.9082886Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:56.9084713Z l_labels_ = L_labels_ 2025-03-14T06:45:56.9085024Z l_input_ids_ = L_input_ids_ 2025-03-14T06:45:56.9085739Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:56.9086875Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:56.9088065Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:56.9089274Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = 
L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:56.9090008Z 2025-03-14T06:45:56.9090788Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:71 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:45:56.9091759Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.clone(); l_labels_ = None 2025-03-14T06:45:56.9092193Z 2025-03-14T06:45:56.9093070Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:76 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:45:56.9094068Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:45:56.9094645Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = masked_fill_ = None 2025-03-14T06:45:56.9095148Z 2025-03-14T06:45:56.9096432Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:78 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:45:56.9097450Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:45:56.9097859Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:45:56.9098246Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:45:56.9098674Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:45:56.9099064Z 2025-03-14T06:45:56.9099942Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:79 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:45:56.9101209Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:45:56.9101795Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:45:56.9102211Z 2025-03-14T06:45:56.9103029Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:80 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:45:56.9104120Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:45:56.9104711Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:45:56.9105399Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:45:56.9106034Z 2025-03-14T06:45:56.9106808Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:81 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:45:56.9108068Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:45:56.9108754Z 2025-03-14T06:45:56.9109549Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:56.9110521Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 
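The GraphModule dumps above are Dynamo traces of PLBart's shift_tokens_right helper; the # File comments name the exact source lines being traced, with pad_token_id inlined as 1. A minimal sketch of that helper, reconstructed from those comments (the final return is assumed, since the trace stops at the last in-place write):

import torch

def shift_tokens_right(input_ids: torch.Tensor, pad_token_id: int = 1) -> torch.Tensor:
    # Replace -100 (ignored label positions) with the pad id, locate the last non-pad
    # token (EOS) in each row, rotate the sequence right by one position, and place
    # that EOS token at position 0 as the decoder start token.
    prev_output_tokens = input_ids.clone()
    prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id)
    index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1)
    decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze()
    prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone()
    prev_output_tokens[:, 0] = decoder_start_tokens
    return prev_output_tokens

The in-place slice assignments are what appear in the graph as the setitem nodes on prev_output_tokens.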
2025-03-14T06:45:56.9110957Z 2025-03-14T06:45:56.9111756Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:56.9113455Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:56.9114829Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:56.9115308Z 2025-03-14T06:45:56.9116038Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:56.9116998Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:56.9117503Z 2025-03-14T06:45:56.9118153Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:56.9118990Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:56.9119409Z 2025-03-14T06:45:56.9120254Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:56.9121166Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:56.9122357Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:56.9123444Z 2025-03-14T06:45:56.9124271Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:56.9125299Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:56.9125794Z 2025-03-14T06:45:56.9126522Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:45:56.9127553Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:56.9128082Z 2025-03-14T06:45:56.9128859Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:56.9131152Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = 
None 2025-03-14T06:45:56.9132866Z 2025-03-14T06:45:56.9133779Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:56.9135146Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:56.9135823Z 2025-03-14T06:45:56.9136531Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:56.9137376Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:56.9137726Z 2025-03-14T06:45:56.9138507Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:56.9139484Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:56.9139897Z 2025-03-14T06:45:56.9140022Z 2025-03-14T06:45:56.9140152Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:56.9142028Z def forward(self, L_labels_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:56.9143802Z l_labels_ = L_labels_ 2025-03-14T06:45:56.9144103Z l_input_ids_ = L_input_ids_ 2025-03-14T06:45:56.9144818Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:56.9145964Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:56.9147156Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:56.9148445Z l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:56.9149189Z 2025-03-14T06:45:56.9149945Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:71 in shift_tokens_right, code: prev_output_tokens = input_ids.clone() 2025-03-14T06:45:56.9150915Z prev_output_tokens: "i64[1, 1024][1024, 1]cuda:0" = l_labels_.clone(); l_labels_ = None 2025-03-14T06:45:56.9151356Z 2025-03-14T06:45:56.9152224Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:76 in shift_tokens_right, code: prev_output_tokens.masked_fill_(prev_output_tokens == -100, pad_token_id) 2025-03-14T06:45:56.9153223Z eq: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens == -100 2025-03-14T06:45:56.9153798Z masked_fill_: "i64[1, 1024][1024, 1]cuda:0" = prev_output_tokens.masked_fill_(eq, 1); eq = 
masked_fill_ = None 2025-03-14T06:45:56.9154356Z 2025-03-14T06:45:56.9155240Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:78 in shift_tokens_right, code: index_of_eos = (prev_output_tokens.ne(pad_token_id).sum(dim=1) - 1).unsqueeze(-1) 2025-03-14T06:45:56.9156251Z ne: "b8[1, 1024][1024, 1]cuda:0" = prev_output_tokens.ne(1) 2025-03-14T06:45:56.9156659Z sum_1: "i64[1][1]cuda:0" = ne.sum(dim = 1); ne = None 2025-03-14T06:45:56.9157046Z sub: "i64[1][1]cuda:0" = sum_1 - 1; sum_1 = None 2025-03-14T06:45:56.9157472Z index_of_eos: "i64[1, 1][1, 1]cuda:0" = sub.unsqueeze(-1); sub = None 2025-03-14T06:45:56.9157854Z 2025-03-14T06:45:56.9158728Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:79 in shift_tokens_right, code: decoder_start_tokens = prev_output_tokens.gather(1, index_of_eos).squeeze() 2025-03-14T06:45:56.9159840Z gather: "i64[1, 1][1, 1]cuda:0" = prev_output_tokens.gather(1, index_of_eos); index_of_eos = None 2025-03-14T06:45:56.9160426Z decoder_start_tokens: "i64[][]cuda:0" = gather.squeeze(); gather = None 2025-03-14T06:45:56.9160836Z 2025-03-14T06:45:56.9161654Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:80 in shift_tokens_right, code: prev_output_tokens[:, 1:] = prev_output_tokens[:, :-1].clone() 2025-03-14T06:45:56.9162737Z getitem: "i64[1, 1023][1024, 1]cuda:0" = prev_output_tokens[(slice(None, None, None), slice(None, -1, None))] 2025-03-14T06:45:56.9163329Z clone_1: "i64[1, 1023][1023, 1]cuda:0" = getitem.clone(); getitem = None 2025-03-14T06:45:56.9164011Z prev_output_tokens[(slice(None, None, None), slice(1, None, None))] = clone_1; setitem = prev_output_tokens; clone_1 = setitem = None 2025-03-14T06:45:56.9164595Z 2025-03-14T06:45:56.9165462Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:81 in shift_tokens_right, code: prev_output_tokens[:, 0] = decoder_start_tokens 2025-03-14T06:45:56.9166702Z prev_output_tokens[(slice(None, None, None), 0)] = decoder_start_tokens; setitem_1 = prev_output_tokens; prev_output_tokens = decoder_start_tokens = setitem_1 = None 2025-03-14T06:45:56.9167389Z 2025-03-14T06:45:56.9168435Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:56.9169413Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:45:56.9169850Z 2025-03-14T06:45:56.9170647Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:56.9172556Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:56.9173841Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:56.9174314Z 2025-03-14T06:45:56.9174994Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 
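The constant 27.712812921102035 multiplying the token embeddings in these traces is PLBart's embed_scale, which Dynamo has folded into the graph as a literal; it equals sqrt(d_model) for d_model = 768, consistent with embedding scaling being enabled for this model. A one-line check using only the standard library:

import math
assert abs(math.sqrt(768) - 27.712812921102035) < 1e-9  # embed_scale = sqrt(d_model)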
2025-03-14T06:45:56.9175962Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:56.9176475Z 2025-03-14T06:45:56.9177130Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:56.9177975Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:56.9178394Z 2025-03-14T06:45:56.9179142Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:56.9180050Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:56.9181247Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:56.9182342Z 2025-03-14T06:45:56.9183093Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:56.9184122Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:56.9184620Z 2025-03-14T06:45:56.9185352Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:45:56.9186387Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:56.9186910Z 2025-03-14T06:45:56.9187677Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:56.9190081Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:56.9191800Z 2025-03-14T06:45:56.9192712Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:56.9194083Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:56.9194907Z 2025-03-14T06:45:56.9195641Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:56.9196512Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:56.9196861Z 2025-03-14T06:45:56.9197641Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:56.9198616Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:56.9199029Z 2025-03-14T06:45:57.2514362Z 2025-03-14T06:45:57.2515013Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:57.2516970Z def forward(self, L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_self_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:57.2518650Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:45:57.2519047Z l_input_ids_ = L_input_ids_ 2025-03-14T06:45:57.2519663Z l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:57.2520606Z l_self_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:57.2521608Z l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:57.2522624Z l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:57.2523261Z 2025-03-14T06:45:57.2524050Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:748 in forward, code: input_ids = input_ids.view(-1, input_ids.shape[-1]) 2025-03-14T06:45:57.2525027Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_input_ids_.view(-1, 1024); l_input_ids_ = None 2025-03-14T06:45:57.2525460Z 2025-03-14T06:45:57.2526259Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:755 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:45:57.2527845Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_encoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:57.2529034Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:57.2529888Z 2025-03-14T06:45:57.2530570Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:57.2531525Z arange: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:57.2532022Z 2025-03-14T06:45:57.2532668Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:57.2533507Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:45:57.2534074Z 2025-03-14T06:45:57.2534818Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:57.2535758Z add: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:57.2536866Z embed_pos: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add, l_self_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:57.2537855Z 2025-03-14T06:45:57.2538598Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:758 in forward, code: embed_pos = embed_pos.to(inputs_embeds.device) 2025-03-14T06:45:57.2539612Z embed_pos_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embed_pos.to(device(type='cuda', index=0)); embed_pos = None 2025-03-14T06:45:57.2540104Z 2025-03-14T06:45:57.2540825Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:760 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:45:57.2541861Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + embed_pos_1; inputs_embeds = embed_pos_1 = None 2025-03-14T06:45:57.2542376Z 2025-03-14T06:45:57.2543139Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:761 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:57.2545229Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_encoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_encoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:57.2546807Z 2025-03-14T06:45:57.2547717Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:762 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:57.2549081Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:57.2549751Z 2025-03-14T06:45:57.2550458Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:794 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:57.2551294Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:57.2551649Z 2025-03-14T06:45:57.2552426Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:795 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:45:57.2553474Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:45:57.2553882Z 2025-03-14T06:45:57.3959749Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:45:57.3960765Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py", line 106, in forward 2025-03-14T06:45:57.3961474Z return super().forward(positions + self.offset) 2025-03-14T06:45:57.3961725Z 2025-03-14T06:45:59.3891632Z 2025-03-14T06:45:59.3892088Z class GraphModule(torch.nn.Module): 2025-03-14T06:45:59.3893981Z def forward(self, dict_getitem_L_stack0_list_dict_keys_L_stack0_0_: "f32[1, 1024, 768][786432, 768, 1]cuda:0", L_decoder_input_ids_: "i64[1, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50005, 768][768, 1]cuda:0", L_self_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1026, 768][768, 1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[768][1]cuda:0", L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[768][1]cuda:0"): 2025-03-14T06:45:59.3896297Z dict_getitem_l_stack0_list_dict_keys_l_stack0_0_ = dict_getitem_L_stack0_list_dict_keys_L_stack0_0_ 2025-03-14T06:45:59.3896894Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:45:59.3897546Z l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:45:59.3898498Z l_self_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:45:59.3899504Z l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:45:59.3900528Z l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:45:59.3901182Z 2025-03-14T06:45:59.3901967Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:968 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:45:59.3902948Z input_ids: "i64[1, 1024][1024, 1]cuda:0" = l_decoder_input_ids_.view(-1, 1024); input_ids = None 2025-03-14T06:45:59.3903397Z 2025-03-14T06:45:59.3904189Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:979 in forward, code: inputs_embeds = self.embed_tokens(input) * self.embed_scale 2025-03-14T06:45:59.3905836Z embedding: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(l_decoder_input_ids_, l_self_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_decoder_input_ids_ = l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:45:59.3907157Z inputs_embeds: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = embedding * 27.712812921102035; embedding = None 2025-03-14T06:45:59.3907625Z 2025-03-14T06:45:59.3908443Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:45:59.3909558Z mask: "f32[1024, 1024][1024, 1]cuda:0" = torch.full((1024, 1024), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:45:59.3910092Z 2025-03-14T06:45:59.3910837Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 
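The message "skipping cudagraphs due to deterministic index put" above is Inductor declining CUDA graph capture for this compiled graph: the accuracy run appears to execute with deterministic algorithms enabled, and the deterministic lowering of an index-put in the captured graph is not considered cudagraph safe, so Inductor falls back to regular kernel launches for it. The relevant PyTorch toggles, shown only as a sketch of how one might inspect or relax that setting outside an accuracy run:

import torch

# Likely True while the harness checks numerics (an assumption based on the log message above).
print(torch.are_deterministic_algorithms_enabled())

# For a pure speed measurement, relaxing determinism keeps such graphs eligible for cudagraphs.
torch.use_deterministic_algorithms(False)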
2025-03-14T06:45:59.3911814Z mask_cond: "i64[1024][1]cuda:0" = torch.arange(1024, device = device(type='cuda', index=0)) 2025-03-14T06:45:59.3912257Z 2025-03-14T06:45:59.3913218Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:45:59.3914122Z add: "i64[1024][1]cuda:0" = mask_cond + 1 2025-03-14T06:45:59.3914605Z view_1: "i64[1024, 1][1, 1]cuda:0" = add.view(1024, 1); add = None 2025-03-14T06:45:59.3915090Z lt: "b8[1024, 1024][1024, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:45:59.3915668Z masked_fill_: "f32[1024, 1024][1024, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:45:59.3916134Z 2025-03-14T06:45:59.3916929Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:45:59.3917762Z mask_1: "f32[1024, 1024][1024, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:45:59.3918173Z 2025-03-14T06:45:59.3919006Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:45:59.3920204Z getitem: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:45:59.3921062Z causal_4d_mask: "f32[1, 1, 1024, 1024][1048576, 1048576, 1024, 1]cuda:0" = getitem.expand(1, 1, 1024, 1024); getitem = causal_4d_mask = None 2025-03-14T06:45:59.3921622Z 2025-03-14T06:45:59.3922309Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:102 in forward, code: positions = torch.arange( 2025-03-14T06:45:59.3923282Z arange_1: "i64[1024][1]cuda:0" = torch.arange(0, 1024, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:45:59.3923795Z 2025-03-14T06:45:59.3924458Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:104 in forward, code: ).expand(bsz, -1) 2025-03-14T06:45:59.3925319Z positions: "i64[1, 1024][1024, 1]cuda:0" = arange_1.expand(1, -1); arange_1 = None 2025-03-14T06:45:59.3925743Z 2025-03-14T06:45:59.3926498Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:106 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:45:59.3927414Z add_1: "i64[1, 1024][1024, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:45:59.3928552Z positions_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.embedding(add_1, l_self_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add_1 = l_self_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:45:59.3929573Z 2025-03-14T06:45:59.3930329Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1020 in forward, code: positions = positions.to(inputs_embeds.device) 2025-03-14T06:45:59.3939662Z positions_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = positions_1.to(device(type='cuda', index=0)); positions_1 = None 2025-03-14T06:45:59.3940186Z 2025-03-14T06:45:59.3940927Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1022 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:45:59.3941978Z hidden_states: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = inputs_embeds + positions_2; inputs_embeds = positions_2 = None 2025-03-14T06:45:59.3942515Z 2025-03-14T06:45:59.3943409Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1023 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:45:59.3945524Z hidden_states_1: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (768,), l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:45:59.3947114Z 2025-03-14T06:45:59.3948037Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1025 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:45:59.3949498Z hidden_states_2: "f32[1, 1024, 768][786432, 768, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:45:59.3950182Z 2025-03-14T06:45:59.3950900Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1054 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:45:59.3951750Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:45:59.3952109Z 2025-03-14T06:45:59.3952845Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/plbart/modeling_plbart.py:1055 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:45:59.3953779Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:45:59.3954215Z 2025-03-14T06:46:13.7349819Z Compilation time (from dynamo_timed): 9.594474905 2025-03-14T06:46:13.7367260Z pass 2025-03-14T06:46:13.7733206Z TIMING: entire_frame_compile:7.21548 gc:0.00888 _recursive_pre_grad_passes:0.00758 _recursive_joint_graph_passes:0.41869 inductor_compile:4.69281 backend_compile:5.40998 async_compile.precompile:0.05556 async_compile.wait:0.8108 cudagraphify.get_container:0.19437 pad_mm_benchmark:0.04322 _recursive_post_grad_passes:0.22 code_gen:2.59575 entire_backward_compile:2.379 CUDAGraphNode.record:7.31742 total_wall_time:9.59447 2025-03-14T06:46:13.7735208Z STATS: call_* op count: 137 | FakeTensorMode.__torch_dispatch__:9362 | FakeTensor.__torch_dispatch__:1471 | ProxyTorchDispatchMode.__torch_dispatch__:4380 2025-03-14T06:46:13.7736037Z Dynamo produced 8 graphs covering 137 ops with 8 graph breaks (5 unique) 2025-03-14T06:46:19.3574825Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
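The compile summary above ("Dynamo produced 8 graphs covering 137 ops with 8 graph breaks (5 unique)", together with the TIMING and STATS lines) is the per-model report for this PLBart benchmark; each dumped GraphModule ends right at the data-dependent layerdrop check, which is consistent with graph breaks at those points. A hedged sketch of how to obtain the same breakdown for an arbitrary callable with torch._dynamo.explain (the toy function is only illustrative):

import torch

def fn(x):
    # Stand-in for a model forward; any callable works the same way.
    return torch.relu(x).sum()

explanation = torch._dynamo.explain(fn)(torch.randn(8))
print(explanation.graph_count, explanation.graph_break_count)
print(explanation.break_reasons)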
2025-03-14T06:46:19.3576061Z warnings.warn( 2025-03-14T06:46:20.0173120Z 2025-03-14T06:46:26.2343781Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:46:26.2344230Z loading model: 0it [00:06, ?it/s] 2025-03-14T06:46:26.2344595Z cuda train PegasusForCausalLM 2025-03-14T06:46:26.2648623Z WARNING:common:fp64 golden ref were not generated for PegasusForCausalLM. Setting accuracy check to cosine 2025-03-14T06:46:28.0682407Z 2025-03-14T06:46:28.0682952Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:28.0684515Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:28.0686148Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:46:28.0686980Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:28.0688525Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:28.0689271Z 2025-03-14T06:46:28.0689546Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0690150Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:46:28.0690708Z 2025-03-14T06:46:28.0691523Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:976 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:28.0692783Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:46:28.0693320Z 2025-03-14T06:46:28.0694172Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:986 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:28.0695934Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:28.0697208Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:28.0697667Z 2025-03-14T06:46:28.0698495Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:46:28.0699621Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0700139Z 2025-03-14T06:46:28.0700908Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:46:28.0701880Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0702389Z 2025-03-14T06:46:28.0703558Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in 
_make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:46:28.0704536Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:46:28.0704981Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:46:28.0705489Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:46:28.0706066Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:46:28.0706533Z 2025-03-14T06:46:28.0707202Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:46:28.0708039Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:46:28.0708445Z 2025-03-14T06:46:28.0709292Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:46:28.0710491Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:46:28.0711410Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:46:28.0711957Z 2025-03-14T06:46:28.0712234Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0712719Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:28.0713168Z 2025-03-14T06:46:28.0713867Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:28.0715034Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0715556Z 2025-03-14T06:46:28.0716373Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:28.0718024Z positions_1: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:28.0719172Z 2025-03-14T06:46:28.0719441Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0719931Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:28.0720373Z 2025-03-14T06:46:28.0721128Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1002 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:46:28.0722204Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + positions_1; inputs_embeds = positions_1 = None 2025-03-14T06:46:28.0722751Z 2025-03-14T06:46:28.0723701Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1004 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:28.0725101Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = 
torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:28.0725792Z 2025-03-14T06:46:28.0726531Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1032 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:28.0727398Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:28.0727784Z 2025-03-14T06:46:28.0728538Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1033 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:46:28.0729496Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:46:28.0729928Z 2025-03-14T06:46:28.0730068Z 2025-03-14T06:46:28.0730197Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:28.0731312Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:28.0732473Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:46:28.0733260Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:28.0734397Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:28.0735122Z 2025-03-14T06:46:28.0735487Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0736086Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:46:28.0736634Z 2025-03-14T06:46:28.0737404Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:976 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:28.0738460Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:46:28.0738980Z 2025-03-14T06:46:28.0739805Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:986 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:28.0741597Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:28.0742862Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:28.0743321Z 2025-03-14T06:46:28.0744153Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:46:28.0745323Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0745848Z 2025-03-14T06:46:28.0747169Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:46:28.0748147Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0748595Z 2025-03-14T06:46:28.0749401Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:46:28.0750308Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:46:28.0750710Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:46:28.0751188Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:46:28.0751764Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:46:28.0752226Z 2025-03-14T06:46:28.0752894Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:46:28.0753727Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:46:28.0754132Z 2025-03-14T06:46:28.0755031Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:46:28.0756211Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:46:28.0757027Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:46:28.0757566Z 2025-03-14T06:46:28.0757836Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0758423Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:28.0758866Z 2025-03-14T06:46:28.0759570Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:28.0760559Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:28.0761081Z 2025-03-14T06:46:28.0761810Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:28.0763568Z positions_1: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:28.0764745Z 2025-03-14T06:46:28.0765041Z # No stacktrace found for following nodes 2025-03-14T06:46:28.0765530Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:28.0765974Z 2025-03-14T06:46:28.0766732Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1002 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:46:28.0768108Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = 
inputs_embeds + positions_1; inputs_embeds = positions_1 = None 2025-03-14T06:46:28.0768666Z 2025-03-14T06:46:28.0769609Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1004 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:28.0771029Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:28.0771712Z 2025-03-14T06:46:28.0772443Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1032 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:28.0773316Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:28.0773671Z 2025-03-14T06:46:28.0774420Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1033 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:46:28.0775380Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:46:28.0775806Z 2025-03-14T06:46:29.0152744Z 2025-03-14T06:46:29.0153284Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:29.0154541Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:29.0155718Z l_input_ids_ = L_input_ids_ 2025-03-14T06:46:29.0156489Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:29.0157635Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:29.0158376Z 2025-03-14T06:46:29.0159165Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:976 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:29.0161081Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:46:29.0161523Z 2025-03-14T06:46:29.0162343Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:986 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:29.0164057Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:29.0165521Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:29.0165976Z 2025-03-14T06:46:29.0166802Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:46:29.0168180Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 
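The Pegasus decoder traces inline transformers' _make_causal_mask, visible in the # File comments from modeling_attn_mask_utils.py: build a (tgt_len, tgt_len) matrix filled with the most negative float32 value, zero out the lower triangle, then broadcast to (bsz, 1, tgt_len, tgt_len). A sketch reconstructed from those comments, limited to the past_key_values_length == 0 case that this trace takes:

import torch

def make_causal_mask(tgt_len: int, dtype=torch.float32, device="cpu", bsz: int = 1):
    # Disallowed (future) positions keep finfo(dtype).min; allowed positions j <= i become 0.
    mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device)
    mask_cond = torch.arange(mask.size(-1), device=device)
    mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0)
    mask = mask.to(dtype)
    return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len)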
2025-03-14T06:46:29.0168689Z 2025-03-14T06:46:29.0169442Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:46:29.0170404Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:46:29.0170841Z 2025-03-14T06:46:29.0171645Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:46:29.0172542Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:46:29.0172937Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:46:29.0173405Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:46:29.0173956Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:46:29.0174406Z 2025-03-14T06:46:29.0175052Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:46:29.0175923Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:46:29.0176323Z 2025-03-14T06:46:29.0177157Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:46:29.0178331Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:46:29.0179139Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:46:29.0179657Z 2025-03-14T06:46:29.0179917Z # No stacktrace found for following nodes 2025-03-14T06:46:29.0180389Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:29.0180818Z 2025-03-14T06:46:29.0181501Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:29.0182491Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:29.0182999Z 2025-03-14T06:46:29.0183913Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:29.0185567Z positions_1: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:29.0186767Z 2025-03-14T06:46:29.0187027Z # No stacktrace found for following nodes 2025-03-14T06:46:29.0187502Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:29.0188053Z 2025-03-14T06:46:29.0188799Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1002 in forward, code: hidden_states = inputs_embeds + positions 
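The paired _set_grad_enabled(False) and _set_grad_enabled(True) nodes around the positional lookup in these Pegasus graphs are Dynamo capturing a no-grad region inside the positional embedding's forward, consistent with a torch.no_grad guard around the sinusoidal position lookup. A small illustration of the pattern; the random weight is only a stand-in for the (1024, 1024) position table seen in the trace:

import torch

weight = torch.randn(1024, 1024)            # stand-in for the position table in the trace
positions = torch.arange(0, 128, dtype=torch.int64)
with torch.no_grad():                       # shows up in the graph as _set_grad_enabled(False)/(True)
    pos_embed = torch.nn.functional.embedding(positions, weight)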
2025-03-14T06:46:29.0189863Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + positions_1; inputs_embeds = positions_1 = None 2025-03-14T06:46:29.0190400Z 2025-03-14T06:46:29.0191326Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1004 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:29.0192714Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:29.0193390Z 2025-03-14T06:46:29.0194108Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1032 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:29.0195048Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:29.0195397Z 2025-03-14T06:46:29.0196145Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1033 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:46:29.0197084Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:46:29.0197508Z 2025-03-14T06:46:40.0168270Z Compilation time (from dynamo_timed): 5.615560554 2025-03-14T06:46:40.0186630Z pass 2025-03-14T06:46:40.0950402Z TIMING: entire_frame_compile:4.33584 gc:0.00533 _recursive_pre_grad_passes:0.00593 _recursive_joint_graph_passes:0.45891 inductor_compile:2.66363 backend_compile:3.45713 async_compile.precompile:0.10531 async_compile.wait:0.69948 cudagraphify.get_container:0.17518 _recursive_post_grad_passes:0.10984 code_gen:1.73414 pad_mm_benchmark:0.2247 entire_backward_compile:1.27972 CUDAGraphNode.record:5.59356 total_wall_time:5.61556 2025-03-14T06:46:40.0952495Z STATS: call_* op count: 58 | FakeTensorMode.__torch_dispatch__:4268 | ProxyTorchDispatchMode.__torch_dispatch__:1822 | FakeTensor.__torch_dispatch__:664 2025-03-14T06:46:40.0953312Z Dynamo produced 6 graphs covering 58 ops with 6 graph breaks (5 unique) 2025-03-14T06:46:45.4793309Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:46:45.4794636Z warnings.warn( 2025-03-14T06:46:45.7119176Z 2025-03-14T06:46:55.9681589Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:46:55.9681980Z loading model: 0it [00:10, ?it/s] 2025-03-14T06:46:55.9682349Z cuda train PegasusForConditionalGeneration 2025-03-14T06:46:56.0582762Z WARNING:common:fp64 golden ref were not generated for PegasusForConditionalGeneration. 
Setting accuracy check to cosine 2025-03-14T06:46:58.5540964Z 2025-03-14T06:46:58.5541591Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:58.5543471Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:58.5544997Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:46:58.5545481Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:46:58.5545980Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:46:58.5546918Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:58.5548042Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:58.5548755Z 2025-03-14T06:46:58.5549016Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5549605Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:46:58.5550154Z 2025-03-14T06:46:58.5550940Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:58.5551987Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:46:58.5552496Z 2025-03-14T06:46:58.5553306Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:58.5555500Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:58.5556920Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:58.5557368Z 2025-03-14T06:46:58.5557624Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5558090Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:58.5558528Z 2025-03-14T06:46:58.5559220Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:58.5560210Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:58.5560716Z 2025-03-14T06:46:58.5561428Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:58.5563048Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); 
positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:58.5564170Z 2025-03-14T06:46:58.5564426Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5564904Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:58.5565333Z 2025-03-14T06:46:58.5566184Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:58.5567229Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:58.5567743Z 2025-03-14T06:46:58.5569035Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:58.5570412Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:58.5571323Z 2025-03-14T06:46:58.5572290Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:58.5573156Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:58.5573510Z 2025-03-14T06:46:58.5574291Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:46:58.5575269Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:58.5575672Z 2025-03-14T06:46:58.5575820Z 2025-03-14T06:46:58.5575950Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:58.5577332Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:58.5578750Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:46:58.5579225Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:46:58.5579717Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:46:58.5580486Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:58.5581597Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:58.5582309Z 2025-03-14T06:46:58.5582572Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5583150Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:46:58.5583683Z 2025-03-14T06:46:58.5584472Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:58.5585537Z input_ids: 
"i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:46:58.5586038Z 2025-03-14T06:46:58.5586845Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:58.5588535Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:58.5589938Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:58.5590385Z 2025-03-14T06:46:58.5590645Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5591110Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:58.5591536Z 2025-03-14T06:46:58.5592222Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:58.5593192Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:58.5593693Z 2025-03-14T06:46:58.5594640Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:58.5596261Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:58.5597379Z 2025-03-14T06:46:58.5597632Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5598113Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:58.5598541Z 2025-03-14T06:46:58.5599276Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:58.5600312Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:58.5600823Z 2025-03-14T06:46:58.5601740Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:58.5603115Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:58.5603788Z 2025-03-14T06:46:58.5604500Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:58.5605361Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:58.5605714Z 2025-03-14T06:46:58.5606504Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 
2025-03-14T06:46:58.5607483Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:58.5607889Z 2025-03-14T06:46:58.5608012Z 2025-03-14T06:46:58.5608144Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:58.5619444Z def forward(self, L_cloned_inputs_labels_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:58.5620927Z l_cloned_inputs_labels_ = L_cloned_inputs_labels_ 2025-03-14T06:46:58.5621420Z l_cloned_inputs_decoder_input_ids_ = L_cloned_inputs_decoder_input_ids_ 2025-03-14T06:46:58.5621917Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:46:58.5622808Z l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:58.5623929Z l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:58.5624669Z 2025-03-14T06:46:58.5624958Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5625537Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:46:58.5626073Z 2025-03-14T06:46:58.5626842Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:58.5627989Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:46:58.5628482Z 2025-03-14T06:46:58.5629302Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:58.5631007Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:58.5632262Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:58.5632708Z 2025-03-14T06:46:58.5632974Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5633437Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:58.5633866Z 2025-03-14T06:46:58.5634648Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:58.5635629Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:58.5636137Z 2025-03-14T06:46:58.5636847Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:58.5638465Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, 
l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_mod_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:58.5639587Z 2025-03-14T06:46:58.5639843Z # No stacktrace found for following nodes 2025-03-14T06:46:58.5640320Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:58.5640754Z 2025-03-14T06:46:58.5641490Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:58.5642522Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:58.5643037Z 2025-03-14T06:46:58.5644005Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:58.5645389Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:58.5646059Z 2025-03-14T06:46:58.5646869Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:58.5647722Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:58.5648069Z 2025-03-14T06:46:58.5648859Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:46:58.5649837Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:58.5650241Z 2025-03-14T06:46:59.5073628Z 2025-03-14T06:46:59.5074324Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:59.5075895Z def forward(self, L_labels_: "i64[1, 128][128, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:59.5077369Z l_labels_ = L_labels_ 2025-03-14T06:46:59.5077734Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:46:59.5078111Z l_input_ids_ = L_input_ids_ 2025-03-14T06:46:59.5078901Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:59.5080222Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:59.5081043Z 2025-03-14T06:46:59.5081908Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:59.5082970Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:46:59.5083475Z 2025-03-14T06:46:59.5084395Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:59.5086640Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:59.5088090Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:59.5088560Z 2025-03-14T06:46:59.5088884Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5089375Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:59.5089891Z 2025-03-14T06:46:59.5090665Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:59.5091726Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:59.5092314Z 2025-03-14T06:46:59.5093106Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:59.5095258Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:59.5096578Z 2025-03-14T06:46:59.5096847Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5097429Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:59.5097871Z 2025-03-14T06:46:59.5098699Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:59.5099897Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:59.5100427Z 2025-03-14T06:46:59.5101675Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:59.5103216Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:59.5103944Z 2025-03-14T06:46:59.5104770Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:59.5105766Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:59.5106149Z 2025-03-14T06:46:59.5107024Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:46:59.5108103Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:59.5108599Z 2025-03-14T06:46:59.5108732Z 
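Each of the encoder GraphModule dumps above (and the near-identical one that follows) stops at the `if dropout_probability < self.layerdrop:` check: the branch needs a concrete Python bool from a freshly sampled tensor, which Dynamo cannot trace through, so it ends the graph there and resumes tracing afterwards. That is consistent with the same encoder prefix being printed several times and with the summary lines reporting several graphs and graph breaks. A minimal sketch of how such breaks can be surfaced outside the harness, assuming torch._dynamo.explain is available in this PyTorch build and using a made-up toy function (layerdrop_like) rather than the real model:

    import torch

    def layerdrop_like(x: torch.Tensor) -> torch.Tensor:
        # Same pattern as the Pegasus layerdrop check traced above: a Python
        # `if` on a freshly sampled scalar tensor forces a graph break here.
        dropout_probability = torch.rand([])
        if dropout_probability < 0.5:
            return x * 0.0
        return x + 1.0

    # explain() compiles the function once and reports where and why graphs break.
    explanation = torch._dynamo.explain(layerdrop_like)(torch.randn(4))
    print(explanation.graph_break_count)
    for reason in explanation.break_reasons:
        print(reason)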
2025-03-14T06:46:59.5108864Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:59.5110281Z def forward(self, L_labels_: "i64[1, 128][128, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:59.5111636Z l_labels_ = L_labels_ 2025-03-14T06:46:59.5112015Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:46:59.5112368Z l_input_ids_ = L_input_ids_ 2025-03-14T06:46:59.5113162Z l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:59.5114567Z l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:59.5115350Z 2025-03-14T06:46:59.5116214Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:59.5117185Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:46:59.5117611Z 2025-03-14T06:46:59.5118510Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:59.5120226Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_model_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:59.5121611Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:59.5122136Z 2025-03-14T06:46:59.5122423Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5122896Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:59.5123326Z 2025-03-14T06:46:59.5124022Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:59.5125001Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:59.5125667Z 2025-03-14T06:46:59.5126379Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:59.5128017Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_model_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:59.5129225Z 2025-03-14T06:46:59.5129507Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5129987Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:59.5130426Z 2025-03-14T06:46:59.5131176Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:59.5132230Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:59.5132748Z 2025-03-14T06:46:59.5133683Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:59.5135073Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:59.5135752Z 2025-03-14T06:46:59.5136473Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:59.5137340Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:59.5137695Z 2025-03-14T06:46:59.5138491Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:46:59.5139476Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:59.5139889Z 2025-03-14T06:46:59.5931516Z 2025-03-14T06:46:59.5932078Z class GraphModule(torch.nn.Module): 2025-03-14T06:46:59.5933406Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_encoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_encoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:46:59.5934398Z l_input_ids_ = L_input_ids_ 2025-03-14T06:46:59.5935081Z l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = L_self_modules_encoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:46:59.5936076Z l_self_modules_encoder_modules_embed_positions_parameters_weight_ = L_self_modules_encoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:46:59.5936713Z 2025-03-14T06:46:59.5937815Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:739 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:46:59.5938805Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:46:59.5939233Z 2025-03-14T06:46:59.5940056Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:746 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:46:59.5941669Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_encoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_encoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:46:59.5942984Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:46:59.5943436Z 2025-03-14T06:46:59.5943711Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5944188Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:46:59.5944625Z 2025-03-14T06:46:59.5945319Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:46:59.5946303Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:46:59.5946816Z 2025-03-14T06:46:59.5947532Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:46:59.5949081Z embed_pos: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_encoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_encoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:46:59.5950125Z 2025-03-14T06:46:59.5950394Z # No stacktrace found for following nodes 2025-03-14T06:46:59.5950877Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:46:59.5951317Z 2025-03-14T06:46:59.5952063Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:750 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:46:59.5953107Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:46:59.5953633Z 2025-03-14T06:46:59.5954678Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:752 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:46:59.5956107Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:46:59.5956783Z 2025-03-14T06:46:59.5957506Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:775 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:46:59.5958358Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:46:59.5958709Z 2025-03-14T06:46:59.5959505Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:776 in forward, code: if dropout_probability < self.layerdrop: # skip the layer 2025-03-14T06:46:59.5960493Z lt: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt = None 2025-03-14T06:46:59.5960907Z 2025-03-14T06:47:01.8508737Z 2025-03-14T06:47:01.8509451Z class GraphModule(torch.nn.Module): 2025-03-14T06:47:01.8510952Z def forward(self, dict_getitem_L_stack0_list_dict_keys_L_stack0_0_: "f32[1, 128, 1024][131072, 1024, 1]cuda:0", L_decoder_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_decoder_modules_embed_positions_parameters_weight_: "f32[1024, 1024][1024, 1]cuda:0"): 2025-03-14T06:47:01.8512572Z dict_getitem_l_stack0_list_dict_keys_l_stack0_0_ = dict_getitem_L_stack0_list_dict_keys_L_stack0_0_ 2025-03-14T06:47:01.8513184Z l_decoder_input_ids_ = L_decoder_input_ids_ 2025-03-14T06:47:01.8513864Z l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:47:01.8515201Z l_self_modules_decoder_modules_embed_positions_parameters_weight_ = 
L_self_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:47:01.8515913Z 2025-03-14T06:47:01.8516786Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:976 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:47:01.8517884Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_decoder_input_ids_.view(-1, 128); l_decoder_input_ids_ = None 2025-03-14T06:47:01.8518432Z 2025-03-14T06:47:01.8519322Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:986 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:47:01.8521107Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_decoder_modules_embed_tokens_parameters_weight_, 0, None, 2.0, False, False); input_ids = l_self_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:47:01.8522402Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:47:01.8522925Z 2025-03-14T06:47:01.8523853Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:47:01.8525110Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:47:01.8525643Z 2025-03-14T06:47:01.8526483Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:47:01.8527570Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:47:01.8528062Z 2025-03-14T06:47:01.8528953Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:47:01.8529940Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:47:01.8530414Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:47:01.8530901Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:47:01.8531534Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:47:01.8532057Z 2025-03-14T06:47:01.8532767Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:47:01.8533695Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:47:01.8534094Z 2025-03-14T06:47:01.8535123Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:47:01.8536528Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:47:01.8537413Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:47:01.8538016Z 
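The run of ops ending in causal_4d_mask above is transformers' _make_causal_mask inlined into the decoder graph: an all-min (tgt_len, tgt_len) matrix whose lower triangle, diagonal included, is zeroed, then broadcast to (bsz, 1, tgt_len, tgt_len). Re-stating those traced steps as a standalone snippet, with the shapes from this log hard-coded for illustration:

    import torch

    bsz, tgt_len = 1, 128                        # shapes taken from the trace above
    dtype = torch.float32
    device = "cuda" if torch.cuda.is_available() else "cpu"

    mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device)
    mask_cond = torch.arange(mask.size(-1), device=device)
    # Zero every position at or below the diagonal; future positions keep the min fill.
    mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0)
    mask = mask.to(dtype)
    causal_4d_mask = mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len)

    assert causal_4d_mask.shape == (1, 1, 128, 128)
    assert causal_4d_mask[0, 0, 0, 1] == torch.finfo(dtype).min   # cannot attend ahead
    assert causal_4d_mask[0, 0, 1, 0] == 0                        # can attend behind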
2025-03-14T06:47:01.8538276Z # No stacktrace found for following nodes 2025-03-14T06:47:01.8538826Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:47:01.8539375Z 2025-03-14T06:47:01.8540154Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:105 in forward, code: positions = torch.arange( 2025-03-14T06:47:01.8541269Z positions: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:47:01.8541825Z 2025-03-14T06:47:01.8542622Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:108 in forward, code: return super().forward(positions) 2025-03-14T06:47:01.8544339Z positions_1: "f32[128, 1024][1024, 1]cuda:0" = torch.nn.functional.embedding(positions, l_self_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); positions = l_self_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:47:01.8545531Z 2025-03-14T06:47:01.8545877Z # No stacktrace found for following nodes 2025-03-14T06:47:01.8546394Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:47:01.8546874Z 2025-03-14T06:47:01.8547704Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1002 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:47:01.8548859Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + positions_1; inputs_embeds = positions_1 = None 2025-03-14T06:47:01.8549463Z 2025-03-14T06:47:01.8550470Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1004 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:47:01.8551944Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:47:01.8552712Z 2025-03-14T06:47:01.8553443Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1032 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:47:01.8554372Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:47:01.8554809Z 2025-03-14T06:47:01.8555551Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/pegasus/modeling_pegasus.py:1033 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:47:01.8556494Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:47:01.8556923Z 2025-03-14T06:47:23.2235047Z Compilation time (from dynamo_timed): 8.884028645 2025-03-14T06:47:23.2282310Z pass 2025-03-14T06:47:23.3560761Z TIMING: entire_frame_compile:6.64888 gc:0.00953 _recursive_pre_grad_passes:0.00707 _recursive_joint_graph_passes:0.40416 inductor_compile:4.40751 backend_compile:5.08358 async_compile.precompile:0.02013 async_compile.wait:0.78952 cudagraphify.get_container:0.19939 pad_mm_benchmark:0.04234 _recursive_post_grad_passes:0.2134 code_gen:2.60562 entire_backward_compile:2.23515 CUDAGraphNode.record:13.54318 total_wall_time:8.88403 2025-03-14T06:47:23.3562835Z STATS: call_* op count: 121 | FakeTensorMode.__torch_dispatch__:9058 | ProxyTorchDispatchMode.__torch_dispatch__:4270 | 
FakeTensor.__torch_dispatch__:1439 2025-03-14T06:47:23.3563666Z Dynamo produced 7 graphs covering 121 ops with 7 graph breaks (4 unique) 2025-03-14T06:47:28.8918587Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:47:28.8920305Z warnings.warn( 2025-03-14T06:47:31.2277635Z 2025-03-14T06:47:31.2308050Z loading model: 0it [00:00, ?it/s]If you want to use `RobertaLMHeadModel` as a standalone, add `is_decoder=True.` 2025-03-14T06:47:33.2732233Z We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked. 2025-03-14T06:47:33.2733569Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded. 2025-03-14T06:47:33.6518444Z 2025-03-14T06:47:33.6518760Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:47:33.6519195Z cuda train RobertaForCausalLM 2025-03-14T06:48:06.4981970Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:48:06.4982849Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:48:06.4983599Z pred = mod(**cloned_inputs) 2025-03-14T06:48:06.4984313Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 953, in forward 2025-03-14T06:48:06.4984982Z outputs = self.roberta( 2025-03-14T06:48:06.4985629Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 828, in forward 2025-03-14T06:48:06.4986310Z embedding_output = self.embeddings( 2025-03-14T06:48:06.4986984Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 125, in forward 2025-03-14T06:48:06.4987682Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:48:06.4987942Z 2025-03-14T06:48:06.6733475Z W0314 06:48:06.672000 28499 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:48:47.1023843Z Compilation time (from dynamo_timed): 66.647083351 2025-03-14T06:48:47.1048861Z pass 2025-03-14T06:48:47.1582387Z TIMING: entire_frame_compile:58.86482 gc:0.00304 _recursive_pre_grad_passes:0.03351 pad_mm_benchmark:0.20916 _recursive_joint_graph_passes:1.56897 _recursive_post_grad_passes:0.89781 async_compile.wait:2.0362 code_gen:17.26892 inductor_compile:31.79907 backend_compile:45.91727 cudagraphify.get_container:0.36407 entire_backward_compile:7.78226 CUDAGraphNode.record:1.67061 total_wall_time:66.64708 2025-03-14T06:48:47.1584934Z STATS: call_* op count: 1413 | FakeTensorMode.__torch_dispatch__:63145 | FakeTensor.__torch_dispatch__:14302 | ProxyTorchDispatchMode.__torch_dispatch__:28668 2025-03-14T06:48:47.1585853Z Dynamo produced 2 graphs covering 1413 ops with 5 graph breaks (4 unique) 2025-03-14T06:48:55.0287392Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 
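The "skipping cudagraphs due to deterministic index put" notice above, emitted for both Roberta runs, points at the word-embedding lookup: its backward scatters gradients with an index_put, and with deterministic algorithms apparently enabled for the accuracy run, inductor declines to capture that deterministic scatter in a CUDA graph, so the compiled model runs without cudagraphs. An illustrative, self-contained sketch of the setting involved (not the harness's own code; on a CUDA machine the compiled backward is where the skip would be reported):

    import torch

    # Deterministic mode is what turns the embedding-backward scatter into the
    # "deterministic index put" the log message refers to.
    torch.use_deterministic_algorithms(True)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    emb = torch.nn.Embedding(1000, 64).to(device)
    ids = torch.randint(0, 1000, (1, 128), device=device)

    compiled = torch.compile(emb, mode="reduce-overhead")   # cudagraphs-enabled mode
    compiled(ids).sum().backward()

    torch.use_deterministic_algorithms(False)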
2025-03-14T06:48:55.0289151Z warnings.warn( 2025-03-14T06:48:55.6022722Z 2025-03-14T06:48:57.2302473Z loading model: 0it [00:00, ?it/s]We strongly recommend passing in an `attention_mask` since your input_ids may be padded. See https://huggingface.co/docs/transformers/troubleshooting#incorrect-output-when-padding-tokens-arent-masked. 2025-03-14T06:48:57.2303900Z You may ignore this warning if your `pad_token_id` (0) is identical to the `bos_token_id` (0), `eos_token_id` (2), or the `sep_token_id` (None), and your input is not padded. 2025-03-14T06:48:57.6336619Z 2025-03-14T06:48:57.6337029Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:48:57.6337442Z cuda train RobertaForQuestionAnswering 2025-03-14T06:49:29.5400749Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:49:29.5402036Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:49:29.5403702Z pred = mod(**cloned_inputs) 2025-03-14T06:49:29.5404688Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 1500, in forward 2025-03-14T06:49:29.5405718Z outputs = self.roberta( 2025-03-14T06:49:29.5406667Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 828, in forward 2025-03-14T06:49:29.5407692Z embedding_output = self.embeddings( 2025-03-14T06:49:29.5408695Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/roberta/modeling_roberta.py", line 125, in forward 2025-03-14T06:49:29.5409759Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:49:29.5410139Z 2025-03-14T06:49:29.7077497Z W0314 06:49:29.706000 28867 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:50:08.1649585Z Compilation time (from dynamo_timed): 64.230836111 2025-03-14T06:50:08.1673863Z pass 2025-03-14T06:50:08.2338495Z TIMING: entire_frame_compile:56.66105 gc:0.00379 _recursive_pre_grad_passes:0.03556 pad_mm_benchmark:0.20649 _recursive_joint_graph_passes:1.55029 _recursive_post_grad_passes:0.90958 async_compile.wait:0.22191 code_gen:15.11713 inductor_compile:29.54371 backend_compile:43.74399 cudagraphify.get_container:0.37658 entire_backward_compile:7.56979 CUDAGraphNode.record:1.69021 total_wall_time:64.23084 2025-03-14T06:50:08.2340421Z STATS: call_* op count: 1400 | FakeTensorMode.__torch_dispatch__:62515 | FakeTensor.__torch_dispatch__:14135 | ProxyTorchDispatchMode.__torch_dispatch__:28411 2025-03-14T06:50:08.2341309Z Dynamo produced 2 graphs covering 1400 ops with 5 graph breaks (4 unique) 2025-03-14T06:50:16.1039216Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:50:16.1040621Z warnings.warn( 2025-03-14T06:50:16.3866867Z 2025-03-14T06:50:17.1469473Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:50:17.1469851Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:50:17.1470229Z cuda train Speech2Text2ForCausalLM 2025-03-14T06:50:17.1634363Z WARNING:common:fp64 golden ref were not generated for Speech2Text2ForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:50:18.7201904Z 2025-03-14T06:50:18.7202735Z class GraphModule(torch.nn.Module): 2025-03-14T06:50:18.7204105Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[10000, 256][256, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_: "f32[1026, 256][256, 1]cuda:0"): 2025-03-14T06:50:18.7205628Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:50:18.7206608Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:50:18.7208516Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ 2025-03-14T06:50:18.7209268Z 2025-03-14T06:50:18.7209539Z # No stacktrace found for following nodes 2025-03-14T06:50:18.7210129Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:50:18.7210681Z 2025-03-14T06:50:18.7211515Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:559 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:50:18.7212611Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:50:18.7213354Z 2025-03-14T06:50:18.7214212Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:569 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:50:18.7215899Z embedding: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:50:18.7217113Z inputs_embeds: "f32[1, 128, 256][32768, 256, 1]cuda:0" = embedding * 16.0; embedding = None 2025-03-14T06:50:18.7217560Z 2025-03-14T06:50:18.7218383Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:50:18.7219497Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:50:18.7220009Z 2025-03-14T06:50:18.7220763Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:50:18.7221731Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:50:18.7222173Z 2025-03-14T06:50:18.7222981Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:50:18.7223881Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:50:18.7224285Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:50:18.7224759Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < 
view_1; mask_cond = view_1 = None 2025-03-14T06:50:18.7225323Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:50:18.7225776Z 2025-03-14T06:50:18.7226431Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:50:18.7227258Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:50:18.7227652Z 2025-03-14T06:50:18.7228488Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:50:18.7229790Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:50:18.7230863Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:50:18.7231491Z 2025-03-14T06:50:18.7231760Z # No stacktrace found for following nodes 2025-03-14T06:50:18.7232241Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:50:18.7232685Z 2025-03-14T06:50:18.7233559Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:112 in create_position_ids_from_input_ids, code: mask = input_ids.ne(padding_idx).int() 2025-03-14T06:50:18.7234664Z ne: "b8[1, 128][128, 1]cuda:0" = input_ids.ne(1); input_ids = None 2025-03-14T06:50:18.7235103Z mask_2: "i32[1, 128][128, 1]cuda:0" = ne.int(); ne = None 2025-03-14T06:50:18.7235537Z 2025-03-14T06:50:18.7236587Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:113 in create_position_ids_from_input_ids, code: incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-03-14T06:50:18.7237776Z cumsum: "i64[1, 128][128, 1]cuda:0" = torch.cumsum(mask_2, dim = 1) 2025-03-14T06:50:18.7238263Z type_as: "i32[1, 128][128, 1]cuda:0" = cumsum.type_as(mask_2); cumsum = None 2025-03-14T06:50:18.7238741Z add_1: "i32[1, 128][128, 1]cuda:0" = type_as + 0; type_as = None 2025-03-14T06:50:18.7239257Z incremental_indices: "i32[1, 128][128, 1]cuda:0" = add_1 * mask_2; add_1 = mask_2 = None 2025-03-14T06:50:18.7239696Z 2025-03-14T06:50:18.7240589Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:114 in create_position_ids_from_input_ids, code: return incremental_indices.long() + padding_idx 2025-03-14T06:50:18.7241703Z long: "i64[1, 128][128, 1]cuda:0" = incremental_indices.long(); incremental_indices = None 2025-03-14T06:50:18.7242214Z add_2: "i64[1, 128][128, 1]cuda:0" = long + 1; long = None 2025-03-14T06:50:18.7242567Z 2025-03-14T06:50:18.7243572Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:89 in forward, code: position_ids = self.create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length).to( 2025-03-14T06:50:18.7244802Z position_ids: "i64[1, 128][128, 1]cuda:0" = add_2.to(device(type='cuda', index=0)); add_2 = None 2025-03-14T06:50:18.7245250Z 2025-03-14T06:50:18.7246185Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:98 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, -1).detach() 2025-03-14T06:50:18.7247296Z view_2: "i64[128][1]cuda:0" = position_ids.view(-1); position_ids = None 2025-03-14T06:50:18.7248355Z index_select: "f32[128, 256][256, 1]cuda:0" = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_.index_select(0, view_2); l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = view_2 = None 2025-03-14T06:50:18.7249459Z view_3: "f32[1, 128, 256][32768, 256, 1]cuda:0" = index_select.view(1, 128, -1); index_select = None 2025-03-14T06:50:18.7250041Z positions: "f32[1, 128, 256][32768, 256, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:50:18.7250458Z 2025-03-14T06:50:18.7250725Z # No stacktrace found for following nodes 2025-03-14T06:50:18.7251207Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:50:18.7251647Z 2025-03-14T06:50:18.7252434Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:585 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:50:18.7253600Z hidden_states: "f32[1, 128, 256][32768, 256, 1]cuda:0" = inputs_embeds + positions; inputs_embeds = positions = None 2025-03-14T06:50:18.7254118Z 2025-03-14T06:50:18.7255076Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:586 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:50:18.7256473Z hidden_states_1: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:50:18.7257140Z 2025-03-14T06:50:18.7257901Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:614 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:50:18.7258876Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:50:18.7259228Z 2025-03-14T06:50:18.7260008Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:615 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:50:18.7260983Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:50:18.7261408Z 2025-03-14T06:50:18.7261559Z 2025-03-14T06:50:18.7261696Z class GraphModule(torch.nn.Module): 2025-03-14T06:50:18.7262794Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[10000, 256][256, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_: "f32[1026, 256][256, 1]cuda:0"): 2025-03-14T06:50:18.7263945Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:50:18.7264737Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:50:18.7265869Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ 
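Unlike the learned Pegasus positions earlier in this log, Speech2Text2 derives its position ids from the token ids themselves, as the trace above shows: padding tokens (padding_idx 1 here) are masked out, a cumulative sum numbers the remaining tokens starting at padding_idx + 1, and the resulting ids index a fixed sinusoidal table through index_select. A standalone restatement of the traced create_position_ids_from_input_ids logic, with a tiny hand-made batch so the numbering is visible:

    import torch

    def create_position_ids_from_input_ids(input_ids, padding_idx, past_key_values_length=0):
        # Mirrors the traced ops: real tokens count up from padding_idx + 1,
        # padding positions stay pinned at padding_idx.
        mask = input_ids.ne(padding_idx).int()
        incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
        return incremental_indices.long() + padding_idx

    ids = torch.tensor([[5, 7, 9, 1, 1]])        # 1 plays the role of padding_idx
    print(create_position_ids_from_input_ids(ids, padding_idx=1))
    # tensor([[2, 3, 4, 1, 1]])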
2025-03-14T06:50:18.7266596Z 2025-03-14T06:50:18.7266864Z # No stacktrace found for following nodes 2025-03-14T06:50:18.7267450Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:50:18.7268406Z 2025-03-14T06:50:18.7269216Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:559 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:50:18.7270310Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:50:18.7270818Z 2025-03-14T06:50:18.7271669Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:569 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:50:18.7273353Z embedding: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:50:18.7274658Z inputs_embeds: "f32[1, 128, 256][32768, 256, 1]cuda:0" = embedding * 16.0; embedding = None 2025-03-14T06:50:18.7275106Z 2025-03-14T06:50:18.7275934Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:50:18.7277217Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:50:18.7277736Z 2025-03-14T06:50:18.7278493Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:50:18.7279462Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:50:18.7279909Z 2025-03-14T06:50:18.7280712Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:50:18.7281726Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:50:18.7282124Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:50:18.7282603Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:50:18.7283172Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:50:18.7283626Z 2025-03-14T06:50:18.7284285Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:50:18.7285111Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:50:18.7285512Z 2025-03-14T06:50:18.7286347Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:50:18.7287525Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, 
None))]; mask_1 = None 2025-03-14T06:50:18.7288335Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:50:18.7288864Z 2025-03-14T06:50:18.7289129Z # No stacktrace found for following nodes 2025-03-14T06:50:18.7289600Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:50:18.7290034Z 2025-03-14T06:50:18.7290897Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:112 in create_position_ids_from_input_ids, code: mask = input_ids.ne(padding_idx).int() 2025-03-14T06:50:18.7291891Z ne: "b8[1, 128][128, 1]cuda:0" = input_ids.ne(1); input_ids = None 2025-03-14T06:50:18.7292332Z mask_2: "i32[1, 128][128, 1]cuda:0" = ne.int(); ne = None 2025-03-14T06:50:18.7292688Z 2025-03-14T06:50:18.7293737Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:113 in create_position_ids_from_input_ids, code: incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-03-14T06:50:18.7294925Z cumsum: "i64[1, 128][128, 1]cuda:0" = torch.cumsum(mask_2, dim = 1) 2025-03-14T06:50:18.7295409Z type_as: "i32[1, 128][128, 1]cuda:0" = cumsum.type_as(mask_2); cumsum = None 2025-03-14T06:50:18.7295883Z add_1: "i32[1, 128][128, 1]cuda:0" = type_as + 0; type_as = None 2025-03-14T06:50:18.7296398Z incremental_indices: "i32[1, 128][128, 1]cuda:0" = add_1 * mask_2; add_1 = mask_2 = None 2025-03-14T06:50:18.7296828Z 2025-03-14T06:50:18.7297720Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:114 in create_position_ids_from_input_ids, code: return incremental_indices.long() + padding_idx 2025-03-14T06:50:18.7298920Z long: "i64[1, 128][128, 1]cuda:0" = incremental_indices.long(); incremental_indices = None 2025-03-14T06:50:18.7299429Z add_2: "i64[1, 128][128, 1]cuda:0" = long + 1; long = None 2025-03-14T06:50:18.7299780Z 2025-03-14T06:50:18.7300788Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:89 in forward, code: position_ids = self.create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length).to( 2025-03-14T06:50:18.7302016Z position_ids: "i64[1, 128][128, 1]cuda:0" = add_2.to(device(type='cuda', index=0)); add_2 = None 2025-03-14T06:50:18.7302466Z 2025-03-14T06:50:18.7303454Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:98 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, -1).detach() 2025-03-14T06:50:18.7304664Z view_2: "i64[128][1]cuda:0" = position_ids.view(-1); position_ids = None 2025-03-14T06:50:18.7305729Z index_select: "f32[128, 256][256, 1]cuda:0" = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_.index_select(0, view_2); l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = view_2 = None 2025-03-14T06:50:18.7306835Z view_3: "f32[1, 128, 256][32768, 256, 1]cuda:0" = index_select.view(1, 128, -1); index_select = None 2025-03-14T06:50:18.7307413Z positions: "f32[1, 128, 256][32768, 256, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:50:18.7307824Z 2025-03-14T06:50:18.7308089Z # No stacktrace found for following nodes 
2025-03-14T06:50:18.7308575Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:50:18.7309018Z 2025-03-14T06:50:18.7309806Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:585 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:50:18.7310876Z hidden_states: "f32[1, 128, 256][32768, 256, 1]cuda:0" = inputs_embeds + positions; inputs_embeds = positions = None 2025-03-14T06:50:18.7311387Z 2025-03-14T06:50:18.7312347Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:586 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:50:18.7313793Z hidden_states_1: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:50:18.7314513Z 2025-03-14T06:50:18.7315271Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:614 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:50:18.7316167Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:50:18.7316519Z 2025-03-14T06:50:18.7317291Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:615 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:50:18.7318260Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:50:18.7318689Z 2025-03-14T06:50:19.6704880Z 2025-03-14T06:50:19.6705430Z class GraphModule(torch.nn.Module): 2025-03-14T06:50:19.6706980Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[10000, 256][256, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weights_: "f32[1026, 256][256, 1]cuda:0"): 2025-03-14T06:50:19.6708093Z l_input_ids_ = L_input_ids_ 2025-03-14T06:50:19.6709793Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:50:19.6710956Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ 2025-03-14T06:50:19.6711691Z 2025-03-14T06:50:19.6712520Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:559 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:50:19.6713518Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:50:19.6714088Z 2025-03-14T06:50:19.6715054Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:569 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:50:19.6716756Z embedding: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:50:19.6717974Z 
inputs_embeds: "f32[1, 128, 256][32768, 256, 1]cuda:0" = embedding * 16.0; embedding = None 2025-03-14T06:50:19.6718419Z 2025-03-14T06:50:19.6719242Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:50:19.6720353Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:50:19.6720858Z 2025-03-14T06:50:19.6721609Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:50:19.6722569Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:50:19.6723008Z 2025-03-14T06:50:19.6723804Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:50:19.6724700Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:50:19.6725093Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:50:19.6725569Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:50:19.6726128Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:50:19.6726579Z 2025-03-14T06:50:19.6727236Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:50:19.6728058Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:50:19.6728451Z 2025-03-14T06:50:19.6729289Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:50:19.6730458Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:50:19.6731268Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:50:19.6731795Z 2025-03-14T06:50:19.6732154Z # No stacktrace found for following nodes 2025-03-14T06:50:19.6732628Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:50:19.6733063Z 2025-03-14T06:50:19.6733974Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:112 in create_position_ids_from_input_ids, code: mask = input_ids.ne(padding_idx).int() 2025-03-14T06:50:19.6734972Z ne: "b8[1, 128][128, 1]cuda:0" = input_ids.ne(1); input_ids = None 2025-03-14T06:50:19.6735410Z mask_2: "i32[1, 128][128, 1]cuda:0" = ne.int(); ne = None 2025-03-14T06:50:19.6735758Z 2025-03-14T06:50:19.6736876Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:113 in create_position_ids_from_input_ids, code: incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-03-14T06:50:19.6738077Z cumsum: "i64[1, 128][128, 1]cuda:0" = 
torch.cumsum(mask_2, dim = 1) 2025-03-14T06:50:19.6738561Z type_as: "i32[1, 128][128, 1]cuda:0" = cumsum.type_as(mask_2); cumsum = None 2025-03-14T06:50:19.6739032Z add_1: "i32[1, 128][128, 1]cuda:0" = type_as + 0; type_as = None 2025-03-14T06:50:19.6739549Z incremental_indices: "i32[1, 128][128, 1]cuda:0" = add_1 * mask_2; add_1 = mask_2 = None 2025-03-14T06:50:19.6739986Z 2025-03-14T06:50:19.6740877Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:114 in create_position_ids_from_input_ids, code: return incremental_indices.long() + padding_idx 2025-03-14T06:50:19.6741980Z long: "i64[1, 128][128, 1]cuda:0" = incremental_indices.long(); incremental_indices = None 2025-03-14T06:50:19.6742483Z add_2: "i64[1, 128][128, 1]cuda:0" = long + 1; long = None 2025-03-14T06:50:19.6742825Z 2025-03-14T06:50:19.6743890Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:89 in forward, code: position_ids = self.create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length).to( 2025-03-14T06:50:19.6745124Z position_ids: "i64[1, 128][128, 1]cuda:0" = add_2.to(device(type='cuda', index=0)); add_2 = None 2025-03-14T06:50:19.6745568Z 2025-03-14T06:50:19.6746500Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:98 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, -1).detach() 2025-03-14T06:50:19.6747615Z view_2: "i64[128][1]cuda:0" = position_ids.view(-1); position_ids = None 2025-03-14T06:50:19.6748685Z index_select: "f32[128, 256][256, 1]cuda:0" = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weights_.index_select(0, view_2); l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weights_ = view_2 = None 2025-03-14T06:50:19.6749790Z view_3: "f32[1, 128, 256][32768, 256, 1]cuda:0" = index_select.view(1, 128, -1); index_select = None 2025-03-14T06:50:19.6750371Z positions: "f32[1, 128, 256][32768, 256, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:50:19.6750784Z 2025-03-14T06:50:19.6751039Z # No stacktrace found for following nodes 2025-03-14T06:50:19.6751519Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:50:19.6751958Z 2025-03-14T06:50:19.6752737Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:585 in forward, code: hidden_states = inputs_embeds + positions 2025-03-14T06:50:19.6753859Z hidden_states: "f32[1, 128, 256][32768, 256, 1]cuda:0" = inputs_embeds + positions; inputs_embeds = positions = None 2025-03-14T06:50:19.6754424Z 2025-03-14T06:50:19.6755466Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:586 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:50:19.6756866Z hidden_states_1: "f32[1, 128, 256][32768, 256, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:50:19.6757529Z 2025-03-14T06:50:19.6758286Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:614 in forward, code: dropout_probability = torch.rand([]) 
2025-03-14T06:50:19.6759255Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:50:19.6759608Z 2025-03-14T06:50:19.6760389Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py:615 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:50:19.6761363Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:50:19.6761784Z 2025-03-14T06:50:27.5110836Z Compilation time (from dynamo_timed): 5.542676225999999 2025-03-14T06:50:27.5118877Z pass 2025-03-14T06:50:27.5171619Z TIMING: entire_frame_compile:4.39023 gc:0.00655 _recursive_pre_grad_passes:0.00672 _recursive_joint_graph_passes:0.24744 inductor_compile:2.63506 backend_compile:3.34032 async_compile.precompile:0.12513 async_compile.wait:0.70401 cudagraphify.get_container:0.16122 _recursive_post_grad_passes:0.11279 code_gen:1.73141 pad_mm_benchmark:0.01577 entire_backward_compile:1.15245 CUDAGraphNode.record:3.09564 total_wall_time:5.54268 2025-03-14T06:50:27.5173652Z STATS: call_* op count: 68 | FakeTensorMode.__torch_dispatch__:4442 | ProxyTorchDispatchMode.__torch_dispatch__:1886 | FakeTensor.__torch_dispatch__:636 2025-03-14T06:50:27.5174477Z Dynamo produced 6 graphs covering 68 ops with 6 graph breaks (5 unique) 2025-03-14T06:50:32.8635202Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:50:32.8636375Z warnings.warn( 2025-03-14T06:50:33.4187791Z 2025-03-14T06:50:34.9683010Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:50:34.9683384Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:50:34.9683734Z cuda train T5ForConditionalGeneration 2025-03-14T06:51:11.3675879Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:51:11.3676795Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:51:11.3677511Z pred = mod(**cloned_inputs) 2025-03-14T06:51:11.3678151Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/t5/modeling_t5.py", line 1706, in forward 2025-03-14T06:51:11.3678796Z encoder_outputs = self.encoder( 2025-03-14T06:51:11.3679420Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/t5/modeling_t5.py", line 1016, in forward 2025-03-14T06:51:11.3680078Z inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:51:11.3680330Z 2025-03-14T06:51:11.4978953Z W0314 06:51:11.497000 29420 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:51:41.5220895Z Compilation time (from dynamo_timed): 59.108583371 2025-03-14T06:51:41.5245368Z pass 2025-03-14T06:51:41.5907402Z TIMING: entire_frame_compile:51.25928 gc:0.00422 _recursive_pre_grad_passes:0.03042 pad_mm_benchmark:0.2337 _recursive_joint_graph_passes:1.76385 _recursive_post_grad_passes:0.81951 async_compile.wait:3.01927 code_gen:14.5391 inductor_compile:27.37089 backend_compile:39.45162 cudagraphify.get_container:0.39559 entire_backward_compile:7.84931 CUDAGraphNode.record:1.45777 total_wall_time:59.10858 2025-03-14T06:51:41.5909414Z STATS: call_* op count: 1487 | FakeTensorMode.__torch_dispatch__:68032 | ProxyTorchDispatchMode.__torch_dispatch__:31692 | FakeTensor.__torch_dispatch__:11585 2025-03-14T06:51:41.5910257Z Dynamo produced 2 graphs covering 1487 ops with 5 graph breaks (4 unique) 2025-03-14T06:51:49.0304491Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:51:49.0306212Z warnings.warn( 2025-03-14T06:51:49.2897432Z 2025-03-14T06:51:50.8302610Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:51:50.8303113Z loading model: 0it [00:01, ?it/s] 2025-03-14T06:51:50.8303563Z cuda train T5Small 2025-03-14T06:52:02.1661964Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:52:02.1662816Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:52:02.1663517Z pred = mod(**cloned_inputs) 2025-03-14T06:52:02.1664186Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/t5/modeling_t5.py", line 1706, in forward 2025-03-14T06:52:02.1664819Z encoder_outputs = self.encoder( 2025-03-14T06:52:02.1665431Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/t5/modeling_t5.py", line 1016, in forward 2025-03-14T06:52:02.1666090Z inputs_embeds = self.embed_tokens(input_ids) 2025-03-14T06:52:02.1666334Z 2025-03-14T06:52:03.4064294Z W0314 06:52:03.405000 29931 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:52:15.3167556Z Compilation time (from dynamo_timed): 18.129403498 2025-03-14T06:52:15.3191145Z pass 2025-03-14T06:52:15.3802639Z TIMING: entire_frame_compile:18.1294 gc:0.0034 _recursive_pre_grad_passes:0.03019 async_compile.wait:0.49684 backend_compile:6.80287 cudagraphify.get_container:0.21225 CUDAGraphNode.record:0.58024 total_wall_time:18.1294 2025-03-14T06:52:15.3804023Z STATS: call_* op count: 1487 | FakeTensorMode.__torch_dispatch__:10165 | FakeTensor.__torch_dispatch__:1576 2025-03-14T06:52:15.3804698Z Dynamo produced 2 graphs covering 1487 ops with 5 graph breaks (4 unique) 2025-03-14T06:52:20.8541420Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:52:20.8542651Z warnings.warn( 2025-03-14T06:52:21.2534057Z 2025-03-14T06:52:25.2093151Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:52:25.2093560Z loading model: 0it [00:03, ?it/s] 2025-03-14T06:52:25.2093900Z cuda train TrOCRForCausalLM 2025-03-14T06:52:25.2402369Z WARNING:common:fp64 golden ref were not generated for TrOCRForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:52:27.0754479Z 2025-03-14T06:52:27.0755121Z class GraphModule(torch.nn.Module): 2025-03-14T06:52:27.0756930Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 256][256, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[514, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:52:27.0758782Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:52:27.0759926Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:27.0761063Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:52:27.0762235Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:52:27.0763419Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:52:27.0764303Z 2025-03-14T06:52:27.0764573Z # No stacktrace found for following nodes 2025-03-14T06:52:27.0765155Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:52:27.0765703Z 2025-03-14T06:52:27.0766470Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:578 in forward, code: input_ids = input_ids.view(-1, input.shape[-1]) 2025-03-14T06:52:27.0767501Z input_ids: "i64[1, 256][256, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 256); l_cloned_inputs_input_ids_ = None 2025-03-14T06:52:27.0768395Z 2025-03-14T06:52:27.0769191Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:589 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:27.0770926Z embedding: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:27.0772205Z inputs_embeds: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:52:27.0772652Z 2025-03-14T06:52:27.0773320Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:62 in forward, code: positions = torch.arange( 2025-03-14T06:52:27.0774254Z arange: "i64[256][1]cuda:0" = torch.arange(0, 256, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0774742Z 2025-03-14T06:52:27.0775375Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:64 in forward, code: ).expand(bsz, -1) 2025-03-14T06:52:27.0776197Z positions: "i64[1, 256][256, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:52:27.0776607Z 2025-03-14T06:52:27.0777350Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:66 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:52:27.0778234Z add: "i64[1, 256][256, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:52:27.0779410Z embed_pos: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:52:27.0780489Z 2025-03-14T06:52:27.0781201Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:596 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:52:27.0782210Z hidden_states: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:52:27.0782724Z 2025-03-14T06:52:27.0783627Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:599 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:52:27.0785894Z hidden_states_1: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:52:27.0787597Z 2025-03-14T06:52:27.0788617Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:601 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:52:27.0789983Z hidden_states_2: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:52:27.0790668Z 2025-03-14T06:52:27.0791527Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:27.0792626Z mask: "f32[256, 256][256, 1]cuda:0" = torch.full((256, 256), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0793132Z 2025-03-14T06:52:27.0793878Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:27.0794908Z mask_cond: "i64[256][1]cuda:0" = torch.arange(256, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0795345Z 2025-03-14T06:52:27.0796144Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:27.0797041Z add_2: "i64[256][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:27.0797441Z view_1: "i64[256, 1][1, 1]cuda:0" = add_2.view(256, 1); add_2 = None 2025-03-14T06:52:27.0797917Z lt: "b8[256, 256][256, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 
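# The string annotation on each node, e.g. "f32[1, 256, 1024][262144, 1024, 1]cuda:0",
# appears to read as dtype[sizes][strides]device. A short check against a real tensor
# (assumes a CUDA device is available, as in this run; use device="cpu" otherwise):
import torch

t = torch.empty(1, 256, 1024, device="cuda")
# Prints: torch.float32 (1, 256, 1024) (262144, 1024, 1) cuda:0
print(t.dtype, tuple(t.shape), t.stride(), str(t.device))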
2025-03-14T06:52:27.0798471Z masked_fill_: "f32[256, 256][256, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:27.0798924Z 2025-03-14T06:52:27.0799575Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:27.0800391Z mask_1: "f32[256, 256][256, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:27.0800791Z 2025-03-14T06:52:27.0801677Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:27.0802840Z getitem: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:27.0803642Z causal_4d_mask: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = getitem.expand(1, 1, 256, 256); getitem = causal_4d_mask = None 2025-03-14T06:52:27.0804167Z 2025-03-14T06:52:27.0804870Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:642 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:27.0805699Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:27.0806046Z 2025-03-14T06:52:27.0806844Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:643 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:27.0807757Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:27.0808175Z 2025-03-14T06:52:27.0808316Z 2025-03-14T06:52:27.0808448Z class GraphModule(torch.nn.Module): 2025-03-14T06:52:27.0810187Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 256][256, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[514, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:52:27.0812055Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:52:27.0812829Z l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:27.0813949Z l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:52:27.0815122Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:52:27.0816306Z l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:52:27.0817031Z 2025-03-14T06:52:27.0817288Z # No stacktrace found for following nodes 2025-03-14T06:52:27.0817871Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:52:27.0818411Z 2025-03-14T06:52:27.0819151Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:578 in forward, code: input_ids = input_ids.view(-1, input.shape[-1]) 2025-03-14T06:52:27.0820172Z input_ids: "i64[1, 256][256, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 256); l_cloned_inputs_input_ids_ = None 2025-03-14T06:52:27.0820674Z 2025-03-14T06:52:27.0821508Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:589 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:27.0823196Z embedding: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:27.0824450Z inputs_embeds: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:52:27.0824898Z 2025-03-14T06:52:27.0825557Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:62 in forward, code: positions = torch.arange( 2025-03-14T06:52:27.0826491Z arange: "i64[256][1]cuda:0" = torch.arange(0, 256, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0826983Z 2025-03-14T06:52:27.0827616Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:64 in forward, code: ).expand(bsz, -1) 2025-03-14T06:52:27.0828437Z positions: "i64[1, 256][256, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:52:27.0828845Z 2025-03-14T06:52:27.0829699Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:66 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:52:27.0830585Z add: "i64[1, 256][256, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:52:27.0831820Z embed_pos: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_mod_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:52:27.0832909Z 2025-03-14T06:52:27.0833624Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:596 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:52:27.0834788Z hidden_states: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:52:27.0835310Z 2025-03-14T06:52:27.0836074Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:599 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:52:27.0838342Z hidden_states_1: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_mod_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:52:27.0840058Z 2025-03-14T06:52:27.0841014Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:601 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:52:27.0842394Z hidden_states_2: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:52:27.0843081Z 2025-03-14T06:52:27.0843895Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:27.0845000Z mask: "f32[256, 256][256, 1]cuda:0" = torch.full((256, 256), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0845509Z 2025-03-14T06:52:27.0846257Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:27.0847223Z mask_cond: "i64[256][1]cuda:0" = torch.arange(256, device = device(type='cuda', index=0)) 2025-03-14T06:52:27.0847661Z 2025-03-14T06:52:27.0848456Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:27.0849356Z add_2: "i64[256][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:27.0849764Z view_1: "i64[256, 1][1, 1]cuda:0" = add_2.view(256, 1); add_2 = None 2025-03-14T06:52:27.0850237Z lt: "b8[256, 256][256, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:52:27.0850801Z masked_fill_: "f32[256, 256][256, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:27.0851316Z 2025-03-14T06:52:27.0851967Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:27.0852877Z mask_1: "f32[256, 256][256, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:27.0853270Z 2025-03-14T06:52:27.0854096Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:27.0855263Z getitem: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:27.0856057Z causal_4d_mask: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = getitem.expand(1, 1, 256, 256); getitem = causal_4d_mask = None 2025-03-14T06:52:27.0856656Z 2025-03-14T06:52:27.0857354Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:642 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:27.0858187Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:27.0858532Z 2025-03-14T06:52:27.0859244Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:643 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:27.0860151Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:27.0860572Z 2025-03-14T06:52:28.0352607Z 2025-03-14T06:52:28.0353190Z class 
GraphModule(torch.nn.Module): 2025-03-14T06:52:28.0355071Z def forward(self, L_input_ids_: "i64[1, 256][256, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_: "f32[50265, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_: "f32[514, 1024][1024, 1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_: "f32[1024][1]cuda:0", L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_: "f32[1024][1]cuda:0"): 2025-03-14T06:52:28.0356792Z l_input_ids_ = L_input_ids_ 2025-03-14T06:52:28.0357507Z l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:28.0358641Z l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = L_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ 2025-03-14T06:52:28.0359829Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ 2025-03-14T06:52:28.0361032Z l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = L_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ 2025-03-14T06:52:28.0361764Z 2025-03-14T06:52:28.0362517Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:578 in forward, code: input_ids = input_ids.view(-1, input.shape[-1]) 2025-03-14T06:52:28.0363467Z input_ids: "i64[1, 256][256, 1]cuda:0" = l_input_ids_.view(-1, 256); l_input_ids_ = None 2025-03-14T06:52:28.0363886Z 2025-03-14T06:52:28.0364671Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:589 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:28.0366344Z embedding: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_decoder_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:28.0367612Z inputs_embeds: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = embedding * 1.0; embedding = None 2025-03-14T06:52:28.0368736Z 2025-03-14T06:52:28.0369402Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:62 in forward, code: positions = torch.arange( 2025-03-14T06:52:28.0370332Z arange: "i64[256][1]cuda:0" = torch.arange(0, 256, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:28.0370824Z 2025-03-14T06:52:28.0371502Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:64 in forward, code: ).expand(bsz, -1) 2025-03-14T06:52:28.0372313Z positions: "i64[1, 256][256, 1]cuda:0" = arange.expand(1, -1); arange = None 2025-03-14T06:52:28.0372853Z 2025-03-14T06:52:28.0373578Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:66 in forward, code: return super().forward(positions + self.offset) 2025-03-14T06:52:28.0374459Z add: "i64[1, 256][256, 1]cuda:0" = positions + 2; positions = None 2025-03-14T06:52:28.0375646Z embed_pos: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.embedding(add, 
l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_, None, None, 2.0, False, False); add = l_self_modules_model_modules_decoder_modules_embed_positions_parameters_weight_ = None 2025-03-14T06:52:28.0376734Z 2025-03-14T06:52:28.0377445Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:596 in forward, code: hidden_states = inputs_embeds + embed_pos 2025-03-14T06:52:28.0378454Z hidden_states: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = inputs_embeds + embed_pos; inputs_embeds = embed_pos = None 2025-03-14T06:52:28.0378972Z 2025-03-14T06:52:28.0379732Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:599 in forward, code: hidden_states = self.layernorm_embedding(hidden_states) 2025-03-14T06:52:28.0382013Z hidden_states_1: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.layer_norm(hidden_states, (1024,), l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_, l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_, 1e-05); hidden_states = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_weight_ = l_self_modules_model_modules_decoder_modules_layernorm_embedding_parameters_bias_ = None 2025-03-14T06:52:28.0383736Z 2025-03-14T06:52:28.0384629Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:601 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=self.dropout, training=self.training) 2025-03-14T06:52:28.0385997Z hidden_states_2: "f32[1, 256, 1024][262144, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states_1, p = 0.1, training = True); hidden_states_1 = hidden_states_2 = None 2025-03-14T06:52:28.0386676Z 2025-03-14T06:52:28.0387484Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:28.0388580Z mask: "f32[256, 256][256, 1]cuda:0" = torch.full((256, 256), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:28.0389079Z 2025-03-14T06:52:28.0389816Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:28.0390772Z mask_cond: "i64[256][1]cuda:0" = torch.arange(256, device = device(type='cuda', index=0)) 2025-03-14T06:52:28.0391204Z 2025-03-14T06:52:28.0392078Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:28.0392970Z add_2: "i64[256][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:28.0393367Z view_1: "i64[256, 1][1, 1]cuda:0" = add_2.view(256, 1); add_2 = None 2025-03-14T06:52:28.0393833Z lt: "b8[256, 256][256, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:52:28.0394452Z masked_fill_: "f32[256, 256][256, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:28.0394894Z 2025-03-14T06:52:28.0395539Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:28.0396434Z mask_1: "f32[256, 256][256, 
1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:28.0396820Z 2025-03-14T06:52:28.0397646Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:28.0398805Z getitem: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:28.0399601Z causal_4d_mask: "f32[1, 1, 256, 256][65536, 65536, 256, 1]cuda:0" = getitem.expand(1, 1, 256, 256); getitem = causal_4d_mask = None 2025-03-14T06:52:28.0400120Z 2025-03-14T06:52:28.0400810Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:642 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:28.0401638Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:28.0401977Z 2025-03-14T06:52:28.0402686Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py:643 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:28.0403589Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:28.0404003Z 2025-03-14T06:52:33.3252336Z skipping cudagraphs due to deterministic index put. Found from : 2025-03-14T06:52:33.3253373Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/trocr/modeling_trocr.py", line 66, in forward 2025-03-14T06:52:33.3254085Z return super().forward(positions + self.offset) 2025-03-14T06:52:33.3254343Z 2025-03-14T06:52:39.6499830Z Compilation time (from dynamo_timed): 6.114932379000001 2025-03-14T06:52:39.6517846Z pass 2025-03-14T06:52:39.7311370Z TIMING: entire_frame_compile:4.70654 gc:0.00606 _recursive_pre_grad_passes:0.00597 _recursive_joint_graph_passes:0.47713 inductor_compile:3.07721 backend_compile:3.78881 _recursive_post_grad_passes:0.11243 async_compile.precompile:0.23387 async_compile.wait:0.83839 code_gen:2.10153 cudagraphify.get_container:0.18315 pad_mm_benchmark:0.23228 entire_backward_compile:1.40839 CUDAGraphNode.record:5.66096 total_wall_time:6.11493 2025-03-14T06:52:39.7315030Z STATS: call_* op count: 59 | FakeTensorMode.__torch_dispatch__:4514 | FakeTensor.__torch_dispatch__:678 | ProxyTorchDispatchMode.__torch_dispatch__:1934 2025-03-14T06:52:39.7315839Z Dynamo produced 6 graphs covering 59 ops with 6 graph breaks (5 unique) 2025-03-14T06:52:45.1432265Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:52:45.1433472Z warnings.warn( 2025-03-14T06:52:45.3710488Z 2025-03-14T06:52:55.0723126Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:52:55.0723514Z loading model: 0it [00:09, ?it/s] 2025-03-14T06:52:55.0723857Z cuda train XGLMForCausalLM 2025-03-14T06:52:55.1268642Z WARNING:common:fp64 golden ref were not generated for XGLMForCausalLM. 
Setting accuracy check to cosine 2025-03-14T06:52:57.4724765Z 2025-03-14T06:52:57.4725402Z class GraphModule(torch.nn.Module): 2025-03-14T06:52:57.4726449Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_embed_tokens_parameters_weight_: "f32[256008, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_embed_positions_buffers_weights_: "f32[2050, 1024][1024, 1]cuda:0"): 2025-03-14T06:52:57.4727505Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:52:57.4728166Z l_mod_modules_model_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:57.4729500Z l_mod_modules_model_modules_embed_positions_buffers_weights_ = L_mod_modules_model_modules_embed_positions_buffers_weights_ 2025-03-14T06:52:57.4730107Z 2025-03-14T06:52:57.4730380Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4730962Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:52:57.4731496Z 2025-03-14T06:52:57.4732250Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:555 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:52:57.4733265Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:52:57.4733765Z 2025-03-14T06:52:57.4734429Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:564 in forward, code: position_ids = torch.arange( 2025-03-14T06:52:57.4735396Z position_ids: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:57.4735905Z 2025-03-14T06:52:57.4736609Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:570 in forward, code: position_ids = position_ids.unsqueeze(0) 2025-03-14T06:52:57.4737540Z position_ids_1: "i64[1, 128][128, 1]cuda:0" = position_ids.unsqueeze(0); position_ids = None 2025-03-14T06:52:57.4737982Z 2025-03-14T06:52:57.4738755Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:573 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:57.4740303Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:57.4741449Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 32.0; embedding = None 2025-03-14T06:52:57.4741897Z 2025-03-14T06:52:57.4742713Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:57.4743940Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:57.4744644Z 2025-03-14T06:52:57.4745418Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:57.4746385Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device 
= device(type='cuda', index=0)) 2025-03-14T06:52:57.4746822Z 2025-03-14T06:52:57.4747818Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:57.4748715Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:57.4749117Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:52:57.4749586Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:52:57.4750175Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:57.4750624Z 2025-03-14T06:52:57.4751270Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:57.4752169Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:57.4752558Z 2025-03-14T06:52:57.4753395Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:57.4754655Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:57.4755445Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:52:57.4755964Z 2025-03-14T06:52:57.4756224Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4756691Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:52:57.4757125Z 2025-03-14T06:52:57.4757784Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:174 in forward, code: position_ids += self.offset 2025-03-14T06:52:57.4758699Z position_ids_1 += 2; position_ids_2: "i64[1, 128][128, 1]cuda:0" = position_ids_1; position_ids_1 = None 2025-03-14T06:52:57.4759163Z 2025-03-14T06:52:57.4760090Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:181 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, self.weights.shape[-1]).detach() 2025-03-14T06:52:57.4761200Z view_2: "i64[128][1]cuda:0" = position_ids_2.view(-1); position_ids_2 = None 2025-03-14T06:52:57.4762136Z index_select: "f32[128, 1024][1024, 1]cuda:0" = l_mod_modules_model_modules_embed_positions_buffers_weights_.index_select(0, view_2); l_mod_modules_model_modules_embed_positions_buffers_weights_ = view_2 = None 2025-03-14T06:52:57.4763128Z view_3: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = index_select.view(1, 128, 1024); index_select = None 2025-03-14T06:52:57.4763703Z detach: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:52:57.4764113Z 2025-03-14T06:52:57.4764368Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4764840Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:52:57.4765267Z 2025-03-14T06:52:57.4766134Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:586 in forward, code: hidden_states = inputs_embeds + self.embed_positions(position_ids, 
past_key_values_length) 2025-03-14T06:52:57.4767273Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + detach; inputs_embeds = detach = None 2025-03-14T06:52:57.4767774Z 2025-03-14T06:52:57.4769129Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:587 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=float(self.dropout), training=self.training) 2025-03-14T06:52:57.4770635Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:52:57.4771310Z 2025-03-14T06:52:57.4771997Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:616 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:57.4772812Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:57.4773160Z 2025-03-14T06:52:57.4773861Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:617 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:57.4774871Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:57.4775290Z 2025-03-14T06:52:57.4775444Z 2025-03-14T06:52:57.4775578Z class GraphModule(torch.nn.Module): 2025-03-14T06:52:57.4776564Z def forward(self, L_cloned_inputs_input_ids_: "i64[1, 128][128, 1]cuda:0", L_mod_modules_model_modules_embed_tokens_parameters_weight_: "f32[256008, 1024][1024, 1]cuda:0", L_mod_modules_model_modules_embed_positions_buffers_weights_: "f32[2050, 1024][1024, 1]cuda:0"): 2025-03-14T06:52:57.4777591Z l_cloned_inputs_input_ids_ = L_cloned_inputs_input_ids_ 2025-03-14T06:52:57.4778247Z l_mod_modules_model_modules_embed_tokens_parameters_weight_ = L_mod_modules_model_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:57.4779117Z l_mod_modules_model_modules_embed_positions_buffers_weights_ = L_mod_modules_model_modules_embed_positions_buffers_weights_ 2025-03-14T06:52:57.4779701Z 2025-03-14T06:52:57.4779953Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4780533Z _enter_autocast = torch.amp.autocast_mode._enter_autocast('cuda', None, True, None); _enter_autocast = None 2025-03-14T06:52:57.4781069Z 2025-03-14T06:52:57.4781800Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:555 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:52:57.4782815Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_cloned_inputs_input_ids_.view(-1, 128); l_cloned_inputs_input_ids_ = None 2025-03-14T06:52:57.4783314Z 2025-03-14T06:52:57.4783986Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:564 in forward, code: position_ids = torch.arange( 2025-03-14T06:52:57.4784943Z position_ids: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:57.4785460Z 2025-03-14T06:52:57.4786156Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:570 in forward, code: position_ids = position_ids.unsqueeze(0) 2025-03-14T06:52:57.4787089Z position_ids_1: "i64[1, 128][128, 1]cuda:0" = position_ids.unsqueeze(0); position_ids = None 2025-03-14T06:52:57.4787531Z 2025-03-14T06:52:57.4788303Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:573 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:57.4789846Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_mod_modules_model_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_mod_modules_model_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:57.4791039Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 32.0; embedding = None 2025-03-14T06:52:57.4791492Z 2025-03-14T06:52:57.4792397Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:57.4793501Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:57.4794004Z 2025-03-14T06:52:57.4794826Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:57.4795782Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:52:57.4796216Z 2025-03-14T06:52:57.4797015Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:57.4798049Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:57.4798446Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:52:57.4798911Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:52:57.4807345Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:57.4807857Z 2025-03-14T06:52:57.4808531Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:57.4809363Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:57.4809759Z 2025-03-14T06:52:57.4810596Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:57.4811783Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:57.4812589Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:52:57.4813111Z 2025-03-14T06:52:57.4813373Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4813844Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:52:57.4814273Z 2025-03-14T06:52:57.4814939Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:174 in forward, code: position_ids += self.offset 2025-03-14T06:52:57.4815850Z position_ids_1 += 2; position_ids_2: "i64[1, 128][128, 1]cuda:0" = position_ids_1; position_ids_1 = None 
2025-03-14T06:52:57.4816318Z 2025-03-14T06:52:57.4817260Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:181 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, self.weights.shape[-1]).detach() 2025-03-14T06:52:57.4818366Z view_2: "i64[128][1]cuda:0" = position_ids_2.view(-1); position_ids_2 = None 2025-03-14T06:52:57.4819308Z index_select: "f32[128, 1024][1024, 1]cuda:0" = l_mod_modules_model_modules_embed_positions_buffers_weights_.index_select(0, view_2); l_mod_modules_model_modules_embed_positions_buffers_weights_ = view_2 = None 2025-03-14T06:52:57.4820294Z view_3: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = index_select.view(1, 128, 1024); index_select = None 2025-03-14T06:52:57.4820873Z detach: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:52:57.4821283Z 2025-03-14T06:52:57.4821537Z # No stacktrace found for following nodes 2025-03-14T06:52:57.4822209Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:52:57.4822763Z 2025-03-14T06:52:57.4823637Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:586 in forward, code: hidden_states = inputs_embeds + self.embed_positions(position_ids, past_key_values_length) 2025-03-14T06:52:57.4824776Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + detach; inputs_embeds = detach = None 2025-03-14T06:52:57.4825271Z 2025-03-14T06:52:57.4826183Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:587 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=float(self.dropout), training=self.training) 2025-03-14T06:52:57.4827636Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:52:57.4828308Z 2025-03-14T06:52:57.4829008Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:616 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:57.4829851Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:57.4830223Z 2025-03-14T06:52:57.4830931Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:617 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:57.4831839Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:57.4832258Z 2025-03-14T06:52:58.4255543Z 2025-03-14T06:52:58.4256536Z class GraphModule(torch.nn.Module): 2025-03-14T06:52:58.4258505Z def forward(self, L_input_ids_: "i64[1, 128][128, 1]cuda:0", L_self_modules_model_modules_embed_tokens_parameters_weight_: "f32[256008, 1024][1024, 1]cuda:0", L_self_modules_model_modules_embed_positions_buffers_weights_: "f32[2050, 1024][1024, 1]cuda:0"): 2025-03-14T06:52:58.4260258Z l_input_ids_ = L_input_ids_ 2025-03-14T06:52:58.4260895Z l_self_modules_model_modules_embed_tokens_parameters_weight_ = L_self_modules_model_modules_embed_tokens_parameters_weight_ 2025-03-14T06:52:58.4261796Z l_self_modules_model_modules_embed_positions_buffers_weights_ = L_self_modules_model_modules_embed_positions_buffers_weights_ 2025-03-14T06:52:58.4262399Z 2025-03-14T06:52:58.4263146Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:555 in forward, code: input_ids = input_ids.view(-1, input_shape[-1]) 2025-03-14T06:52:58.4264081Z input_ids: "i64[1, 128][128, 1]cuda:0" = l_input_ids_.view(-1, 128); l_input_ids_ = None 2025-03-14T06:52:58.4264510Z 2025-03-14T06:52:58.4265174Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:564 in forward, code: position_ids = torch.arange( 2025-03-14T06:52:58.4266143Z position_ids: "i64[128][1]cuda:0" = torch.arange(0, 128, dtype = torch.int64, device = device(type='cuda', index=0)) 2025-03-14T06:52:58.4266665Z 2025-03-14T06:52:58.4267377Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:570 in forward, code: position_ids = position_ids.unsqueeze(0) 2025-03-14T06:52:58.4268495Z position_ids_1: "i64[1, 128][128, 1]cuda:0" = position_ids.unsqueeze(0); position_ids = None 2025-03-14T06:52:58.4268947Z 2025-03-14T06:52:58.4269725Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:573 in forward, code: inputs_embeds = self.embed_tokens(input_ids) * self.embed_scale 2025-03-14T06:52:58.4271626Z embedding: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.embedding(input_ids, l_self_modules_model_modules_embed_tokens_parameters_weight_, 1, None, 2.0, False, False); input_ids = l_self_modules_model_modules_embed_tokens_parameters_weight_ = None 2025-03-14T06:52:58.4272785Z inputs_embeds: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = embedding * 32.0; embedding = None 2025-03-14T06:52:58.4273235Z 2025-03-14T06:52:58.4274058Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:156 in _make_causal_mask, code: mask = torch.full((tgt_len, tgt_len), torch.finfo(dtype).min, device=device) 2025-03-14T06:52:58.4275231Z mask: "f32[128, 128][128, 1]cuda:0" = torch.full((128, 128), -3.4028234663852886e+38, device = device(type='cuda', index=0)) 2025-03-14T06:52:58.4275738Z 2025-03-14T06:52:58.4276621Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:157 in _make_causal_mask, code: mask_cond = torch.arange(mask.size(-1), device=device) 2025-03-14T06:52:58.4277578Z mask_cond: "i64[128][1]cuda:0" = torch.arange(128, device = device(type='cuda', index=0)) 2025-03-14T06:52:58.4278021Z 2025-03-14T06:52:58.4278815Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:158 in _make_causal_mask, code: mask.masked_fill_(mask_cond < (mask_cond + 1).view(mask.size(-1), 1), 0) 2025-03-14T06:52:58.4279712Z add: "i64[128][1]cuda:0" = mask_cond + 1 2025-03-14T06:52:58.4280109Z view_1: "i64[128, 1][1, 1]cuda:0" = add.view(128, 1); add = None 2025-03-14T06:52:58.4280580Z lt: "b8[128, 128][128, 1]cuda:0" = mask_cond < view_1; mask_cond = view_1 = None 2025-03-14T06:52:58.4281138Z masked_fill_: "f32[128, 128][128, 1]cuda:0" = mask.masked_fill_(lt, 0); lt = masked_fill_ = None 2025-03-14T06:52:58.4281595Z 2025-03-14T06:52:58.4282248Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:160 in _make_causal_mask, code: mask = mask.to(dtype) 2025-03-14T06:52:58.4283077Z mask_1: "f32[128, 128][128, 1]cuda:0" = mask.to(torch.float32); mask = None 2025-03-14T06:52:58.4283474Z 2025-03-14T06:52:58.4284312Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py:172 in _make_causal_mask, code: return mask[None, None, :, :].expand(bsz, 1, tgt_len, tgt_len + past_key_values_length) 2025-03-14T06:52:58.4285480Z getitem: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = mask_1[(None, None, slice(None, None, None), slice(None, None, None))]; mask_1 = None 2025-03-14T06:52:58.4286280Z causal_4d_mask: "f32[1, 1, 128, 128][16384, 16384, 128, 1]cuda:0" = getitem.expand(1, 1, 128, 128); getitem = causal_4d_mask = None 2025-03-14T06:52:58.4286810Z 2025-03-14T06:52:58.4287073Z # No stacktrace found for following nodes 2025-03-14T06:52:58.4287546Z _set_grad_enabled = torch._C._set_grad_enabled(False); _set_grad_enabled = None 2025-03-14T06:52:58.4287979Z 2025-03-14T06:52:58.4288638Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:174 in forward, code: position_ids += self.offset 2025-03-14T06:52:58.4289558Z position_ids_1 += 2; position_ids_2: "i64[1, 128][128, 1]cuda:0" = position_ids_1; position_ids_1 = None 2025-03-14T06:52:58.4290023Z 2025-03-14T06:52:58.4291011Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:181 in forward, code: return self.weights.index_select(0, position_ids.view(-1)).view(bsz, seq_len, self.weights.shape[-1]).detach() 2025-03-14T06:52:58.4292127Z view_2: "i64[128][1]cuda:0" = position_ids_2.view(-1); position_ids_2 = None 2025-03-14T06:52:58.4293084Z index_select: "f32[128, 1024][1024, 1]cuda:0" = l_self_modules_model_modules_embed_positions_buffers_weights_.index_select(0, view_2); l_self_modules_model_modules_embed_positions_buffers_weights_ = view_2 = None 2025-03-14T06:52:58.4294165Z view_3: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = index_select.view(1, 128, 1024); index_select = None 2025-03-14T06:52:58.4294748Z detach: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = view_3.detach(); view_3 = None 2025-03-14T06:52:58.4295159Z 2025-03-14T06:52:58.4295417Z # No stacktrace found for following nodes 2025-03-14T06:52:58.4295895Z _set_grad_enabled_1 = torch._C._set_grad_enabled(True); _set_grad_enabled_1 = None 2025-03-14T06:52:58.4296327Z 2025-03-14T06:52:58.4297198Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:586 in forward, code: hidden_states = inputs_embeds + self.embed_positions(position_ids, past_key_values_length) 2025-03-14T06:52:58.4298420Z hidden_states: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = inputs_embeds + detach; inputs_embeds = detach = None 2025-03-14T06:52:58.4298918Z 2025-03-14T06:52:58.4299837Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:587 in forward, code: hidden_states = nn.functional.dropout(hidden_states, p=float(self.dropout), training=self.training) 2025-03-14T06:52:58.4301249Z hidden_states_1: "f32[1, 128, 1024][131072, 1024, 1]cuda:0" = torch.nn.functional.dropout(hidden_states, p = 0.1, training = True); hidden_states = hidden_states_1 = None 2025-03-14T06:52:58.4301920Z 2025-03-14T06:52:58.4302608Z # File: /opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:616 in forward, code: dropout_probability = torch.rand([]) 2025-03-14T06:52:58.4303435Z dropout_probability: "f32[][]cpu" = torch.rand([]) 2025-03-14T06:52:58.4303788Z 2025-03-14T06:52:58.4304492Z # File: 
/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py:617 in forward, code: if dropout_probability < self.layerdrop: 2025-03-14T06:52:58.4305401Z lt_1: "b8[][]cpu" = dropout_probability < 0.0; dropout_probability = lt_1 = None 2025-03-14T06:52:58.4305827Z 2025-03-14T06:52:59.2386917Z skipping cudagraphs due to mutated inputs (1 instances). Found from : 2025-03-14T06:52:59.2387761Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xglm/modeling_xglm.py", line 174, in forward 2025-03-14T06:52:59.2388433Z position_ids += self.offset 2025-03-14T06:52:59.2388637Z 2025-03-14T06:53:16.3915102Z Compilation time (from dynamo_timed): 5.823909672999999 2025-03-14T06:53:16.3938913Z pass 2025-03-14T06:53:16.4849046Z TIMING: entire_frame_compile:4.66767 gc:0.00539 _recursive_pre_grad_passes:0.00684 _recursive_joint_graph_passes:0.25774 inductor_compile:2.91886 backend_compile:3.72303 async_compile.precompile:0.19401 async_compile.wait:0.82256 cudagraphify.get_container:0.24886 _recursive_post_grad_passes:0.11994 code_gen:1.9221 pad_mm_benchmark:0.01669 entire_backward_compile:1.15624 CUDAGraphNode.record:11.19598 total_wall_time:5.82391 2025-03-14T06:53:16.4851068Z STATS: call_* op count: 67 | FakeTensorMode.__torch_dispatch__:4614 | ProxyTorchDispatchMode.__torch_dispatch__:1964 | FakeTensor.__torch_dispatch__:717 2025-03-14T06:53:16.4851884Z Dynamo produced 6 graphs covering 67 ops with 6 graph breaks (5 unique) 2025-03-14T06:53:21.8898907Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:53:21.8900134Z warnings.warn( 2025-03-14T06:53:22.1882798Z 2025-03-14T06:53:26.7800923Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:53:26.7801651Z loading model: 0it [00:04, ?it/s] 2025-03-14T06:53:26.7802315Z cuda train XLNetLMHeadModel 2025-03-14T06:54:18.1575075Z W0314 06:54:18.156000 30753 site-packages/torch/_inductor/utils.py:1780] [2/0_1] DeviceCopy in input program 2025-03-14T06:54:33.1838541Z skipping cudagraphs due to skipping cudagraphs due to cpu device (cat). 
Found from : 2025-03-14T06:54:33.1839504Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:54:33.1840218Z pred = mod(**cloned_inputs) 2025-03-14T06:54:33.1840869Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1446, in forward 2025-03-14T06:54:33.1841545Z transformer_outputs = self.transformer( 2025-03-14T06:54:33.1842215Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1203, in forward 2025-03-14T06:54:33.1843169Z pos_emb = self.relative_positional_encoding(qlen, klen, bsz=bsz) 2025-03-14T06:54:33.1843997Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1055, in relative_positional_encoding 2025-03-14T06:54:33.1844822Z pos_emb = self.positional_embedding(fwd_pos_seq, inv_freq, bsz) 2025-03-14T06:54:33.1845612Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/xlnet/modeling_xlnet.py", line 1014, in positional_embedding 2025-03-14T06:54:33.1846320Z pos_emb = pos_emb[:, None, :] 2025-03-14T06:54:33.1846523Z 2025-03-14T06:54:53.0213519Z W0314 06:54:53.020000 30753 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:56:14.0261374Z Compilation time (from dynamo_timed): 153.511361665 2025-03-14T06:56:14.0325005Z pass 2025-03-14T06:56:14.2394632Z TIMING: entire_frame_compile:135.63322 gc:0.00606 _recursive_pre_grad_passes:0.05702 pad_mm_benchmark:0.4062 _recursive_joint_graph_passes:4.01262 _recursive_post_grad_passes:1.58626 async_compile.wait:5.14036 code_gen:41.20955 inductor_compile:73.01543 backend_compile:106.25632 entire_backward_compile:17.87814 cudagraphify.get_container:1.67706 CUDAGraphNode.record:1.53679 total_wall_time:153.51136 2025-03-14T06:56:14.2396548Z STATS: call_* op count: 2599 | FakeTensorMode.__torch_dispatch__:163621 | FakeTensor.__torch_dispatch__:29078 | ProxyTorchDispatchMode.__torch_dispatch__:69631 2025-03-14T06:56:14.2397377Z Dynamo produced 2 graphs covering 2599 ops with 5 graph breaks (4 unique) 2025-03-14T06:56:25.4013835Z /opt/conda/envs/py_3.10/lib/python3.10/site-packages/huggingface_hub/file_download.py:797: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`. 2025-03-14T06:56:25.4015039Z warnings.warn( 2025-03-14T06:56:25.6742243Z 2025-03-14T06:56:28.0968326Z loading model: 0it [00:00, ?it/s] 2025-03-14T06:56:28.0968824Z loading model: 0it [00:02, ?it/s] 2025-03-14T06:56:28.0969270Z cuda train YituTechConvBert 2025-03-14T06:57:16.3587451Z skipping cudagraphs due to deterministic index put. 
Found from : 2025-03-14T06:57:16.3590356Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 531, in torch_dynamo_resume_in_forward_and_backward_pass_at_529 2025-03-14T06:57:16.3591187Z pred = mod(**cloned_inputs) 2025-03-14T06:57:16.3591954Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/convbert/modeling_convbert.py", line 918, in forward 2025-03-14T06:57:16.3592732Z generator_hidden_states = self.convbert( 2025-03-14T06:57:16.3593489Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/convbert/modeling_convbert.py", line 834, in forward 2025-03-14T06:57:16.3594317Z hidden_states = self.embeddings( 2025-03-14T06:57:16.3595073Z File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/transformers/models/convbert/modeling_convbert.py", line 230, in forward 2025-03-14T06:57:16.3595901Z inputs_embeds = self.word_embeddings(input_ids) 2025-03-14T06:57:16.3596234Z 2025-03-14T06:57:16.6089405Z W0314 06:57:16.608000 31480 site-packages/torch/_logging/_internal.py:1130] [4/0] Profiler function will be ignored 2025-03-14T06:58:16.6780858Z Compilation time (from dynamo_timed): 100.528478627 2025-03-14T06:58:16.6814777Z pass 2025-03-14T06:58:16.7392937Z TIMING: entire_frame_compile:88.53971 gc:0.00551 _recursive_pre_grad_passes:0.04535 pad_mm_benchmark:0.62762 _recursive_joint_graph_passes:2.509 _recursive_post_grad_passes:2.13371 async_compile.wait:3.55817 code_gen:27.19059 inductor_compile:48.73192 backend_compile:69.24894 cudagraphify.get_container:0.4545 entire_backward_compile:11.98877 CachingAutotuner.benchmark_all_configs:0.02472 CUDAGraphNode.record:2.44121 total_wall_time:100.52848 2025-03-14T06:58:16.7395490Z STATS: call_* op count: 2085 | FakeTensorMode.__torch_dispatch__:93173 | FakeTensor.__torch_dispatch__:20369 | ProxyTorchDispatchMode.__torch_dispatch__:42095 2025-03-14T06:58:16.7396325Z Dynamo produced 2 graphs covering 2085 ops with 5 graph breaks (4 unique) 2025-03-14T06:58:22.6119422Z accuracy pass_rate=93.48% 2025-03-14T06:58:22.6123034Z calls_captured gmean=0.00x mean=951.261x 2025-03-14T06:58:22.6126755Z unique_graphs gmean=0.00x mean=3.217x 2025-03-14T06:58:22.6130505Z graph_breaks gmean=0.00x mean=5.109x 2025-03-14T06:58:22.6134181Z unique_graph_breaks gmean=0.00x mean=4.000x 2025-03-14T06:58:22.6138156Z autograd_captures gmean=0.00x mean=0.000x 2025-03-14T06:58:22.6141826Z autograd_compiles gmean=0.00x mean=0.000x 2025-03-14T06:58:22.6145620Z cudagraph_skips gmean=0.00x mean=0.891x 2025-03-14T06:58:22.6147192Z compilation_latency mean=45.782 seconds 2025-03-14T06:58:24.2270071Z + python benchmarks/dynamo/check_accuracy.py --actual /var/lib/jenkins/workspace/test/test-reports/training_huggingface.csv --expected benchmarks/dynamo/ci_expected_accuracy/inductor_huggingface_training.csv 2025-03-14T06:58:24.4954857Z AlbertForMaskedLM PASS 2025-03-14T06:58:24.4959150Z AlbertForQuestionAnswering PASS 2025-03-14T06:58:24.4964682Z AllenaiLongformerBase PASS 2025-03-14T06:58:24.4970383Z BartForCausalLM PASS 2025-03-14T06:58:24.4975600Z BartForConditionalGeneration PASS 2025-03-14T06:58:24.4980964Z BertForMaskedLM PASS 2025-03-14T06:58:24.4986322Z BertForQuestionAnswering PASS 2025-03-14T06:58:24.4991770Z BlenderbotForCausalLM XFAIL 2025-03-14T06:58:24.4997194Z BlenderbotSmallForCausalLM PASS 2025-03-14T06:58:24.5002578Z BlenderbotSmallForConditionalGeneration PASS 2025-03-14T06:58:24.5007626Z CamemBert PASS 2025-03-14T06:58:24.5013017Z DebertaForMaskedLM PASS 2025-03-14T06:58:24.5018391Z 
DebertaForQuestionAnswering PASS 2025-03-14T06:58:24.5023708Z DebertaV2ForMaskedLM XFAIL 2025-03-14T06:58:24.5028951Z DebertaV2ForQuestionAnswering XFAIL 2025-03-14T06:58:24.5034136Z DistilBertForMaskedLM PASS 2025-03-14T06:58:24.5039644Z DistilBertForQuestionAnswering PASS 2025-03-14T06:58:24.5044821Z DistillGPT2 PASS 2025-03-14T06:58:24.5050029Z ElectraForCausalLM PASS 2025-03-14T06:58:24.5055320Z ElectraForQuestionAnswering PASS 2025-03-14T06:58:24.5060499Z GPT2ForSequenceClassification PASS 2025-03-14T06:58:24.5065712Z GoogleFnet PASS 2025-03-14T06:58:24.5071431Z LayoutLMForMaskedLM PASS 2025-03-14T06:58:24.5076980Z LayoutLMForSequenceClassification PASS 2025-03-14T06:58:24.5082184Z M2M100ForConditionalGeneration PASS 2025-03-14T06:58:24.5087380Z MBartForCausalLM PASS 2025-03-14T06:58:24.5092639Z MBartForConditionalGeneration PASS 2025-03-14T06:58:24.5097834Z MT5ForConditionalGeneration PASS 2025-03-14T06:58:24.5103013Z MegatronBertForCausalLM PASS 2025-03-14T06:58:24.5108189Z MegatronBertForQuestionAnswering PASS 2025-03-14T06:58:24.5113354Z MobileBertForMaskedLM PASS 2025-03-14T06:58:24.5119146Z MobileBertForQuestionAnswering PASS 2025-03-14T06:58:24.5123911Z OPTForCausalLM PASS 2025-03-14T06:58:24.5129110Z PLBartForCausalLM PASS 2025-03-14T06:58:24.5134330Z PLBartForConditionalGeneration PASS 2025-03-14T06:58:24.5139582Z PegasusForCausalLM PASS 2025-03-14T06:58:24.5144804Z PegasusForConditionalGeneration PASS 2025-03-14T06:58:24.5150024Z RobertaForCausalLM PASS 2025-03-14T06:58:24.5155445Z RobertaForQuestionAnswering PASS 2025-03-14T06:58:24.5160719Z Speech2Text2ForCausalLM PASS 2025-03-14T06:58:24.5165938Z T5ForConditionalGeneration PASS 2025-03-14T06:58:24.5171655Z T5Small PASS 2025-03-14T06:58:24.5176859Z TrOCRForCausalLM PASS 2025-03-14T06:58:24.5182166Z XGLMForCausalLM PASS 2025-03-14T06:58:24.5187377Z XLNetLMHeadModel PASS 2025-03-14T06:58:24.5192552Z YituTechConvBert PASS 2025-03-14T06:58:24.5640136Z + python benchmarks/dynamo/check_graph_breaks.py --actual /var/lib/jenkins/workspace/test/test-reports/training_huggingface.csv --expected benchmarks/dynamo/ci_expected_accuracy/inductor_huggingface_training.csv 2025-03-14T06:58:24.8314013Z AlbertForMaskedLM PASS 2025-03-14T06:58:24.8317965Z AlbertForQuestionAnswering PASS 2025-03-14T06:58:24.8322613Z AllenaiLongformerBase PASS 2025-03-14T06:58:24.8327217Z BartForCausalLM PASS 2025-03-14T06:58:24.8331953Z BartForConditionalGeneration PASS 2025-03-14T06:58:24.8336696Z BertForMaskedLM PASS 2025-03-14T06:58:24.8345360Z BertForQuestionAnswering PASS 2025-03-14T06:58:24.8345932Z BlenderbotForCausalLM PASS 2025-03-14T06:58:24.8350993Z BlenderbotSmallForCausalLM PASS 2025-03-14T06:58:24.8355762Z BlenderbotSmallForConditionalGeneration PASS 2025-03-14T06:58:24.8360421Z CamemBert PASS 2025-03-14T06:58:24.8365282Z DebertaForMaskedLM PASS 2025-03-14T06:58:24.8370132Z DebertaForQuestionAnswering PASS 2025-03-14T06:58:24.8375127Z DebertaV2ForMaskedLM PASS 2025-03-14T06:58:24.8379746Z DebertaV2ForQuestionAnswering PASS 2025-03-14T06:58:24.8384464Z DistilBertForMaskedLM PASS 2025-03-14T06:58:24.8389106Z DistilBertForQuestionAnswering PASS 2025-03-14T06:58:24.8393684Z DistillGPT2 PASS 2025-03-14T06:58:24.8398774Z ElectraForCausalLM PASS 2025-03-14T06:58:24.8403270Z ElectraForQuestionAnswering PASS 2025-03-14T06:58:24.8407832Z GPT2ForSequenceClassification PASS 2025-03-14T06:58:24.8412485Z GoogleFnet PASS 2025-03-14T06:58:24.8417162Z LayoutLMForMaskedLM PASS 2025-03-14T06:58:24.8421795Z LayoutLMForSequenceClassification PASS 
2025-03-14T06:58:24.8426426Z M2M100ForConditionalGeneration PASS 2025-03-14T06:58:24.8431055Z MBartForCausalLM PASS 2025-03-14T06:58:24.8435790Z MBartForConditionalGeneration PASS 2025-03-14T06:58:24.8440485Z MT5ForConditionalGeneration PASS 2025-03-14T06:58:24.8445091Z MegatronBertForCausalLM PASS 2025-03-14T06:58:24.8449790Z MegatronBertForQuestionAnswering PASS 2025-03-14T06:58:24.8454382Z MobileBertForMaskedLM PASS 2025-03-14T06:58:24.8459038Z MobileBertForQuestionAnswering PASS 2025-03-14T06:58:24.8463749Z OPTForCausalLM PASS 2025-03-14T06:58:24.8468588Z PLBartForCausalLM PASS 2025-03-14T06:58:24.8474817Z PLBartForConditionalGeneration PASS 2025-03-14T06:58:24.8479522Z PegasusForCausalLM PASS 2025-03-14T06:58:24.8484161Z PegasusForConditionalGeneration PASS 2025-03-14T06:58:24.8488726Z RobertaForCausalLM PASS 2025-03-14T06:58:24.8493323Z RobertaForQuestionAnswering PASS 2025-03-14T06:58:24.8497903Z Speech2Text2ForCausalLM PASS 2025-03-14T06:58:24.8504047Z T5ForConditionalGeneration PASS 2025-03-14T06:58:24.8507139Z T5Small PASS 2025-03-14T06:58:24.8511766Z TrOCRForCausalLM PASS 2025-03-14T06:58:24.8516555Z XGLMForCausalLM PASS_BUT_FLAKY 2025-03-14T06:58:24.8521153Z XLNetLMHeadModel PASS 2025-03-14T06:58:24.8525760Z YituTechConvBert PASS 2025-03-14T06:58:24.8962500Z + cleanup_workspace 2025-03-14T06:58:24.8963272Z + echo 'sudo may print the following warning message that can be ignored. The chown command will still run.' 2025-03-14T06:58:24.8964244Z sudo may print the following warning message that can be ignored. The chown command will still run. 2025-03-14T06:58:24.8965236Z + echo ' sudo: setrlimit(RLIMIT_STACK): Operation not permitted' 2025-03-14T06:58:24.8965708Z sudo: setrlimit(RLIMIT_STACK): Operation not permitted 2025-03-14T06:58:24.8966241Z + echo 'For more details refer to https://github.com/sudo-project/sudo/issues/42' 2025-03-14T06:58:24.8966837Z For more details refer to https://github.com/sudo-project/sudo/issues/42 2025-03-14T06:58:24.8967313Z + sudo chown -R 1000 /var/lib/jenkins/workspace 2025-03-14T06:58:25.7316042Z ##[group]Run pytorch/test-infra/.github/actions/upload-benchmark-results@main 2025-03-14T06:58:25.7316532Z with: 2025-03-14T06:58:25.7316813Z benchmark-results-dir: test/test-reports 2025-03-14T06:58:25.7317151Z dry-run: false 2025-03-14T06:58:25.7317412Z schema-version: v3 2025-03-14T06:58:25.7317869Z github-token: *** 2025-03-14T06:58:25.7318124Z env: 2025-03-14T06:58:25.7318365Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:25.7318769Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:25.7319350Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:25.7319851Z ##[endgroup] 2025-03-14T06:58:25.7347265Z ##[group]Run set -eux 2025-03-14T06:58:25.7347544Z set -eux 2025-03-14T06:58:25.7347842Z python3 -mpip install boto3==1.35.33 2025-03-14T06:58:25.7361852Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:25.7362230Z env: 2025-03-14T06:58:25.7362501Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:25.7362844Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:25.7363400Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:25.7363898Z ##[endgroup] 2025-03-14T06:58:25.7401001Z + python3 -mpip install boto3==1.35.33 2025-03-14T06:58:26.0412671Z Defaulting to user installation because normal site-packages is not writeable 2025-03-14T06:58:27.2757563Z Collecting boto3==1.35.33 
2025-03-14T06:58:27.2982222Z Downloading boto3-1.35.33-py3-none-any.whl (139 kB) 2025-03-14T06:58:27.3181548Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.33) (0.10.0) 2025-03-14T06:58:27.3580864Z Collecting s3transfer<0.11.0,>=0.10.0 2025-03-14T06:58:27.3611376Z Downloading s3transfer-0.10.4-py3-none-any.whl (83 kB) 2025-03-14T06:58:28.6466212Z Collecting botocore<1.36.0,>=1.35.33 2025-03-14T06:58:28.6498076Z Downloading botocore-1.35.99-py3-none-any.whl (13.3 MB) 2025-03-14T06:58:28.8088239Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.25.10) 2025-03-14T06:58:28.8103883Z Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (2.8.1) 2025-03-14T06:58:29.0032927Z Requirement already satisfied: six>=1.5 in /usr/lib/python3.9/site-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.15.0) 2025-03-14T06:58:29.0938680Z Installing collected packages: botocore, s3transfer, boto3 2025-03-14T06:58:29.6797231Z Successfully installed boto3-1.35.33 botocore-1.35.99 s3transfer-0.10.4 2025-03-14T06:58:29.8035689Z ##[group]Run set -eux 2025-03-14T06:58:29.8035998Z set -eux 2025-03-14T06:58:29.8036256Z  2025-03-14T06:58:29.8036521Z if [[ -z "${GITHUB_TOKEN}" ]]; then 2025-03-14T06:58:29.8036909Z  echo "Missing github-token input" 2025-03-14T06:58:29.8037252Z  exit 1 2025-03-14T06:58:29.8037498Z fi 2025-03-14T06:58:29.8046203Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:29.8046617Z env: 2025-03-14T06:58:29.8046869Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:29.8047227Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:29.8047793Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:29.8048641Z GITHUB_TOKEN: *** 2025-03-14T06:58:29.8049137Z ##[endgroup] 2025-03-14T06:58:29.8081031Z + [[ -z *** ]] 2025-03-14T06:58:29.8127487Z ##[group]Run pytorch/test-infra/.github/actions/get-workflow-job-id@main 2025-03-14T06:58:29.8127934Z with: 2025-03-14T06:58:29.8128311Z github-token: *** 2025-03-14T06:58:29.8128579Z env: 2025-03-14T06:58:29.8128830Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:29.8129192Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:29.8129993Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:29.8130511Z ##[endgroup] 2025-03-14T06:58:29.8149428Z ##[group]Run set -eux 2025-03-14T06:58:29.8149747Z set -eux 2025-03-14T06:58:29.8150010Z  2025-03-14T06:58:29.8150513Z python3 "${GITHUB_ACTION_PATH}/../../scripts/get_workflow_job_id.py" "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-03-14T06:58:29.8159478Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:29.8159859Z env: 2025-03-14T06:58:29.8160106Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:29.8160458Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:29.8161020Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:29.8161824Z GITHUB_TOKEN: *** 2025-03-14T06:58:29.8162085Z ##[endgroup] 2025-03-14T06:58:29.8189006Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/get-workflow-job-id/../../scripts/get_workflow_job_id.py 13849515380 i-0995e781c94ad14d3 
2025-03-14T06:58:31.0666689Z setting job-id=38756916179 2025-03-14T06:58:31.0667280Z setting job-name=cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T06:58:31.0770438Z ##[group]Run set -eux 2025-03-14T06:58:31.0770777Z set -eux 2025-03-14T06:58:31.0771018Z  2025-03-14T06:58:31.0771413Z python3 "${GITHUB_ACTION_PATH}/../../scripts/benchmarks/gather_metadata.py" \ 2025-03-14T06:58:31.0771946Z  --schema-version "${SCHEMA_VERSION}" \ 2025-03-14T06:58:31.0772301Z  --repo "${REPO}" \ 2025-03-14T06:58:31.0772618Z  --head-branch "${HEAD_BRANCH}" \ 2025-03-14T06:58:31.0772961Z  --head-sha "${HEAD_SHA}" \ 2025-03-14T06:58:31.0773309Z  --workflow-id "${WORKFLOW_RUN_ID}" \ 2025-03-14T06:58:31.0773673Z  --run-attempt "${RUN_ATTEMPT}" \ 2025-03-14T06:58:31.0774018Z  --job-id "${JOB_ID}" \ 2025-03-14T06:58:31.0774326Z  --job-name "${JOB_NAME}" 2025-03-14T06:58:31.0783256Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.0783630Z env: 2025-03-14T06:58:31.0783868Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.0784211Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:31.0784766Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.0785265Z SCHEMA_VERSION: v3 2025-03-14T06:58:31.0785524Z REPO: pytorch/pytorch 2025-03-14T06:58:31.0785816Z HEAD_BRANCH: refs/heads/main 2025-03-14T06:58:31.0786147Z HEAD_SHA: aed0b7a742a2d7b7901790622829cbd2135049a4 2025-03-14T06:58:31.0786499Z WORKFLOW_RUN_ID: 13849515380 2025-03-14T06:58:31.0786789Z RUN_ATTEMPT: 1 2025-03-14T06:58:31.0787034Z JOB_ID: 38756916179 2025-03-14T06:58:31.0787501Z JOB_NAME: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T06:58:31.0788018Z ##[endgroup] 2025-03-14T06:58:31.0817797Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/benchmarks/gather_metadata.py --schema-version v3 --repo pytorch/pytorch --head-branch refs/heads/main --head-sha aed0b7a742a2d7b7901790622829cbd2135049a4 --workflow-id 13849515380 --run-attempt 1 --job-id 38756916179 --job-name 'cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)' 2025-03-14T06:58:31.1145238Z ##[group]Run set -eux 2025-03-14T06:58:31.1145682Z set -eux 2025-03-14T06:58:31.1145926Z  2025-03-14T06:58:31.1146196Z # TODO (huydhn): Implement this part 2025-03-14T06:58:31.1146570Z echo "runners=[]" >> "${GITHUB_OUTPUT}" 2025-03-14T06:58:31.1154937Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.1155322Z env: 2025-03-14T06:58:31.1155562Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.1156088Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:31.1156656Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.1157164Z ##[endgroup] 2025-03-14T06:58:31.1185867Z + echo 'runners=[]' 2025-03-14T06:58:31.1220761Z ##[group]Run set -eux 2025-03-14T06:58:31.1221068Z set -eux 2025-03-14T06:58:31.1221314Z  2025-03-14T06:58:31.1221595Z # TODO (huydhn): Implement this part 2025-03-14T06:58:31.1221999Z echo "dependencies={}" >> "${GITHUB_OUTPUT}" 2025-03-14T06:58:31.1233968Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.1234471Z env: 2025-03-14T06:58:31.1234712Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.1235069Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 
2025-03-14T06:58:31.1235632Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.1236135Z ##[endgroup] 2025-03-14T06:58:31.1262983Z + echo 'dependencies={}' 2025-03-14T06:58:31.1289311Z ##[group]Run set -eux 2025-03-14T06:58:31.1289639Z set -eux 2025-03-14T06:58:31.1289919Z  2025-03-14T06:58:31.1290215Z if [[ ! -d "${BENCHMARK_RESULTS_DIR}" ]]; then 2025-03-14T06:58:31.1290670Z  echo "${BENCHMARK_RESULTS_DIR} does not exist, skipping" 2025-03-14T06:58:31.1291173Z  # We don't want the job to fail if the directory doesn't exist 2025-03-14T06:58:31.1291580Z  exit 0 2025-03-14T06:58:31.1291827Z fi 2025-03-14T06:58:31.1292080Z  2025-03-14T06:58:31.1292346Z if [[ "${DRY_RUN}" == "true" ]]; then 2025-03-14T06:58:31.1292836Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-03-14T06:58:31.1293392Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-03-14T06:58:31.1293830Z  --metadata "${BENCHMARK_METADATA}" \ 2025-03-14T06:58:31.1294207Z  --runners "${RUNNER_INFO}" \ 2025-03-14T06:58:31.1294584Z  --dependencies "${DEPENDENCIES}" \ 2025-03-14T06:58:31.1294937Z  --dry-run 2025-03-14T06:58:31.1295213Z else 2025-03-14T06:58:31.1295610Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-03-14T06:58:31.1296161Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-03-14T06:58:31.1296596Z  --metadata "${BENCHMARK_METADATA}" \ 2025-03-14T06:58:31.1296956Z  --runners "${RUNNER_INFO}" \ 2025-03-14T06:58:31.1297332Z  --dependencies "${DEPENDENCIES}" 2025-03-14T06:58:31.1297668Z fi 2025-03-14T06:58:31.1305562Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.1305934Z env: 2025-03-14T06:58:31.1306175Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.1306525Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:31.1307092Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.1307641Z BENCHMARK_RESULTS_DIR: test/test-reports 2025-03-14T06:58:31.1307969Z DRY_RUN: false 2025-03-14T06:58:31.1309222Z BENCHMARK_METADATA: {"timestamp": 1741935511, "schema_version": "v3", "name": "cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "aed0b7a742a2d7b7901790622829cbd2135049a4", "workflow_id": 13849515380, "run_attempt": 1, "job_id": 38756916179} 2025-03-14T06:58:31.1310709Z RUNNER_INFO: [] 2025-03-14T06:58:31.1310971Z DEPENDENCIES: {} 2025-03-14T06:58:31.1311231Z ##[endgroup] 2025-03-14T06:58:31.1337586Z + [[ ! 
-d test/test-reports ]] 2025-03-14T06:58:31.1337901Z + [[ false == \t\r\u\e ]] 2025-03-14T06:58:31.1340284Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/upload_benchmark_results.py --benchmark-results-dir test/test-reports --metadata '{"timestamp": 1741935511, "schema_version": "v3", "name": "cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "aed0b7a742a2d7b7901790622829cbd2135049a4", "workflow_id": 13849515380, "run_attempt": 1, "job_id": 38756916179}' --runners '[]' --dependencies '{}' 2025-03-14T06:58:31.2911895Z INFO:root:Upload test/test-reports/inference_huggingface.json to s3://ossci-benchmarks/v3/pytorch/pytorch/13849515380/38756916179/inference_huggingface.json 2025-03-14T06:58:31.3329121Z INFO:botocore.credentials:Found credentials from IAM Role: gh-ci-github-action-runners-runner-role 2025-03-14T06:58:31.5094566Z INFO:root:Upload test/test-reports/inference_huggingface_graph_breaks.json to s3://ossci-benchmarks/v3/pytorch/pytorch/13849515380/38756916179/inference_huggingface_graph_breaks.json 2025-03-14T06:58:31.6063338Z INFO:root:Upload test/test-reports/training_huggingface.json to s3://ossci-benchmarks/v3/pytorch/pytorch/13849515380/38756916179/training_huggingface.json 2025-03-14T06:58:31.7535508Z INFO:root:Upload test/test-reports/training_huggingface_graph_breaks.json to s3://ossci-benchmarks/v3/pytorch/pytorch/13849515380/38756916179/training_huggingface_graph_breaks.json 2025-03-14T06:58:31.9613477Z ##[group]Run cat test/**/*_toprint.log || true 2025-03-14T06:58:31.9613876Z cat test/**/*_toprint.log || true 2025-03-14T06:58:31.9622640Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.9623019Z env: 2025-03-14T06:58:31.9623277Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.9623622Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:31.9624183Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.9624682Z ##[endgroup] 2025-03-14T06:58:31.9714155Z cat: 'test/**/*_toprint.log': No such file or directory 2025-03-14T06:58:31.9746886Z ##[group]Run kill "$MONITOR_SCRIPT_PID" 2025-03-14T06:58:31.9747257Z kill "$MONITOR_SCRIPT_PID" 2025-03-14T06:58:31.9755396Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:31.9755772Z env: 2025-03-14T06:58:31.9756011Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:31.9756357Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:31.9756901Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:31.9757411Z MONITOR_SCRIPT_PID: 54095 2025-03-14T06:58:31.9757690Z ##[endgroup] 2025-03-14T06:58:31.9910945Z Prepare all required actions 2025-03-14T06:58:31.9911370Z Getting action download info 2025-03-14T06:58:32.1131469Z Download action repository 'actions/upload-artifact@v4' (SHA:4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1) 2025-03-14T06:58:32.5245704Z ##[group]Run ./.github/actions/upload-test-artifacts 2025-03-14T06:58:32.5246072Z with: 2025-03-14T06:58:32.5246492Z file-suffix: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T06:58:32.5247003Z s3-bucket: gha-artifacts 2025-03-14T06:58:32.5247279Z env: 2025-03-14T06:58:32.5247522Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.5247872Z GPU_FLAG: --gpus all -e 
NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.5248431Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.5248936Z ##[endgroup] 2025-03-14T06:58:32.5277973Z ##[group]Run # Remove any previous test jsons if they exist 2025-03-14T06:58:32.5278459Z # Remove any previous test jsons if they exist 2025-03-14T06:58:32.5279003Z rm -f test-jsons-*.zip 2025-03-14T06:58:32.5279424Z zip -r "test-jsons-${FILE_SUFFIX}.zip" test/test-reports -i '*.json' 2025-03-14T06:58:32.5288869Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:32.5289246Z env: 2025-03-14T06:58:32.5289478Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.5289818Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.5290372Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.5291060Z FILE_SUFFIX: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T06:58:32.5291542Z ##[endgroup] 2025-03-14T06:58:32.5492886Z adding: test/test-reports/inference_huggingface.json (deflated 99%) 2025-03-14T06:58:32.5495755Z adding: test/test-reports/inference_huggingface_graph_breaks.json (deflated 96%) 2025-03-14T06:58:32.5562628Z adding: test/test-reports/training_huggingface.json (deflated 99%) 2025-03-14T06:58:32.5704803Z adding: test/test-reports/training_huggingface_graph_breaks.json (deflated 98%) 2025-03-14T06:58:32.5736835Z ##[group]Run # Remove any previous test reports if they exist 2025-03-14T06:58:32.5737323Z # Remove any previous test reports if they exist 2025-03-14T06:58:32.5737720Z rm -f test-reports-*.zip 2025-03-14T06:58:32.5738202Z zip -r "test-reports-${FILE_SUFFIX}.zip" test/test-reports -i '*.xml' -i '*.csv' 2025-03-14T06:58:32.5747250Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:32.5747632Z env: 2025-03-14T06:58:32.5747870Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.5748223Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.5748784Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.5749466Z FILE_SUFFIX: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T06:58:32.5749966Z ##[endgroup] 2025-03-14T06:58:32.5801368Z adding: test/test-reports/inference_huggingface.csv (deflated 70%) 2025-03-14T06:58:32.5801970Z adding: test/test-reports/inference_huggingface_graph_breaks.csv (deflated 85%) 2025-03-14T06:58:32.5802617Z adding: test/test-reports/inference_huggingface_graph_break_deduped.csv (deflated 63%) 2025-03-14T06:58:32.5803226Z adding: test/test-reports/training_huggingface.csv (deflated 68%) 2025-03-14T06:58:32.5813074Z adding: test/test-reports/training_huggingface_graph_breaks.csv (deflated 97%) 2025-03-14T06:58:32.5813905Z adding: test/test-reports/training_huggingface_graph_break_deduped.csv (deflated 72%) 2025-03-14T06:58:32.5869983Z ##[group]Run # Remove any previous usage logs if they exist 2025-03-14T06:58:32.5870465Z # Remove any previous usage logs if they exist 2025-03-14T06:58:32.5870875Z rm -f logs-*.zip 2025-03-14T06:58:32.5871394Z # this workflow is also run in bazel build test, but we dont generate usage reports for it 2025-03-14T06:58:32.5872208Z # so check to see if the file exists first 2025-03-14T06:58:32.5872590Z if [ -f 'usage_log.txt' ]; then 2025-03-14T06:58:32.5872966Z  zip "logs-${FILE_SUFFIX}.zip" 'usage_log.txt' 2025-03-14T06:58:32.5873319Z fi 2025-03-14T06:58:32.5873694Z if find 
"test/test-reports" -name "*.log" 2>/dev/null | grep -q .; then 2025-03-14T06:58:32.5874232Z  zip -r "logs-${FILE_SUFFIX}.zip" test/test-reports -i '*.log' 2025-03-14T06:58:32.5874689Z fi 2025-03-14T06:58:32.5882774Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:32.5883155Z env: 2025-03-14T06:58:32.5883393Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.5883745Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.5884300Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.5884987Z FILE_SUFFIX: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T06:58:32.5885629Z ##[endgroup] 2025-03-14T06:58:32.5976080Z adding: usage_log.txt (deflated 97%) 2025-03-14T06:58:32.6025135Z ##[group]Run # Remove any previous debugging artifacts if they exist 2025-03-14T06:58:32.6025671Z # Remove any previous debugging artifacts if they exist 2025-03-14T06:58:32.6026088Z rm -f debug-*.zip 2025-03-14T06:58:32.6026386Z if [ -d 'test/debug' ]; then 2025-03-14T06:58:32.6026761Z  zip -r "debug-${FILE_SUFFIX}.zip" test/debug 2025-03-14T06:58:32.6027116Z fi 2025-03-14T06:58:32.6035619Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:32.6035989Z env: 2025-03-14T06:58:32.6036240Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.6036592Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.6037151Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.6037845Z FILE_SUFFIX: test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179 2025-03-14T06:58:32.6038342Z ##[endgroup] 2025-03-14T06:58:32.6124360Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-03-14T06:58:32.6124713Z with: 2025-03-14T06:58:32.6124953Z s3-bucket: gha-artifacts 2025-03-14T06:58:32.6125293Z s3-prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:32.6125663Z retention-days: 14 2025-03-14T06:58:32.6125942Z if-no-files-found: warn 2025-03-14T06:58:32.6126241Z path: test-jsons-*.zip 2025-03-14T06:58:32.6126518Z name: artifact 2025-03-14T06:58:32.6126769Z region: us-east-1 2025-03-14T06:58:32.6127016Z env: 2025-03-14T06:58:32.6127253Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:32.6127613Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:32.6128187Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:32.6128703Z ##[endgroup] 2025-03-14T06:58:32.9724001Z NOTE: s3-prefix specified, ignoring name parameter 2025-03-14T06:58:32.9724487Z With the provided path, there will be 1 file uploaded 2025-03-14T06:58:32.9724956Z Uploading to s3 prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:32.9779775Z Starting upload of test-jsons-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:33.1282189Z Finished upload of test-jsons-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:33.1572372Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-03-14T06:58:33.1572722Z with: 2025-03-14T06:58:33.1572966Z s3-bucket: gha-artifacts 2025-03-14T06:58:33.1573307Z s3-prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:33.1573666Z retention-days: 14 2025-03-14T06:58:33.1573946Z if-no-files-found: error 2025-03-14T06:58:33.1574247Z path: test-reports-*.zip 2025-03-14T06:58:33.1574533Z name: artifact 2025-03-14T06:58:33.1574786Z region: us-east-1 
2025-03-14T06:58:33.1575034Z env: 2025-03-14T06:58:33.1575267Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:33.1575831Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:33.1576413Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:33.1576915Z ##[endgroup] 2025-03-14T06:58:33.4930863Z NOTE: s3-prefix specified, ignoring name parameter 2025-03-14T06:58:33.4931505Z With the provided path, there will be 1 file uploaded 2025-03-14T06:58:33.4932097Z Uploading to s3 prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:33.4985980Z Starting upload of test-reports-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:33.6447410Z Finished upload of test-reports-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:33.6738527Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-03-14T06:58:33.6738877Z with: 2025-03-14T06:58:33.6739124Z s3-bucket: gha-artifacts 2025-03-14T06:58:33.6739467Z s3-prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:33.6739844Z retention-days: 14 2025-03-14T06:58:33.6740310Z if-no-files-found: ignore 2025-03-14T06:58:33.6740607Z path: logs-*.zip 2025-03-14T06:58:33.6740863Z name: artifact 2025-03-14T06:58:33.6741116Z region: us-east-1 2025-03-14T06:58:33.6741365Z env: 2025-03-14T06:58:33.6741599Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:33.6741953Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:33.6742522Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:33.6743021Z ##[endgroup] 2025-03-14T06:58:34.0034307Z NOTE: s3-prefix specified, ignoring name parameter 2025-03-14T06:58:34.0034833Z With the provided path, there will be 1 file uploaded 2025-03-14T06:58:34.0035286Z Uploading to s3 prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:34.0089438Z Starting upload of logs-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:34.1947548Z Finished upload of logs-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:34.2239521Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-03-14T06:58:34.2239872Z with: 2025-03-14T06:58:34.2240113Z s3-bucket: gha-artifacts 2025-03-14T06:58:34.2240453Z s3-prefix: pytorch/pytorch/13849515380/1/artifact 2025-03-14T06:58:34.2240813Z retention-days: 14 2025-03-14T06:58:34.2241090Z if-no-files-found: ignore 2025-03-14T06:58:34.2241391Z path: debug-*.zip 2025-03-14T06:58:34.2241651Z name: artifact 2025-03-14T06:58:34.2241903Z region: us-east-1 2025-03-14T06:58:34.2242148Z env: 2025-03-14T06:58:34.2242380Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:34.2242730Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:34.2243296Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:34.2243798Z ##[endgroup] 2025-03-14T06:58:34.5474935Z No files were found with the provided path: debug-*.zip. No artifacts will be uploaded. 2025-03-14T06:58:34.5774402Z ##[group]Run # shellcheck disable=SC2156 2025-03-14T06:58:34.5774807Z # shellcheck disable=SC2156 2025-03-14T06:58:34.5775372Z find . 
-iname "core.[1-9]*" -exec docker exec "${DOCKER_CONTAINER_ID}" sh -c "gdb python {} -ex 'bt' -ex 'q'" \; 2025-03-14T06:58:34.5784940Z shell: /usr/bin/bash -e {0} 2025-03-14T06:58:34.5785215Z env: 2025-03-14T06:58:34.5785446Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:34.5785782Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:34.5786337Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:34.5786831Z ##[endgroup] 2025-03-14T06:58:34.8616840Z Prepare all required actions 2025-03-14T06:58:34.8617231Z Getting action download info 2025-03-14T06:58:34.9636907Z ##[group]Run ./.github/actions/upload-utilization-stats 2025-03-14T06:58:34.9637274Z with: 2025-03-14T06:58:34.9637506Z job_id: 38756916179 2025-03-14T06:58:34.9637993Z job_name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T06:58:34.9638540Z workflow_name: inductor 2025-03-14T06:58:34.9638829Z workflow_run_id: 13849515380 2025-03-14T06:58:34.9639121Z workflow_attempt: 1 2025-03-14T06:58:34.9639381Z env: 2025-03-14T06:58:34.9639615Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:34.9639970Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:34.9640567Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:34.9641084Z ##[endgroup] 2025-03-14T06:58:34.9662096Z ##[group]Run echo "workflow_id: 13849515380" 2025-03-14T06:58:34.9662522Z echo "workflow_id: 13849515380" 2025-03-14T06:58:34.9662861Z echo "workflow_attempt: 1" 2025-03-14T06:58:34.9663203Z echo "workflow_Name: inductor" 2025-03-14T06:58:34.9663535Z echo "job_id: 38756916179" 2025-03-14T06:58:34.9664101Z echo "job_name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)" 2025-03-14T06:58:34.9673456Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:34.9673839Z env: 2025-03-14T06:58:34.9674081Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:34.9674494Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:34.9675058Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:34.9675555Z ##[endgroup] 2025-03-14T06:58:34.9706502Z workflow_id: 13849515380 2025-03-14T06:58:34.9706812Z workflow_attempt: 1 2025-03-14T06:58:34.9707085Z workflow_Name: inductor 2025-03-14T06:58:34.9707356Z job_id: 38756916179 2025-03-14T06:58:34.9707890Z job_name: cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu) 2025-03-14T06:58:34.9747036Z ##[group]Run nick-fields/retry@v3.0.0 2025-03-14T06:58:34.9747345Z with: 2025-03-14T06:58:34.9747566Z shell: bash 2025-03-14T06:58:34.9747806Z timeout_minutes: 5 2025-03-14T06:58:34.9748066Z max_attempts: 5 2025-03-14T06:58:34.9748331Z retry_wait_seconds: 30 2025-03-14T06:58:34.9748793Z command: set -eu python3 -m pip install python-dateutil==2.8.2 boto3==1.35.42 pandas==2.1.3 2025-03-14T06:58:34.9749295Z polling_interval_seconds: 1 2025-03-14T06:58:34.9749592Z warning_on_retry: true 2025-03-14T06:58:34.9749871Z continue_on_error: false 2025-03-14T06:58:34.9750139Z env: 2025-03-14T06:58:34.9750369Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:34.9750712Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:34.9751270Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:34.9751775Z ##[endgroup] 2025-03-14T06:58:35.3061434Z Defaulting to user installation 
because normal site-packages is not writeable 2025-03-14T06:58:35.3904260Z Collecting python-dateutil==2.8.2 2025-03-14T06:58:35.4134132Z Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) 2025-03-14T06:58:36.4498402Z Collecting boto3==1.35.42 2025-03-14T06:58:36.4534338Z Downloading boto3-1.35.42-py3-none-any.whl (139 kB) 2025-03-14T06:58:37.0075818Z Collecting pandas==2.1.3 2025-03-14T06:58:37.0160765Z Downloading pandas-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.3 MB) 2025-03-14T06:58:37.1418708Z Requirement already satisfied: six>=1.5 in /usr/lib/python3.9/site-packages (from python-dateutil==2.8.2) (1.15.0) 2025-03-14T06:58:37.1463393Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.0) 2025-03-14T06:58:37.1467451Z Requirement already satisfied: botocore<1.36.0,>=1.35.42 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (1.35.99) 2025-03-14T06:58:37.1472430Z Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.4) 2025-03-14T06:58:37.2114965Z Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3.9/site-packages (from pandas==2.1.3) (2022.7.1) 2025-03-14T06:58:37.2427937Z Collecting tzdata>=2022.1 2025-03-14T06:58:37.2460822Z Downloading tzdata-2025.1-py2.py3-none-any.whl (346 kB) 2025-03-14T06:58:38.1280095Z Collecting numpy<2,>=1.22.4 2025-03-14T06:58:38.1317422Z Downloading numpy-1.26.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB) 2025-03-14T06:58:38.3157134Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.42->boto3==1.35.42) (1.25.10) 2025-03-14T06:58:38.4844433Z Installing collected packages: python-dateutil, tzdata, numpy, pandas, boto3 2025-03-14T06:58:43.4344754Z Attempting uninstall: boto3 2025-03-14T06:58:43.4345245Z Found existing installation: boto3 1.35.33 2025-03-14T06:58:43.4471872Z Uninstalling boto3-1.35.33: 2025-03-14T06:58:43.4487256Z Successfully uninstalled boto3-1.35.33 2025-03-14T06:58:43.5084371Z Successfully installed boto3-1.35.42 numpy-1.26.4 pandas-2.1.3 python-dateutil-2.8.2 tzdata-2025.1 2025-03-14T06:58:44.0600908Z Command completed after 1 attempt(s). 
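The dependency install above runs under nick-fields/retry with max_attempts: 5, retry_wait_seconds: 30, and a 5-minute timeout per attempt (see the step inputs earlier in the log). A rough bash equivalent of that retry behavior, for illustration only (the action's real implementation differs), would be:

    # Sketch of the retried install: up to 5 attempts, 30s between attempts,
    # each attempt capped at 5 minutes (mirrors the retry action's inputs above).
    for attempt in 1 2 3 4 5; do
      if timeout 300 python3 -m pip install \
          python-dateutil==2.8.2 boto3==1.35.42 pandas==2.1.3; then
        echo "Command completed after ${attempt} attempt(s)."
        exit 0
      fi
      echo "Attempt ${attempt} failed, retrying in 30s..."
      sleep 30
    done
    exit 1

In this run the install succeeded on the first attempt, as the "Command completed after 1 attempt(s)." line shows.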
2025-03-14T06:58:44.0675932Z ##[group]Run python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-03-14T06:58:44.0676616Z python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-03-14T06:58:44.0677123Z  --workflow-run-id "13849515380" \ 2025-03-14T06:58:44.0677485Z  --workflow-name "inductor" \ 2025-03-14T06:58:44.0677846Z  --workflow-run-attempt "1" \ 2025-03-14T06:58:44.0678184Z  --job-id "38756916179" \ 2025-03-14T06:58:44.0678761Z  --job-name "cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)" 2025-03-14T06:58:44.0687768Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:44.0688151Z env: 2025-03-14T06:58:44.0688396Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:44.0688751Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:44.0689317Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:44.0689837Z ##[endgroup] 2025-03-14T06:58:45.7634565Z repo: pytorch/pytorch 2025-03-14T06:58:45.7635081Z Downloading logs-test-inductor_huggingface-1-1-linux.g5.4xlarge.nvidia.gpu_38756916179.zip 2025-03-14T06:58:45.7635650Z Converted Log Model: UtilizationMetadata: 2025-03-14T06:58:45.7637001Z UtilizationMetadata(level='metadata', workflow_id='13849515380', job_id='38756916179', workflow_name='inductor', job_name='cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)', usage_collect_interval=1.0, data_model_version=1.0, start_at=1741930879, gpu_count=1, cpu_count=16, gpu_type='pynvml', error=None) 2025-03-14T06:58:45.7638387Z [Db Segments] detected pytest cmd: 11, generated segments: 11 2025-03-14T06:58:45.7638788Z [db model] Peek db timeseries 2025-03-14T06:58:45.7639067Z :{ 2025-03-14T06:58:45.7639290Z "created_at": 1741935525, 2025-03-14T06:58:45.7639577Z "type": "utilization", 2025-03-14T06:58:45.7639858Z "tags": [ 2025-03-14T06:58:45.7640085Z "record" 2025-03-14T06:58:45.7640314Z ], 2025-03-14T06:58:45.7640539Z "time_stamp": 1741930879, 2025-03-14T06:58:45.7640830Z "repo": "pytorch/pytorch", 2025-03-14T06:58:45.7641112Z "workflow_id": 13849515380, 2025-03-14T06:58:45.7641398Z "run_attempt": 1, 2025-03-14T06:58:45.7641656Z "job_id": 38756916179, 2025-03-14T06:58:45.7641938Z "workflow_name": "inductor", 2025-03-14T06:58:45.7642460Z "job_name": "cuda12.6-py3.10-gcc9-sm86 / test (inductor_huggingface, 1, 1, linux.g5.4xlarge.nvidia.gpu)", 2025-03-14T06:58:45.7643230Z "json_data": "{}" 2025-03-14T06:58:45.7643498Z } 2025-03-14T06:58:45.7644022Z Writing 1 documents to S3 ossci-utilization/util_metadata/v_1.0/pytorch/pytorch/13849515380/1/38756916179/metadata 2025-03-14T06:58:45.7644881Z Done! Finish writing document to S3 ossci-utilization/util_metadata/v_1.0/pytorch/pytorch/13849515380/1/38756916179/metadata 2025-03-14T06:58:45.7645762Z Writing 919 documents to S3 ossci-utilization/util_timeseries/v_1.0/pytorch/pytorch/13849515380/1/38756916179/time_series 2025-03-14T06:58:45.7646668Z Done! 
Finish writing document to S3 ossci-utilization/util_timeseries/v_1.0/pytorch/pytorch/13849515380/1/38756916179/time_series 2025-03-14T06:58:45.8819769Z ##[group]Run pytorch/test-infra/.github/actions/teardown-linux@main 2025-03-14T06:58:45.8820209Z with: 2025-03-14T06:58:45.8820431Z env: 2025-03-14T06:58:45.8820661Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:45.8821006Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:45.8821564Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:45.8822058Z ##[endgroup] 2025-03-14T06:58:45.8848357Z ##[group]Run set -eou pipefail 2025-03-14T06:58:45.8848692Z set -eou pipefail 2025-03-14T06:58:45.8848972Z  2025-03-14T06:58:45.8849345Z echo "Holding runner for 2 hours until all ssh sessions have logged out" 2025-03-14T06:58:45.8849978Z for _ in $(seq 1440); do 2025-03-14T06:58:45.8850323Z  # Break if no ssh session exists anymore 2025-03-14T06:58:45.8850679Z  if [ "$(who)" = "" ]; then 2025-03-14T06:58:45.8850990Z  break 2025-03-14T06:58:45.8851260Z  fi 2025-03-14T06:58:45.8851509Z  echo "." 2025-03-14T06:58:45.8851763Z  sleep 5 2025-03-14T06:58:45.8852011Z done 2025-03-14T06:58:45.8860701Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:45.8861073Z env: 2025-03-14T06:58:45.8861304Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:45.8861648Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:45.8862196Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:45.8862693Z ##[endgroup] 2025-03-14T06:58:45.8889654Z Holding runner for 2 hours until all ssh sessions have logged out 2025-03-14T06:58:45.8979855Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-03-14T06:58:45.8980594Z # ignore expansion of "docker ps -q" since it could be empty 2025-03-14T06:58:45.8981025Z # shellcheck disable=SC2046 2025-03-14T06:58:45.8981390Z docker stop $(docker ps -q) || true 2025-03-14T06:58:45.8981748Z # Prune all of the docker images 2025-03-14T06:58:45.8982089Z docker system prune -af 2025-03-14T06:58:45.8990820Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:58:45.8991236Z env: 2025-03-14T06:58:45.8991473Z GIT_DEFAULT_BRANCH: main 2025-03-14T06:58:45.8991822Z GPU_FLAG: --gpus all -e NVIDIA_DRIVER_CAPABILITIES=all 2025-03-14T06:58:45.8992378Z DOCKER_CONTAINER_ID: fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:45.8992873Z ##[endgroup] 2025-03-14T06:58:46.8919281Z fb3818aafd9c 2025-03-14T06:58:47.7891288Z Deleted Containers: 2025-03-14T06:58:47.7891736Z fb3818aafd9cfd4bba5adec0a9e0fa1b2d27360283d90c61e01d6936025fa610 2025-03-14T06:58:47.7892105Z 2025-03-14T06:58:59.8507187Z Deleted Images: 2025-03-14T06:58:59.8508157Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks:aa89d6e739080d90fa18625d57297c6734465849 2025-03-14T06:58:59.8509688Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/pytorch-linux-focal-cuda12.6-cudnn9-py3-gcc9-inductor-benchmarks@sha256:20eb41577713f879cca4c6a57bd64d737c482cad39fe2b18409f513444e2b522 2025-03-14T06:58:59.8510817Z deleted: sha256:9f77b6c3483857c0bff989bce733b5bd5d6fc70a10591ed0f8d1de80d0e77bfd 2025-03-14T06:58:59.8511444Z deleted: sha256:006617515f4e958af4f9dc6065c779e410071c02b44689d893561937feda4f3d 2025-03-14T06:58:59.8512086Z deleted: 
sha256:d8dfc9fe28ede8a98ec1fc902c02e20e76ec8f22f58784b12c12343fa12aab46 2025-03-14T06:58:59.8512731Z deleted: sha256:2ce32bbc6df81c5a9f0eedd7f48abb4d8336b02cd0fe1c318c1e6ffbf5595e1d 2025-03-14T06:58:59.8513379Z deleted: sha256:c120dcdc0d53f8756455aeed564dcd3f8ecb5141f494a1d6a0e605e5a9c3d254 2025-03-14T06:58:59.8514022Z deleted: sha256:e5b77ce8ae749a04e3cb4574d097ead9420bada2f025a0a4f9fcf7de22ee6a3b 2025-03-14T06:58:59.8514707Z deleted: sha256:65922f72524fa8bd30f5814c0483c298cc1e363eeb7b032d764285811af93ce5 2025-03-14T06:58:59.8515704Z deleted: sha256:33352e487d1c5d4a1ea6f0e5ec8d4aa690c8d775510ea3e62f860dc97ac64997 2025-03-14T06:58:59.8516337Z deleted: sha256:2878e03ee5cad99652cce4dd49bf5eed60082e76b77e6b947066e957d7b4d828 2025-03-14T06:58:59.8516964Z deleted: sha256:e4447a79ed2e204ebac0f3b48e17d6ec52941cf7a0e5d0b3e9051bea08065f71 2025-03-14T06:58:59.8517598Z deleted: sha256:8aa553275dbba5e374d93de0cef6fc5150eca77aab3ff27289706fa6db36af67 2025-03-14T06:58:59.8518232Z deleted: sha256:dc0cf83dbb0248e366220eed83a75f88c574ebe4ca58830c0fef21e4bec501a4 2025-03-14T06:58:59.8518917Z deleted: sha256:daac181dd53a4d2f2c53329e091cb6a2911be2dc9b38cf8bd09f3bce1c56329c 2025-03-14T06:58:59.8519542Z deleted: sha256:5f7a34c79738cb5ff3137836e023378b646d5b28faea7f76b6ed7ba06bf54130 2025-03-14T06:58:59.8520326Z deleted: sha256:4b0c75c8695ff1444c28e7f0d9f37138c72bb20ce91d65155c07902e7e05a0c5 2025-03-14T06:58:59.8520942Z deleted: sha256:0f379e6e379d2461e4d77a69b7f475ee67a98b758bfa2a6e05f11f20e39966b3 2025-03-14T06:58:59.8521563Z deleted: sha256:3584990d592e96be75dd409691a557456ccf4876a869c0e3dd941bed6fed0b42 2025-03-14T06:58:59.8522192Z deleted: sha256:caf7ae943b4b818733b68d0dfeec2a2fcdcb4b58aae0ff0eb07ad8de3aecc198 2025-03-14T06:58:59.8522824Z deleted: sha256:95c2e1a5d17c51b798f929a84a005e82a2665193d91211021be3a9336b0693bf 2025-03-14T06:58:59.8523446Z deleted: sha256:84fd9d078db4e9840fbe6ae6e4fd539b4a7238f0dae2d0474c17148d3a27d059 2025-03-14T06:58:59.8524084Z deleted: sha256:0fb240cfd150fea9ed47646b4e3c3a7e4e1ea5307f56aaf07fa985d412318768 2025-03-14T06:58:59.8524718Z deleted: sha256:f18d22f63350f207ade7b97d488c59e8fd7cd7408ed04ed6f5b92ada881c61ef 2025-03-14T06:58:59.8525348Z deleted: sha256:d42f54324f663c1982969ed3fe001f4b304fecfa3cfc10132ad9d4dc84a2bacc 2025-03-14T06:58:59.8525965Z deleted: sha256:0b37319755f8628856da1780e49c5e71a173a7a1a4a3b7ac5f8ceb1071071c74 2025-03-14T06:58:59.8526576Z deleted: sha256:76592bb55488254587030a382269d8e529361c5dde7153d975ec0396488ca44d 2025-03-14T06:58:59.8527207Z deleted: sha256:2bffea548a1863fddca3ebaedf1ca51aaefd33da5128a2b1d81da0bbb12d4ad8 2025-03-14T06:58:59.8527852Z deleted: sha256:eed215398a5b664a7a01e0d0f26462615ad7a9cefe2f74257fb14575d08c244c 2025-03-14T06:58:59.8528478Z deleted: sha256:f5282aa578fffba31e0baa84c639a8e1e94a77ed3e552206d6ec3446cbfa89b1 2025-03-14T06:58:59.8529104Z deleted: sha256:f317c307d996656151b7ac4957ac6b2c986f06732e6afce47e43d0b31f917a87 2025-03-14T06:58:59.8529716Z deleted: sha256:fd255efe4773a357812751cffd74b8461b2f92c9a7b5127e7521925001c988d3 2025-03-14T06:58:59.8530323Z deleted: sha256:30e20732ffdb3275b7c129ad98542773192aca3faab2122c6332ab64261bcbd1 2025-03-14T06:58:59.8530945Z deleted: sha256:4eba7dc868086afd8f7068ffc4b1430987109e93f31499cbd0b9186d230f4b64 2025-03-14T06:58:59.8531570Z deleted: sha256:32b75793e91d34d3df888760cfaf84d89bad5549dfedc542bd2f37f4f19c4163 2025-03-14T06:58:59.8532200Z deleted: sha256:264e78fc63ac5701d0309f437fbd1371eef92d4cb1ad885bb7e8711f91876023 2025-03-14T06:58:59.8532827Z deleted: 
sha256:588d5279ec0827a0c14c7996c8a0de1308053c827072bc5ff633db2f3f8b4ca7 2025-03-14T06:58:59.8533436Z deleted: sha256:f8a41ed324145e99d2949a9ad7e770977a1253f00cc0500f20331914040a1267 2025-03-14T06:58:59.8534061Z deleted: sha256:68428ef9c1e2998605df716b9c01dc38c2bcdbba94c8b31c6430ab212efeac31 2025-03-14T06:58:59.8534686Z deleted: sha256:a1917ce13f328407ea441062f71b5b77ac5f2b3ae9c952851c42f9a82135793b 2025-03-14T06:58:59.8535307Z deleted: sha256:b3fbda5315c62609003c185eff2f901f968fb890b3c1f089bc8b524925368f24 2025-03-14T06:58:59.8535914Z deleted: sha256:5e9f33a8a4d4909d1334953eb99e6043242524e35d555e6622a2551211c26f5e 2025-03-14T06:58:59.8536525Z deleted: sha256:51838fe2fd479c6daa3a7cb4c47d5a084ad59043b17540a456ff88c0723d2250 2025-03-14T06:58:59.8537150Z deleted: sha256:f0272fb272ff749e1d3da1729dc7b5ff83c43465e5c95fb80e773f524e61580a 2025-03-14T06:58:59.8537775Z deleted: sha256:db15f066101d28e941686c1ac5e5a81024495efcd81d880451f9001076918e99 2025-03-14T06:58:59.8538448Z deleted: sha256:5f29f057376f675e8139a0b6af0d4795a0fc5a1930d205c2c24135e7929ca460 2025-03-14T06:58:59.8539195Z deleted: sha256:375b9f836aa6a734930fcd4ce90df2a9b40f094f7ebc68449755c2313caa1081 2025-03-14T06:58:59.8539817Z deleted: sha256:c6f22f833901f861e3c9b30d1620529d27ecaf641c94f336e5d17e50696cccc1 2025-03-14T06:58:59.8540437Z deleted: sha256:d0d45ba336b5db9d314ad4a1a948f4e1711f9e2f527865e29de0c162a66b16d6 2025-03-14T06:58:59.8541062Z deleted: sha256:9b7b0db453ce8cc71cef0616cb4e09b80b053735e533649fb39c3f21f26c7d81 2025-03-14T06:58:59.8541674Z deleted: sha256:441e98624d1376b791d8af11aeee95372e5516769496260a25ad268f7c24c606 2025-03-14T06:58:59.8542285Z deleted: sha256:1b3c121c6718eb9cd2b6e96fe80e390205d000355db4014a598bb86d9d97f955 2025-03-14T06:58:59.8542905Z deleted: sha256:f3d3a1d041a9da9f2d71a3565b2cee16d011f9c676d1db4b51599068bb178a86 2025-03-14T06:58:59.8543619Z deleted: sha256:05cbe0ec3896fa1c6cf4f24845f78a75acde5ae92a1b75928cdeb620af0ab92c 2025-03-14T06:58:59.8544242Z deleted: sha256:9233c2f0227af3f5244baddf5ff281313ef3ab22166d295168266b4024143358 2025-03-14T06:58:59.8544865Z deleted: sha256:bffee652694320768fb4c36c5fbf8d8235d9bcc56f6943f8ce068955e2fdff42 2025-03-14T06:58:59.8545489Z deleted: sha256:0c3bd1563fe41a6680da0f88490a6e3f95c13f717fcf3be921cb91bdc0c70c6d 2025-03-14T06:58:59.8546114Z deleted: sha256:a8e68dfacfd1c48c0f0de41a6f1027138d3594de96f11173259edb1c550d16e1 2025-03-14T06:58:59.8546744Z deleted: sha256:686ea9f39e5c7149e5e3a54c1fe8fd8944a89d1ac77d8ceb89d6baa82d38353e 2025-03-14T06:58:59.8547353Z deleted: sha256:cc54309555c2967d247e976d0615961da3574ba7a10807962394fa9715aa5ba9 2025-03-14T06:58:59.8547952Z deleted: sha256:6943262429b139d69d97e0b5a1c815b30a5e729c419be0b625a4dd752f3ea6d0 2025-03-14T06:58:59.8548618Z deleted: sha256:23f6ecaead1d4d1346b4ca530a520f5fb05c393fc9607ceb5d6d74f00fd95d9d 2025-03-14T06:58:59.8549246Z deleted: sha256:3463127b6781701fd57f819279de4b6a70d206e909ec0c3194e4189a7430a1af 2025-03-14T06:58:59.8549858Z deleted: sha256:c01ac6f42ea62550c7e4049f442abaf30774144a8883656d88a11c75a21f64f9 2025-03-14T06:58:59.8550484Z deleted: sha256:514b4f6b58aa7f2fb8ae46592e1bf3874b0fa104ae15f23e593e1a72fe5b6b58 2025-03-14T06:58:59.8551100Z deleted: sha256:c3843a02971b2998897727ff0a2572a5832df6caa5325182b2ec72d734c1e6ed 2025-03-14T06:58:59.8551703Z deleted: sha256:06563b6c0292b6f05bf75a09a8641f589a0d4b77f70f794425317915d99341a9 2025-03-14T06:58:59.8552316Z deleted: sha256:6c53bb6e0232a44846d29332a3243a22a11dddd51125d8f9734b1916bd20f18c 2025-03-14T06:58:59.8552928Z deleted: 
sha256:59b9e37805a254e0fd1cb7f05851c0a5dca09e78196334d3b70a50a1745f14ed 2025-03-14T06:58:59.8553550Z deleted: sha256:ebc7d3133ec482ba0278ab995ba56a672c8a3450ec83a176b1a780fd0f85e787 2025-03-14T06:58:59.8554237Z deleted: sha256:c5202300187efa50fa85a1f4d829e8a723f1243a4a395ca444f807d731d5b19f 2025-03-14T06:58:59.8554855Z deleted: sha256:9951141dbd2b4ebb9a96557e8e809ded0c1e449f499c8fd2fb4e05233fe6282f 2025-03-14T06:58:59.8555470Z deleted: sha256:fa382588f8917faec5545e1031108a2f5e3391cd75859197c3a632e47dcd00e9 2025-03-14T06:58:59.8556085Z deleted: sha256:2150f1c968f3842e8df8ae2e1ce3c0ab83e7062b906d8f6a4cd8415f6f822a00 2025-03-14T06:58:59.8556710Z deleted: sha256:46bc902662130b8d13f25b0294fac16ca6248034d5ccfaf3ee51cb56572946c5 2025-03-14T06:58:59.8557324Z deleted: sha256:43659ee56fac7f221f9388ec263ad2042847103213025f0a7a0ec7ca6a81b20f 2025-03-14T06:58:59.8557935Z deleted: sha256:5839c16cf80c38c4f69268b2c73c3d7ec37a33e18c9c5f49e6624239a182dbf8 2025-03-14T06:58:59.8558548Z deleted: sha256:e16d8e25c4393c96891af6d6a42889638fc30a7490ef9c5c5517798ea9860bda 2025-03-14T06:58:59.8559160Z deleted: sha256:868539926d1d83fe2b4fdb0231d8b57fb5b8b34144d733a4cf862d241e16c667 2025-03-14T06:58:59.8559777Z deleted: sha256:cc4224d01139aef3aeb29c568d184283602d52f47bb8e5e29de97bf6aa54d951 2025-03-14T06:58:59.8560397Z deleted: sha256:bd18b4fe898c9279c6e9ad952165522e24de55ab48d83289ca0ff6c8df97d85d 2025-03-14T06:58:59.8561026Z deleted: sha256:c7917cebb757d994f33b0f4bbad70fbb563939c2630d8f96113d437fa09e68cd 2025-03-14T06:58:59.8561649Z deleted: sha256:34fb1ecb50c48f883ea450e0364c90bdf16ce5e35e8a132012851ac3b040addf 2025-03-14T06:58:59.8562370Z deleted: sha256:92300fc3f1a943c0b6a295d87da98cb3d593b18b8ddc4f0b229e3ca909d03206 2025-03-14T06:58:59.8563002Z deleted: sha256:fffe76c64ef2dee2d80a8bb3ad13d65d596d04a45510b1956a976a69215dae92 2025-03-14T06:58:59.8563382Z 2025-03-14T06:58:59.8563510Z Total reclaimed space: 49.5GB 2025-03-14T06:58:59.8650738Z Post job cleanup. 2025-03-14T06:58:59.8703015Z Post job cleanup. 
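The teardown above stops the test container and runs docker system prune -af, which non-interactively removes stopped containers, unused networks, and all unused images; here it reclaimed 49.5GB by deleting the CUDA benchmark image and its layers. To see how much space such a prune would free on a runner before running it, the standard disk-usage command can be used first (field layout varies by Docker version):

    # Per-type disk usage (images, containers, volumes, build cache)
    # including the amount Docker reports as reclaimable.
    docker system df
    # Non-interactive full prune, as in the teardown step above.
    docker system prune -af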
2025-03-14T06:58:59.9771804Z [command]/usr/bin/git version 2025-03-14T06:58:59.9830850Z git version 2.47.1 2025-03-14T06:58:59.9868752Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/7622bbf9-c3ca-4c68-b2a3-351524ee8f3a/.gitconfig' 2025-03-14T06:58:59.9884754Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/7622bbf9-c3ca-4c68-b2a3-351524ee8f3a' before making global git config changes 2025-03-14T06:58:59.9885881Z Adding repository directory to the temporary git global config as a safe directory 2025-03-14T06:58:59.9890481Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-03-14T06:58:59.9935448Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand 2025-03-14T06:58:59.9980537Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :" 2025-03-14T06:59:00.0352223Z Entering 'android/libs/fbjni' 2025-03-14T06:59:00.0422780Z Entering 'third_party/FP16' 2025-03-14T06:59:00.0494468Z Entering 'third_party/FXdiv' 2025-03-14T06:59:00.0562922Z Entering 'third_party/NNPACK' 2025-03-14T06:59:00.0634232Z Entering 'third_party/NVTX' 2025-03-14T06:59:00.0703765Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T06:59:00.0776015Z Entering 'third_party/XNNPACK' 2025-03-14T06:59:00.0861221Z Entering 'third_party/benchmark' 2025-03-14T06:59:00.0932648Z Entering 'third_party/composable_kernel' 2025-03-14T06:59:00.1008901Z Entering 'third_party/cpp-httplib' 2025-03-14T06:59:00.1077254Z Entering 'third_party/cpuinfo' 2025-03-14T06:59:00.1146652Z Entering 'third_party/cudnn_frontend' 2025-03-14T06:59:00.1218663Z Entering 'third_party/cutlass' 2025-03-14T06:59:00.1303466Z Entering 'third_party/eigen' 2025-03-14T06:59:00.1374262Z Entering 'third_party/fbgemm' 2025-03-14T06:59:00.1444463Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T06:59:00.1511776Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T06:59:00.1578223Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T06:59:00.1650766Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T06:59:00.1716950Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T06:59:00.1787056Z Entering 'third_party/flash-attention' 2025-03-14T06:59:00.1859677Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T06:59:00.1933014Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T06:59:00.2010974Z Entering 'third_party/flatbuffers' 2025-03-14T06:59:00.2084393Z Entering 'third_party/fmt' 2025-03-14T06:59:00.2152728Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T06:59:00.2222219Z Entering 'third_party/gloo' 2025-03-14T06:59:00.2292757Z Entering 'third_party/googletest' 2025-03-14T06:59:00.2363149Z Entering 'third_party/ideep' 2025-03-14T06:59:00.2430420Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T06:59:00.2506981Z Entering 'third_party/ittapi' 2025-03-14T06:59:00.2575577Z Entering 'third_party/kineto' 2025-03-14T06:59:00.2643146Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T06:59:00.2709780Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T06:59:00.2786728Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T06:59:00.2854516Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 
2025-03-14T06:59:00.2922848Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T06:59:00.2992043Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T06:59:00.3064398Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T06:59:00.3132755Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T06:59:00.3202851Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T06:59:00.3272495Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T06:59:00.3345402Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T06:59:00.3411626Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T06:59:00.3481443Z Entering 'third_party/kleidiai' 2025-03-14T06:59:00.3551894Z Entering 'third_party/mimalloc' 2025-03-14T06:59:00.3624995Z Entering 'third_party/nlohmann' 2025-03-14T06:59:00.3695792Z Entering 'third_party/onnx' 2025-03-14T06:59:00.3778671Z Entering 'third_party/onnx/third_party/pybind11' 2025-03-14T06:59:00.3852371Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T06:59:00.3924457Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T06:59:00.3992726Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T06:59:00.4060290Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T06:59:00.4127941Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T06:59:00.4198322Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T06:59:00.4265936Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T06:59:00.4332244Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T06:59:00.4398714Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T06:59:00.4469363Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T06:59:00.4540338Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T06:59:00.4628768Z Entering 'third_party/pocketfft' 2025-03-14T06:59:00.4697757Z Entering 'third_party/protobuf' 2025-03-14T06:59:00.4768541Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T06:59:00.4835243Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T06:59:00.4906199Z Entering 'third_party/psimd' 2025-03-14T06:59:00.4976245Z Entering 'third_party/pthreadpool' 2025-03-14T06:59:00.5044168Z Entering 'third_party/pybind11' 2025-03-14T06:59:00.5114688Z Entering 'third_party/python-peachpy' 2025-03-14T06:59:00.5182264Z Entering 'third_party/sleef' 2025-03-14T06:59:00.5250959Z Entering 'third_party/tensorpipe' 2025-03-14T06:59:00.5317744Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T06:59:00.5385312Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T06:59:00.5451741Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-03-14T06:59:00.5518916Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T06:59:00.5582629Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T06:59:00.5678622Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader 2025-03-14T06:59:00.5703784Z http.https://github.com/.extraheader 2025-03-14T06:59:00.5716721Z [command]/usr/bin/git config --local 
--unset-all http.https://github.com/.extraheader 2025-03-14T06:59:00.5752485Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :" 2025-03-14T06:59:00.6110495Z Entering 'android/libs/fbjni' 2025-03-14T06:59:00.6154539Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6200582Z Entering 'third_party/FP16' 2025-03-14T06:59:00.6245244Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6292280Z Entering 'third_party/FXdiv' 2025-03-14T06:59:00.6336716Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6380720Z Entering 'third_party/NNPACK' 2025-03-14T06:59:00.6424594Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6469674Z Entering 'third_party/NVTX' 2025-03-14T06:59:00.6517349Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6562251Z Entering 'third_party/VulkanMemoryAllocator' 2025-03-14T06:59:00.6607445Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6652262Z Entering 'third_party/XNNPACK' 2025-03-14T06:59:00.6697345Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6755637Z Entering 'third_party/benchmark' 2025-03-14T06:59:00.6800905Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6845387Z Entering 'third_party/composable_kernel' 2025-03-14T06:59:00.6890135Z http.https://github.com/.extraheader 2025-03-14T06:59:00.6941035Z Entering 'third_party/cpp-httplib' 2025-03-14T06:59:00.6986920Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7030972Z Entering 'third_party/cpuinfo' 2025-03-14T06:59:00.7080053Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7125660Z Entering 'third_party/cudnn_frontend' 2025-03-14T06:59:00.7169496Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7213459Z Entering 'third_party/cutlass' 2025-03-14T06:59:00.7258747Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7312494Z Entering 'third_party/eigen' 2025-03-14T06:59:00.7357714Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7405820Z Entering 'third_party/fbgemm' 2025-03-14T06:59:00.7451322Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7495768Z Entering 'third_party/fbgemm/third_party/asmjit' 2025-03-14T06:59:00.7543939Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7587683Z Entering 'third_party/fbgemm/third_party/cpuinfo' 2025-03-14T06:59:00.7631004Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7674767Z Entering 'third_party/fbgemm/third_party/cutlass' 2025-03-14T06:59:00.7717852Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7767627Z Entering 'third_party/fbgemm/third_party/googletest' 2025-03-14T06:59:00.7812284Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7855415Z Entering 'third_party/fbgemm/third_party/hipify_torch' 2025-03-14T06:59:00.7898462Z http.https://github.com/.extraheader 2025-03-14T06:59:00.7944207Z Entering 'third_party/flash-attention' 2025-03-14T06:59:00.7989180Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8032502Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-03-14T06:59:00.8075751Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8124942Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-03-14T06:59:00.8168883Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8223840Z Entering 'third_party/flatbuffers' 2025-03-14T06:59:00.8267587Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8315863Z 
Entering 'third_party/fmt' 2025-03-14T06:59:00.8362554Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8407272Z Entering 'third_party/gemmlowp/gemmlowp' 2025-03-14T06:59:00.8451236Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8496449Z Entering 'third_party/gloo' 2025-03-14T06:59:00.8541099Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8584286Z Entering 'third_party/googletest' 2025-03-14T06:59:00.8628486Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8672253Z Entering 'third_party/ideep' 2025-03-14T06:59:00.8717252Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8760582Z Entering 'third_party/ideep/mkl-dnn' 2025-03-14T06:59:00.8804547Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8857239Z Entering 'third_party/ittapi' 2025-03-14T06:59:00.8902058Z http.https://github.com/.extraheader 2025-03-14T06:59:00.8945712Z Entering 'third_party/kineto' 2025-03-14T06:59:00.8991209Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9033449Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-03-14T06:59:00.9077318Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9119925Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-03-14T06:59:00.9162777Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9209410Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-03-14T06:59:00.9251659Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9295967Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-03-14T06:59:00.9338826Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9385167Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-03-14T06:59:00.9429221Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9474440Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-03-14T06:59:00.9520050Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9568260Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-03-14T06:59:00.9612037Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9656513Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-03-14T06:59:00.9706250Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9751168Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-03-14T06:59:00.9799897Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9846241Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-03-14T06:59:00.9890399Z http.https://github.com/.extraheader 2025-03-14T06:59:00.9936385Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-03-14T06:59:00.9979863Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0023996Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-03-14T06:59:01.0066728Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0113631Z Entering 'third_party/kleidiai' 2025-03-14T06:59:01.0159981Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0208702Z Entering 'third_party/mimalloc' 2025-03-14T06:59:01.0252585Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0296236Z Entering 'third_party/nlohmann' 2025-03-14T06:59:01.0339610Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0387201Z Entering 'third_party/onnx' 2025-03-14T06:59:01.0436430Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0494573Z Entering 
'third_party/onnx/third_party/pybind11' 2025-03-14T06:59:01.0538331Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0587638Z Entering 'third_party/opentelemetry-cpp' 2025-03-14T06:59:01.0636554Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0682141Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-03-14T06:59:01.0725519Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0768900Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-03-14T06:59:01.0812362Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0855515Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-03-14T06:59:01.0898361Z http.https://github.com/.extraheader 2025-03-14T06:59:01.0941147Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-03-14T06:59:01.0983404Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1027947Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-03-14T06:59:01.1071106Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1114601Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-03-14T06:59:01.1157635Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1201298Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-03-14T06:59:01.1243390Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1286142Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-03-14T06:59:01.1334393Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1381190Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-03-14T06:59:01.1424892Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1472216Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-03-14T06:59:01.1514481Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1580861Z Entering 'third_party/pocketfft' 2025-03-14T06:59:01.1625075Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1668135Z Entering 'third_party/protobuf' 2025-03-14T06:59:01.1711915Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1757079Z Entering 'third_party/protobuf/third_party/benchmark' 2025-03-14T06:59:01.1801020Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1844295Z Entering 'third_party/protobuf/third_party/googletest' 2025-03-14T06:59:01.1888207Z http.https://github.com/.extraheader 2025-03-14T06:59:01.1935446Z Entering 'third_party/psimd' 2025-03-14T06:59:01.1980825Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2024283Z Entering 'third_party/pthreadpool' 2025-03-14T06:59:01.2069322Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2111306Z Entering 'third_party/pybind11' 2025-03-14T06:59:01.2157179Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2202407Z Entering 'third_party/python-peachpy' 2025-03-14T06:59:01.2246336Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2290231Z Entering 'third_party/sleef' 2025-03-14T06:59:01.2334250Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2378806Z Entering 'third_party/tensorpipe' 2025-03-14T06:59:01.2422771Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2466334Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-03-14T06:59:01.2510558Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2553864Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-03-14T06:59:01.2596938Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2640339Z Entering 
'third_party/tensorpipe/third_party/libuv' 2025-03-14T06:59:01.2683882Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2727212Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-03-14T06:59:01.2774828Z http.https://github.com/.extraheader 2025-03-14T06:59:01.2815716Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-03-14T06:59:01.2859017Z http.https://github.com/.extraheader 2025-03-14T06:59:01.3026682Z A job completed hook has been configured by the self-hosted runner administrator 2025-03-14T06:59:01.3056050Z ##[group]Run '/home/ec2-user/runner-scripts/after_job.sh' 2025-03-14T06:59:01.3063999Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-03-14T06:59:01.3064395Z ##[endgroup] 2025-03-14T06:59:07.8406514Z Cleaning up orphan processes
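The post-job cleanup above is the checkout action scrubbing credentials from the workspace: it removes the injected Authorization header (http.https://github.com/.extraheader) and any core.sshCommand override from the top-level repository and, via git submodule foreach --recursive, from every submodule, so the token does not persist on the self-hosted runner. A condensed sketch of the same scrub (the full commands appear verbatim in the log):

    # Drop the injected auth header from the top-level checkout...
    git config --local --unset-all 'http.https://github.com/.extraheader' || :
    # ...and from every submodule, recursively; '|| :' keeps the loop going
    # where the key is not set.
    git submodule foreach --recursive \
      "git config --local --unset-all 'http.https://github.com/.extraheader' || :"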