
Conversation

@jagadish-amd (Contributor) commented Dec 11, 2025

The `device == "cuda"` check fails if `device` is `"cuda:0"`.
Replace it with a proper check on the device type.

Signed-off-by: Jagadish Krishnamoorthy <[email protected]>
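
For context, a minimal sketch of the failure mode and the kind of device-type check the title refers to. This snippet is illustrative only, not the patched test code:

```python
import torch

device = "cuda:0"

# Buggy: exact string equality misses indexed device strings.
print(device == "cuda")                     # False, even though this is a CUDA device

# Robust: torch.device() accepts both strings and torch.device
# objects, and .type drops any device index, so this handles
# "cuda", "cuda:0", and torch.device("cuda:0") uniformly.
print(torch.device(device).type == "cuda")  # True
```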
pytorch-bot bot commented Dec 11, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/170254

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 Cancelled Job, 13 Unrelated Failures

As of commit 280abc9 with merge base d5c99e5:

CANCELLED JOB - The following job was cancelled. Please retry:

FLAKY - The following jobs failed but were likely due to flakiness present on trunk:

UNSTABLE - The following jobs are marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@jagadish-amd (Contributor, Author) commented:

ping @jithunnair-amd

@jeffdaily (Collaborator) left a comment:


This test module seems inconsistent. Some params are typed as `device: str` and others as `device: torch.device`. In the latter case, torch.device has the `type` attribute. torch.device is not iterable, so I expect `"cuda" in device` to fail.
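
A small sketch of the behaviors described above (illustrative, not the test module itself):

```python
import torch

# When `device` is a str, substring containment works:
print("cuda" in "cuda:0")                   # True

# When `device` is a torch.device, containment raises, since
# torch.device is not iterable:
try:
    print("cuda" in torch.device("cuda:0"))
except TypeError as err:
    print(err)  # argument of type 'torch.device' is not iterable

# torch.device exposes the device kind directly via its type attribute:
print(torch.device("cuda:0").type)          # cuda
```

Either normalizing with `torch.device(device).type` or annotating the params consistently would remove the ambiguity.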

@jithunnair-amd changed the title from `inductor/fp8 test: Check for "cuda" in device type.` to `[ROCm] inductor/fp8 test: Check for "cuda" in device type.` Dec 12, 2025
pytorch-bot bot added the ciflow/b200, ciflow/h100, ciflow/inductor, ciflow/rocm, ciflow/rocm-mi300, and module: rocm labels Dec 12, 2025
@jithunnair-amd added the ciflow/rocm-mi200 label Dec 12, 2025
@jeffdaily (Collaborator) commented:

@pytorchbot merge

pytorch-bot bot added the ciflow/trunk label Dec 15, 2025
@pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

@pytorchmergebot (Collaborator) commented:

Merge failed

Reason: 1 jobs have failed, first few of them are: Limited CI on H100 / linux-jammy-cuda12_8-py3_10-gcc11-sm90-FA3-ABI-stable-test / test

Details for Dev Infra team. Raised by workflow job.

@jeffdaily (Collaborator) commented:

@pytorchbot merge -f "should work"

@pytorchmergebot (Collaborator) commented:

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use -f as last resort and instead consider -i/--ignore-current to continue the merge ignoring current failures. This will allow currently pending tests to finish and report signal before the merge.

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here.

vishalgoyal316 pushed a commit to vishalgoyal316/pytorch that referenced this pull request Dec 17, 2025
[ROCm] inductor/fp8 test: Check for "cuda" in device type. (pytorch#170254)

The `device == "cuda"` check fails if `device` is `"cuda:0"`.
Replace it with a proper check on the device type.

Pull Request resolved: pytorch#170254
Approved by: https://github.com/jeffdaily

Labels

ciflow/b200
ciflow/h100
ciflow/inductor
ciflow/rocm (Trigger "default" config CI on ROCm)
ciflow/rocm-mi200 (Trigger "default" config CI on ROCm MI200)
ciflow/rocm-mi300 (Trigger "default" config CI on ROCm MI300)
ciflow/trunk (Trigger trunk jobs on your pull request)
Merged
module: inductor
module: rocm (AMD GPU support for Pytorch)
open source
topic: not user facing (topic category)
