I don't know what the worker threads you see are. Are you sure the code was written with multiple GPUs in mind? If so, you will likely see some reference to DDP somewhere pytorch.org/docs/stable/...
06.12.2024 16:15 · @mikkelisk.bsky.social
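(Not the code from this thread, but for context: a minimal sketch of what the DDP pattern from those docs usually looks like. The model, batch, and hyperparameters are placeholders; it assumes a launch via `torchrun --nproc_per_node=<num_gpus> train.py`, which starts one process per GPU.)

```python
# Minimal DDP sketch (hypothetical model/data; launch with torchrun, one process per GPU)
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each spawned process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(128, 10).cuda(local_rank)   # placeholder model
    model = DDP(model, device_ids=[local_rank])          # wrap: gradients sync across GPUs

    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x = torch.randn(32, 128, device=f"cuda:{local_rank}")          # placeholder batch
    y = torch.randint(0, 10, (32,), device=f"cuda:{local_rank}")

    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

If none of the wrapping above (or anything similar, e.g. DeepSpeed or `torch.nn.DataParallel`) appears in the code, it was probably written for a single GPU.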
It's unclear what your issue is. Do the other 7 GPUs have some utilization or zero utilization?
Are you talking about 8 dataloader worker threads or the number you get from torch.get_num_threads()/torch.get_num_interop_threads()?
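(For reference, these are two unrelated settings; a short sketch to show the difference, using a throwaway dataset:)

```python
# The two "8 threads" candidates are different knobs.
import torch
from torch.utils.data import DataLoader, TensorDataset

if __name__ == "__main__":
    # CPU thread pools PyTorch uses for ops (intra-op / inter-op parallelism):
    print(torch.get_num_threads(), torch.get_num_interop_threads())

    # DataLoader worker *processes* that prefetch batches -- a separate setting:
    ds = TensorDataset(torch.randn(64, 8), torch.zeros(64))   # throwaway data
    dl = DataLoader(ds, batch_size=16, num_workers=8)          # spawns 8 worker processes
    for xb, yb in dl:
        pass
```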
No, but it would be very nice to keep up with you here, rather than over there, going forward :)
18.11.2024 12:38