DistributedProxySampler

class ignite.distributed.auto.DistributedProxySampler(sampler, num_replicas=None, rank=None)

Distributed sampler proxy that adapts a user's sampler to a distributed data-parallel configuration.

Code is based on https://github.com/pytorch/pytorch/issues/23430#issuecomment-562350407

Note

The input sampler is assumed to have a constant size.

Parameters
  • sampler (Sampler) – Input torch data sampler.

  • num_replicas (Optional[int]) – Number of processes participating in distributed training; taken from the current distributed configuration when None.

  • rank (Optional[int]) – Rank of the current process within num_replicas; taken from the current distributed configuration when None.
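
A minimal usage sketch, assuming a toy dataset and a WeightedRandomSampler chosen purely for illustration; num_replicas and rank are passed explicitly so the snippet also runs without an initialized process group (in a real distributed run they can be left as None):

    import torch
    from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

    from ignite.distributed.auto import DistributedProxySampler

    # Toy dataset and a user-defined sampler (constant size, as the note above requires).
    dataset = TensorDataset(torch.randn(100, 3))
    weights = torch.ones(len(dataset))
    sampler = WeightedRandomSampler(weights, num_samples=len(dataset))

    # Wrap the sampler so this process only yields its own shard of the indices.
    dist_sampler = DistributedProxySampler(sampler, num_replicas=2, rank=0)

    loader = DataLoader(dataset, sampler=dist_sampler, batch_size=16)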

Methods

set_epoch – Sets the epoch for this sampler.

set_epoch(epoch)

Sets the epoch for this sampler. When shuffle=True, this ensures all replicas use a different random ordering for each epoch. Otherwise, the next iteration of this sampler will yield the same ordering.

Parameters

epoch (int) – Epoch number.

Return type

None
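
As a usage sketch reusing the names from the snippet above, set_epoch would typically be called once at the start of every epoch:

    # Reseed the proxy sampler each epoch so the drawn ordering changes
    # between epochs but stays consistent across all ranks.
    for epoch in range(3):
        dist_sampler.set_epoch(epoch)
        for (batch,) in loader:
            pass  # training step would go here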