DistributedProxySampler
class ignite.distributed.auto.DistributedProxySampler(sampler, num_replicas=None, rank=None)

Distributed sampler proxy that adapts a user's sampler to a distributed data-parallel configuration. The code is based on https://github.com/pytorch/pytorch/issues/23430#issuecomment-562350407

Note: the input sampler is assumed to have a constant size.

Parameters
- sampler – Input torch data sampler. 
- num_replicas – Number of processes participating in distributed training. 
- rank – Rank of the current process within num_replicas.
 
Methods

- set_epoch(epoch) – Sets the epoch for this sampler.
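To illustrate what a sampler proxy of this kind does, here is a minimal, self-contained sketch (hypothetical code, not the actual ignite implementation, and deliberately torch-free): it wraps any sampler-like iterable of indices, pads the index list so its length is divisible by `num_replicas`, and hands each rank an equal, non-overlapping strided slice.

```python
import math

class TinyProxySampler:
    """Minimal sketch of the distributed-proxy-sampler idea: wrap a base
    sampler and give each rank an equal, non-overlapping share of its
    indices. Class and attribute names here are illustrative only."""

    def __init__(self, sampler, num_replicas, rank):
        if not 0 <= rank < num_replicas:
            raise ValueError("rank must be in [0, num_replicas)")
        self.sampler = sampler
        self.num_replicas = num_replicas
        self.rank = rank
        # Per-rank sample count; pad so the total divides evenly.
        self.num_samples = math.ceil(len(sampler) / num_replicas)
        self.total_size = self.num_samples * num_replicas

    def __iter__(self):
        indices = list(self.sampler)
        # Pad by repeating indices from the start so every rank
        # receives exactly num_samples items.
        indices += indices[: self.total_size - len(indices)]
        # Each rank takes a strided slice: rank, rank + num_replicas, ...
        return iter(indices[self.rank : self.total_size : self.num_replicas])

    def __len__(self):
        return self.num_samples

# Usage: split 10 indices across 3 replicas.
base = list(range(10))
parts = [
    list(TinyProxySampler(base, num_replicas=3, rank=r)) for r in range(3)
]
# Every rank gets 4 indices, and together they cover all of 0..9
# (with two indices repeated due to padding).
```

The real class additionally reshuffles the wrapped sampler per epoch (via set_epoch), which this sketch omits for brevity.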