Defines the size of the mini-batch used during each iteration of inference. **Batch size** is the batch size used per GPU.
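As a sketch of how a per-GPU batch size partitions inference inputs, the helper below splits a dataset into mini-batches of the configured size (the function name and values here are illustrative, not from any specific framework):

```python
def make_batches(samples, batch_size):
    """Split a list of samples into mini-batches of at most batch_size items."""
    return [samples[i:i + batch_size] for i in range(0, len(samples), batch_size)]

# With a per-GPU batch size of 4, 10 samples yield batches of 4, 4, and 2.
batches = make_batches(list(range(10)), batch_size=4)
print([len(b) for b in batches])  # → [4, 4, 2]
```

Each GPU would then run one such mini-batch per inference iteration, so the effective total batch size scales with the number of devices.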