public static class LayerMemoryReport.Builder extends Object
| Constructor and Description |
|---|
| `Builder(String layerName, Class<?> layerType, InputType inputType, InputType outputType)` |
| Modifier and Type | Method and Description |
|---|---|
| `LayerMemoryReport` | `build()` |
| `LayerMemoryReport.Builder` | `cacheMemory(long cacheModeMemoryFixed, long cacheModeMemoryVariablePerEx)` Reports the cached/cacheable memory requirements. |
| `LayerMemoryReport.Builder` | `cacheMemory(Map<CacheMode,Long> cacheModeMemoryFixed, Map<CacheMode,Long> cacheModeMemoryVariablePerEx)` Reports the cached/cacheable memory requirements. |
| `LayerMemoryReport.Builder` | `standardMemory(long parameterSize, long updaterStateSize)` Reports the standard memory requirements (parameters and updater state). |
| `LayerMemoryReport.Builder` | `workingMemory(long fixedInference, long variableInferencePerEx, long fixedTrain, long variableTrainPerEx)` Reports the working memory size, for both inference and training. |
| `LayerMemoryReport.Builder` | `workingMemory(long fixedInference, long variableInferencePerEx, Map<CacheMode,Long> fixedTrain, Map<CacheMode,Long> variableTrainPerEx)` Reports the working memory requirements, for both inference and training. |
public LayerMemoryReport.Builder standardMemory(long parameterSize, long updaterStateSize)

Reports the standard memory requirements.

Parameters:
- `parameterSize` - Number of parameters
- `updaterStateSize` - Size for the updater array

public LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, long fixedTrain, long variableTrainPerEx)

Reports the working memory size, for both inference and training.

Parameters:
- `fixedInference` - Number of elements used for inference (independent of minibatch size)
- `variableInferencePerEx` - Number of elements used for inference, for each example
- `fixedTrain` - Number of elements used for training (independent of minibatch size)
- `variableTrainPerEx` - Number of elements used for training, for each example

public LayerMemoryReport.Builder workingMemory(long fixedInference, long variableInferencePerEx, Map<CacheMode,Long> fixedTrain, Map<CacheMode,Long> variableTrainPerEx)

Reports the working memory requirements, for both inference and training. Working memory is memory that will be allocated in an ND4J workspace, or that can be garbage collected at any point after the method returns.

Parameters:
- `fixedInference` - Number of elements of working memory used for inference (independent of minibatch size)
- `variableInferencePerEx` - Number of elements of working memory used for inference, for each example
- `fixedTrain` - Number of elements of working memory used for training (independent of minibatch size), for each cache mode
- `variableTrainPerEx` - Number of elements of working memory used for training, for each example, for each cache mode

See Also: `MemoryReport`

public LayerMemoryReport.Builder cacheMemory(long cacheModeMemoryFixed, long cacheModeMemoryVariablePerEx)

Reports the cached/cacheable memory requirements.

Parameters:
- `cacheModeMemoryFixed` - Number of elements of cache memory, independent of the minibatch size
- `cacheModeMemoryVariablePerEx` - Number of elements of cache memory, for each example

public LayerMemoryReport.Builder cacheMemory(Map<CacheMode,Long> cacheModeMemoryFixed, Map<CacheMode,Long> cacheModeMemoryVariablePerEx)

Reports the cached/cacheable memory requirements.

Parameters:
- `cacheModeMemoryFixed` - Number of elements of cache memory, independent of the minibatch size
- `cacheModeMemoryVariablePerEx` - Number of elements of cache memory, for each example

public LayerMemoryReport build()
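The parameters above consistently distinguish fixed memory (independent of minibatch size) from per-example memory. The sketch below illustrates how those two quantities combine for a given minibatch, both as scalars and per cache mode; the `totalElements` helper and the local `CacheMode` enum are illustrative stand-ins only, not part of the DL4J API:

```java
import java.util.EnumMap;
import java.util.Map;

public class MemoryAccounting {

    // Stand-in for DL4J's CacheMode enum, for illustration only.
    enum CacheMode { NONE, HOST, DEVICE }

    // Total elements for a given minibatch: fixed elements plus
    // per-example elements scaled by the number of examples.
    static long totalElements(long fixed, long variablePerEx, long minibatchSize) {
        return fixed + variablePerEx * minibatchSize;
    }

    // The same accounting applied per cache mode, mirroring the
    // Map<CacheMode,Long> overloads of workingMemory(...) and cacheMemory(...).
    static Map<CacheMode, Long> totalElementsPerCacheMode(Map<CacheMode, Long> fixed,
                                                          Map<CacheMode, Long> variablePerEx,
                                                          long minibatchSize) {
        Map<CacheMode, Long> out = new EnumMap<>(CacheMode.class);
        for (CacheMode cm : fixed.keySet()) {
            out.put(cm, totalElements(fixed.get(cm), variablePerEx.get(cm), minibatchSize));
        }
        return out;
    }

    public static void main(String[] args) {
        // e.g. workingMemory with fixedInference = 1_000 elements and
        // variableInferencePerEx = 250 elements, minibatch size 32:
        System.out.println(totalElements(1_000, 250, 32)); // 1000 + 250 * 32 = 9000

        Map<CacheMode, Long> fixed = new EnumMap<>(CacheMode.class);
        Map<CacheMode, Long> perEx = new EnumMap<>(CacheMode.class);
        fixed.put(CacheMode.NONE, 0L);       perEx.put(CacheMode.NONE, 500L);
        fixed.put(CacheMode.DEVICE, 2_000L); perEx.put(CacheMode.DEVICE, 100L);
        System.out.println(totalElementsPerCacheMode(fixed, perEx, 32));
    }
}
```

Note that all quantities here are element counts, not bytes; converting to bytes requires multiplying by the data-type width.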
Copyright © 2018. All rights reserved.