TensorFlow on GPU: The Memory Hog
By default, TensorFlow preallocates GPU memory eagerly; the rationale is that grabbing one large block up front prevents memory fragmentation. The amount it allocates is around 80% of the available GPU memory. Even when running a rather small model with fewer than 45k parameters, the monitoring tool nvidia-smi shows this: