How to clear Jupyter memory without restarting the notebook

Question:

I am using a 3D Convolutional Neural Network for my thesis, and I am trying to train the network on 256×256 images with 22 channels and 5 pictures, using an 8×8 sliding window with 90-degree-rotation data augmentation. So the input size is (262144, 22, 8, 8, 5).

The inputs to the network are tiles of a larger 10240×10240 image, so I need to train the model in multiple passes in order to cover my whole dataset.

I am working with 60GB of RAM, and my plan would be:

  1. Load the input tensor of one tile.

  2. Train the model

  3. Save the model

  4. Clear jupyter memory without shutting down the notebook

  5. Load the model

  6. Load the input tensor of the next tile

  7. Continue training the model

  8. Save the model

  9. Clear memory & repeat
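The loop above can be sketched as follows. This is a minimal illustration, not working training code: the `train_on_tile` function is a stand-in for a real framework-specific training step, and the tiles are tiny dummy arrays.

```python
import gc
import numpy as np

def train_on_tile(model_state, tile):
    # Placeholder for a real training step (framework-specific).
    # Here it just accumulates the tile sum so the loop is runnable.
    return model_state + float(tile.sum())

model_state = 0.0  # stands in for a model that is saved and reloaded
for i in range(3):
    # Steps 1/6: load the input tensor of one tile (dummy data here).
    tile = np.ones((4, 4), dtype=np.float32)
    # Steps 2/7: continue training the model on this tile.
    model_state = train_on_tile(model_state, tile)
    # Steps 3/8: save the model to disk (e.g. model.save(...) in Keras).
    # Steps 4/9: drop the tile and collect garbage before the next one.
    del tile
    gc.collect()
```

The key point is that only one tile's tensor is alive at any moment; everything that must survive between tiles lives on disk.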

I cannot load several tiles in succession, or I get a MemoryError.

I know that using “del tensor_name” doesn’t actually free the allocated memory.

It also seems that %reset -f only clears variable names and doesn’t release all of the memory.
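To be precise, `del` only removes a name binding; the underlying object is freed once its reference count drops to zero, and `gc.collect()` only helps with reference cycles. A small sketch (NumPy assumed available):

```python
import gc
import sys
import numpy as np

a = np.zeros((1000, 1000))  # roughly 8 MB of float64 data
b = a                        # a second reference keeps the array alive
del a                        # removes one name; the array still exists via b
assert sys.getrefcount(b) >= 2

del b          # last reference gone: the array memory can now be reclaimed
freed = gc.collect()  # collects cycles; plain arrays are freed immediately
```

Note that in Jupyter specifically, cell outputs keep hidden references (`Out[n]`, `_`, `__`), so a tensor that was ever displayed in a cell may stay alive even after `del`.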

Asked By: Paku


Answers:

Jupyter is good for prototyping, but not good for months’ worth of work on the same file.

When I needed to move beyond prototyping, I wound up refactoring my code into OOP (Object-Oriented Programming) classes and used them from multiple .py scripts.

Lastly, to feed a huge dataset as input, I needed to write a custom Keras generator by inheriting from the Sequence class (keras.utils.Sequence): https://stanford.edu/~shervine/blog/keras-how-to-generate-data-on-the-fly
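The idea of the linked guide is that a Sequence-style generator loads only one batch into RAM at a time. A minimal sketch of the contract is below; it is written as a plain Python class with the same `__len__`/`__getitem__` interface so it runs without TensorFlow installed, but in real use you would subclass `keras.utils.Sequence`. The class name, the dummy data, and the per-sample shape are illustrative only.

```python
import numpy as np

class TileSequence:  # in real use: class TileSequence(keras.utils.Sequence)
    """Yields one batch per index, so only one batch is ever in RAM."""

    def __init__(self, n_samples, batch_size, sample_shape=(22, 8, 8, 5)):
        self.n_samples = n_samples
        self.batch_size = batch_size
        self.sample_shape = sample_shape

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(self.n_samples / self.batch_size))

    def __getitem__(self, idx):
        # Load only the samples belonging to batch `idx` (dummy zeros
        # here; in practice, read each sample from disk, e.g. np.load).
        start = idx * self.batch_size
        stop = min(start + self.batch_size, self.n_samples)
        x = np.zeros((stop - start, *self.sample_shape), dtype=np.float32)
        y = np.zeros((stop - start,), dtype=np.float32)
        return x, y

seq = TileSequence(n_samples=10, batch_size=4)
```

With a real `keras.utils.Sequence` subclass, you would pass the instance directly to `model.fit(...)`, and Keras would iterate over the batches for you.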

Answered By: Paku