Remove nodes from graph or reset entire default graph

Question:

When working with the default global graph, is it possible to remove nodes after they’ve been added, or alternatively to reset the default graph to empty? When working with TF interactively in IPython, I find myself having to restart the kernel repeatedly. I would like to be able to experiment with graphs more easily if possible.

Answers:

Update 11/2/2016

tf.reset_default_graph()
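For example, a minimal sketch (TF 1.x-style API assumed) showing that the call leaves the default graph empty:

import tensorflow as tf

a = tf.constant(1.0)  # adds one node to the default graph
print(len(tf.get_default_graph().get_operations()))  # 1

tf.reset_default_graph()  # swap in a fresh, empty default graph
print(len(tf.get_default_graph().get_operations()))  # 0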

Old stuff

There’s reset_default_graph, but it’s not part of the public API (I think it should be; does someone want to file an issue on GitHub?)

My work-around to reset things is this:

import tensorflow as tf
from tensorflow.python.framework import ops

ops.reset_default_graph()
sess = tf.InteractiveSession()
Answered By: Yaroslav Bulatov

By default, a session is constructed around the default graph.
To avoid leaving dead nodes behind in the session, you need to either clear the default graph or use an explicit graph.

  • To clear the default graph, you can use the tf.reset_default_graph function.

    tf.reset_default_graph()
    sess = tf.InteractiveSession()
    
  • You can also explicitly construct a graph and avoid using the default one. If you use a regular Session, you will need to fully build the graph before constructing the session. With InteractiveSession, you can just declare the graph and use it as a context to declare further changes:

    g = tf.Graph()
    sess = tf.InteractiveSession(graph=g)
    with g.as_default():
        # Put variable declarations and other TF operations
        # in the graph context
        ...
        b = tf.matmul(A, x)
        ...

    sess.run([b], ...)
    

EDIT: For recent versions of tensorflow (1.0+), the correct function is g.as_default.

Answered By: Thomas Moreau

IPython / Jupyter notebook cells keep state between runs of a cell.

Create a custom graph:

def main():
    # Define your model
    data = tf.placeholder(...)
    model = ...

with tf.Graph().as_default():
    main()

Once the with block exits, the graph is cleaned up.

Answered By: Serge

Not sure if I faced the very same problem, but

tf.keras.backend.clear_session()

at the beginning of the cell in which the model (Keras, in my case) was constructed and trained helped to “cut the clutter” so only the current graph remains in the TensorBoard visualization after repeated runs of the same cell.

Environment: TensorFlow 2.0 (tensorflow-gpu==2.0.0b1) in Colab with built-in TensorBoard (using the %load_ext tensorboard trick).
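A rough sketch of that pattern (the model below is a made-up minimal Keras model, just to illustrate where the call goes):

import tensorflow as tf

tf.keras.backend.clear_session()  # drop graph state left over from previous runs of this cell

# Hypothetical tiny model; replace with your own
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="sgd", loss="mse")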

Answered By: John Doe

TensorFlow 2.0 compatible answer: in TensorFlow versions >= 2.0, the command to reset the entire default graph, when running in graph mode, is tf.compat.v1.reset_default_graph.

NOTE: The default graph is a property of the current thread. This function applies only to the current thread. Calling this function while a tf.compat.v1.Session or tf.compat.v1.InteractiveSession is active will result in undefined behavior. Using any previously created tf.Operation or tf.Tensor objects after calling this function will result in undefined behavior.

Raises: AssertionError: If this function is called within a nested graph.
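As a minimal sketch (assuming eager execution has been disabled so the program actually runs in graph mode, and resetting before any session is created, per the note above):

import tensorflow as tf

tf.compat.v1.disable_eager_execution()      # run TF 2.x in graph mode

x = tf.compat.v1.placeholder(tf.float32)    # node added to the default graph
tf.compat.v1.reset_default_graph()          # default graph is now empty; x is now dangling

with tf.compat.v1.Session() as sess:
    pass  # build and run a fresh graph here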

Answered By: Tensorflow Support