Saturday, 8 September 2018

How to build a reusable graph with input and output variables in TensorFlow?

I'm trying to build a TF graph in Python code, save it, and import the graph from another program, where it would be run on the actual data. To keep the code simple, I'll illustrate the problem with a graph that computes the iterations of the usual quadratic map z -> z^2 + c for the Mandelbrot set.
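For reference, here is the same escape-time iteration in plain Python (the function name mandelbrot_iters is mine, not part of the TF code below; note that z starts at c = x + iy itself, matching the initial loop variables in the graph):

```python
def mandelbrot_iters(x, y, maxiter=255):
    """Plain-Python version of the escape-time loop built in TF below."""
    z_re, z_im = x, y  # z0 = c, matching the while_loop's initial values
    n = 0
    # iterate z -> z^2 + c until |z|^2 >= 4 or maxiter is reached
    while n < maxiter and z_re * z_re + z_im * z_im < 4:
        z_re, z_im = z_re * z_re - z_im * z_im + x, 2 * z_re * z_im + y
        n += 1
    return n, z_re, z_im
```

At (0.25, -0.15) the point never escapes, so this caps out at n = 255, in line with the TF output shown further down.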

I got this Python code to work and produce the results I expect:

def mandelbrot(x, y):
    """
    Run the TF graph returned by mandelbrot_()
    """
    g, in_, out_ = mandelbrot_()
    x_in, y_in = in_
    n_out, x_out, y_out = out_
    with tf.Session(graph=g) as session:
        # (a)
        # run the graph at the chosen point
        feed = { x_in:x, y_in:y }
        n_out, x_out, y_out = session.run(out_, feed)
        print("({0},{1}): {2}".format(x, y, [n_out, x_out, y_out]))

def mandelbrot_(maxiter=255):
    """
    Return graph computing the Mandelbrot set at (x,y).
    """
    graph = tf.Graph()
    with graph.as_default():
        # input placeholders
        x = tf.placeholder(tf.float32, shape=[], name='x_in')
        y = tf.placeholder(tf.float32, shape=[], name='y_in')

        # output variables
        # (dtype must be passed by keyword: tf.Variable's second
        # positional argument is `trainable`, not the dtype)
        n_ = tf.Variable(0, dtype=tf.int32,   name='n')
        x_ = tf.Variable(x, dtype=tf.float32, name='x')
        y_ = tf.Variable(y, dtype=tf.float32, name='y')

        # main loop
        i_ = tf.constant(0)
        def cond(i_, z_re_, z_im_):
            return tf.logical_and(
                tf.less(i_, maxiter),
                (z_re_*z_re_ + z_im_*z_im_) < 4)
        def body(i_, z_re_, z_im_):
            return [
                i_+1,                          # iteration count
                z_re_*z_re_ - z_im_*z_im_ + x, # real part of z
                2*z_re_*z_im_ + y,             # imag part of z
            ]
        l = tf.while_loop(cond, body, [i_, x, y],
                          parallel_iterations=1)
        n_, x_, y_ = l  # (b)

    return (
        graph,       # graph
        (x, y),      # inputs
        (n_, x_, y_) # outputs
    )

if __name__ == '__main__':
    mandelbrot(0.25, -0.15)

Running the above code produces the output:

(0.25,-0.15): [255, 0.22613873, -0.2738613]

Now, if I try to save the graph, tf.train.Saver() complains that there are no variables to save and aborts. So I tried to capture the outputs of the graph built by mandelbrot_() in output variables, and use those instead. For brevity, here is only the code that differs from the previous sample, with the changes at the points marked # (a) and # (b):

def mandelbrot(x, y):
    """
    Compute number of iterations of the Mandelbrot function at (x,y).
    """
    g, in_, out_ = mandelbrot_()
    x_in, y_in = in_
    n_out, x_out, y_out = out_
    with tf.Session(graph=g) as session:
        # (a)  *** code added below this line ***
        # initialize vars with null values
        feed0 = { x_in:0.0, y_in:0.0 }
        session.run(n_out.initializer, feed0)
        session.run(x_out.initializer, feed0)
        session.run(y_out.initializer, feed0)
        # run the graph at the chosen point
        # ... (see previous code sample) ...

def mandelbrot_(maxiter=255):
    """
    Return graph computing the Mandelbrot set at (x,y).
    """
    graph = tf.Graph()
    with graph.as_default():
        # ... (see previous code sample) ...
        l = tf.while_loop(cond, body, [i_, x, y],
                          parallel_iterations=1)
        # (b)  *** code added below ***
        with tf.control_dependencies(l):
            n_.assign(l[0])
            x_.assign(l[1])
            y_.assign(l[2])
        # it works if I use this line instead:
        #n_, x_, y_ = l

    return (
        # ...
    )

With these edits, the output variables are always null:

(0.25,-0.15): [0, 0.0, 0.0]

Complete (non-working) code is in this GitHub Gist.

What am I doing wrong? How can I ensure that a variable holds the final result of a graph's computation?
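My current mental model (which may be exactly where I'm going wrong) is that assign only builds an op, and an op that nothing ever runs or depends on is effectively dead code. A toy deferred-execution sketch of that understanding, with made-up names (DeferredAssign is not a TF API):

```python
class DeferredAssign:
    """Toy stand-in for a TF assign op: building it changes nothing."""
    def __init__(self, store, key, value):
        self.store, self.key, self.value = store, key, value

    def run(self):
        # the write only happens when the op is explicitly run
        self.store[self.key] = self.value

state = {'n': 0}
op = DeferredAssign(state, 'n', 255)  # analogous to n_.assign(l[0])
assert state['n'] == 0                # merely creating the op did nothing
op.run()                              # analogous to session.run(op)
assert state['n'] == 255
```

If that model is right, my control_dependencies block creates the assign ops but nothing ever runs them, and I'm unsure how to wire them in correctly.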



from How to build a reusable graph with input and output variables in Tensorflow?
