I have successfully trained a DNNClassifier to classify texts (posts from an online discussion board). I've created and saved my model using this code:
import tensorflow as tf
import tensorflow_hub as hub

# Pre-trained German text embedding module from TF Hub, used as the input feature.
embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")

feature_columns = [embedded_text_feature_column]

estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))

# Export the trained estimator as a SavedModel.
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(export_dir_base="/my/dir/base",
                            serving_input_receiver_fn=serving_input_receiver_fn)
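(The training step itself is not shown above; assuming the posts live in a pandas DataFrame with a "sentence" column and a binary "label" column, it would have looked roughly like the sketch below. The DataFrame and column names are illustrative, not taken from the original project.)

import pandas as pd

# Hypothetical training data: raw texts plus binary labels.
train_df = pd.DataFrame({"sentence": ["ein Beispieltext", "noch ein Text"],
                         "label": [0, 1]})

# pandas_input_fn feeds the "sentence" column to the feature column defined above.
train_input_fn = tf.estimator.inputs.pandas_input_fn(
    train_df, train_df["label"], num_epochs=None, shuffle=True)

estimator.train(input_fn=train_input_fn, steps=1000)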
Now I want to convert my saved model to use it with the JavaScript version of TensorFlow, TensorFlow.js, using the tfjs-converter. When I issue the following command:
tensorflowjs_converter --input_format=tf_saved_model \
    --output_node_names='dnn/head/predictions/str_classes,dnn/head/predictions/probabilities' \
    --saved_model_tags=serve \
    /my/dir/base /my/export/dir
…I get this error message:

ValueError: Node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding/module_apply_default/embedding_lookup_sparse/embedding_lookup' expects to be colocated with unknown node 'dnn/input_from_feature_columns/input_layer/sentence_hub_module_embedding

I assume I'm doing something wrong when saving the model.
What is the correct way to save an estimator model so that it can be converted with tfjs-converter?
The source code of my project can be found on GitHub.
from Error Trying to Convert TensorFlow Saved Model to TensorFlow.js Model