Sunday, 28 February 2021

LSTM Model Conversion to TFLite Works Fine, but Error When Importing into Android Project

I have read here that TensorFlow Lite is now able to convert TensorFlow and Keras LSTM models.

Everything works fine up to the point where the model is converted and I have the model.tflite file. I have already tried to follow this and this to add the model to my Android project. However, everything produces the same error: java.lang.IllegalArgumentException: ByteBuffer is not a valid flatbuffer model

The code for building the model:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout, BatchNormalization

model = Sequential()
model.add(LSTM(128, input_shape=(3000,5), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())

model.add(LSTM(128, return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())

model.add(LSTM(128))
model.add(Dropout(0.1))
model.add(BatchNormalization())

model.add(Dense(64))
model.add(Dropout(0.2))

model.add(Dense(1, activation="sigmoid"))

opt = tf.keras.optimizers.Adam(learning_rate=0.001, decay=1e-6)
model.compile(loss="binary_crossentropy",
              optimizer=opt,
              metrics=["accuracy"])


history = model.fit(X,
                    y,
                    batch_size=64,
                    epochs=10,
                    callbacks=[callbacks],
                    verbose=1)
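The post does not show the conversion step itself, so the following is only a sketch of the usual Keras-to-TFLite flow (with a tiny stand-in model in place of the full network above; the `SELECT_TF_OPS` fallback is an assumption for LSTM graphs that the builtin op set cannot express):

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Tiny stand-in for the real model defined above.
model = Sequential([
    LSTM(4, input_shape=(10, 5)),
    Dense(1, activation="sigmoid"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow falling back to TensorFlow ops for pieces of the LSTM
# that the TFLite builtin op set cannot represent.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

If conversion succeeds, `tflite_model` is a FlatBuffer byte string and should carry the "TFL3" file identifier at byte offset 4.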

Full error log:

E/ModelResourceManager: Error preloading model resource
    com.google.firebase.ml.common.FirebaseMLException: Local model load failed with the model options: Local model path: drowsy-detector-v21.tflite. Remote model name: unspecified. 
        at com.google.firebase.ml.common.internal.modeldownload.zzj.zza(com.google.firebase:firebase-ml-common@@22.1.0:36)
        at com.google.android.gms.internal.firebase_ml.zzrj.zza(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:111)
        at com.google.android.gms.internal.firebase_ml.zzrj.zzol(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:107)
        at com.google.android.gms.internal.firebase_ml.zzqr.zzf(com.google.firebase:firebase-ml-common@@22.1.0:53)
        at com.google.android.gms.internal.firebase_ml.zzqr$zza.zzoo(com.google.firebase:firebase-ml-common@@22.1.0:7)
        at com.google.android.gms.internal.firebase_ml.zzqr$zza.call(com.google.firebase:firebase-ml-common@@22.1.0:24)
        at com.google.android.gms.internal.firebase_ml.zzpx.zza(com.google.firebase:firebase-ml-common@@22.1.0:32)
        at com.google.android.gms.internal.firebase_ml.zzpw.run(Unknown Source:4)
        at android.os.Handler.handleCallback(Handler.java:873)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at com.google.android.gms.internal.firebase_ml.zze.dispatchMessage(com.google.firebase:firebase-ml-common@@22.1.0:6)
        at android.os.Looper.loop(Looper.java:201)
        at android.os.HandlerThread.run(HandlerThread.java:65)
     Caused by: java.lang.IllegalArgumentException: ByteBuffer is not a valid flatbuffer model
        at org.tensorflow.lite.NativeInterpreterWrapper.createModelWithBuffer(Native Method)
        at org.tensorflow.lite.NativeInterpreterWrapper.<init>(NativeInterpreterWrapper.java:59)
        at org.tensorflow.lite.Interpreter.<init>(Interpreter.java:207)
        at com.google.android.gms.internal.firebase_ml.zzrj.zzb(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:174)
        at com.google.android.gms.internal.firebase_ml.zzrl.zzc(Unknown Source:0)
        at com.google.android.gms.internal.firebase_ml.zzrj.zza(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:170)
        at com.google.android.gms.internal.firebase_ml.zzrk.zza(Unknown Source:6)
        at com.google.firebase.ml.common.internal.modeldownload.zzj.zzb(com.google.firebase:firebase-ml-common@@22.1.0:61)
        at com.google.firebase.ml.common.internal.modeldownload.zzj.zza(com.google.firebase:firebase-ml-common@@22.1.0:21)
        at com.google.android.gms.internal.firebase_ml.zzrj.zza(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:111) 
        at com.google.android.gms.internal.firebase_ml.zzrj.zzol(com.google.firebase:firebase-ml-model-interpreter@@22.0.2:107) 
        at com.google.android.gms.internal.firebase_ml.zzqr.zzf(com.google.firebase:firebase-ml-common@@22.1.0:53) 
        at com.google.android.gms.internal.firebase_ml.zzqr$zza.zzoo(com.google.firebase:firebase-ml-common@@22.1.0:7) 
        at com.google.android.gms.internal.firebase_ml.zzqr$zza.call(com.google.firebase:firebase-ml-common@@22.1.0:24) 
        at com.google.android.gms.internal.firebase_ml.zzpx.zza(com.google.firebase:firebase-ml-common@@22.1.0:32) 
        at com.google.android.gms.internal.firebase_ml.zzpw.run(Unknown Source:4) 
        at android.os.Handler.handleCallback(Handler.java:873) 
        at android.os.Handler.dispatchMessage(Handler.java:99) 
        at com.google.android.gms.internal.firebase_ml.zze.dispatchMessage(com.google.firebase:firebase-ml-common@@22.1.0:6) 
        at android.os.Looper.loop(Looper.java:201) 
        at android.os.HandlerThread.run(HandlerThread.java:65)

Environment

  • OS Platform: Google Colab, Android 9 (Poco F1)
  • TensorFlow version: 2.2.0-rc3 (also tf-nightly installed via pip)
  • Firebase ML Model Interpreter version: firebase-ml-model-interpreter:22.0.2



Any ideas would be much appreciated, as this project is quite important right now. Thank you.



from LSTM Model Conversion to TFLite Work Fine, but Error When Importing to Android Project
