Installing TensorFlow 1.13
After installing Python 3.6.x, install pip (download https://bootstrap.pypa.io/get-pip.py):
python get-pip.py
Install virtualenv:
pip3 install -U pip virtualenv
Go to the drive root:
cd\
Create the virtual environment DLenv (Deep Learning Environment):
virtualenv --system-site-packages -p python ./DLenv
Enter the Scripts directory and activate the environment:
cd DLenv\Scripts
activate
pip install --upgrade pip
pip install --upgrade tensorflow
Or pin a specific CPU wheel (note: this wheel is TensorFlow 1.12.0, not 1.13):
pip install --upgrade https://storage.googleapis.com/tensorflow/windows/cpu/tensorflow-1.12.0-cp36-cp36m-win_amd64.whl
Checking the TensorFlow version:
python -c "import tensorflow as tf; print(tf.VERSION)"
https://www.tensorflow.org/lite/convert/python_api
Exporting a GraphDef from a file:
import tensorflow as tf

# Use a raw string (or forward slashes): in a plain string, "C:\tmp" turns \t into a tab
graph_def_file = r"C:\tmp\my_object_detection\inference_graph_ssdlite_mobilenet_v2_coco\frozen_inference_graph.pb"
input_arrays = ["input"]
output_arrays = ["MobilenetV1/Predictions/Softmax"]

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays)
tflite_model = converter.convert()
with open("converted_model.tflite", "wb") as f:
    f.write(tflite_model)
Post inspired by:
https://medium.com/tensorflow/training-and-serving-a-realtime-mobile-object-detector-in-30-minutes-with-cloud-tpus-b78971cf1193
On TensorFlow Lite for Android:
https://www.tensorflow.org/lite/guide/android
Generating a .tflite from a .pb
set CONFIG_FILE=C:\tmp\my_object_detection\inference_graph_ssdlite_mobilenet_v2_coco\pipeline.config
set CHECKPOINT_PATH=C:\tmp\my_object_detection\inference_graph_ssdlite_mobilenet_v2_coco\model.ckpt
set OUTPUT_DIR=C:\tmp\my_object_detection\inference_graph_ssdlite_mobilenet_v2_coco\tflite
python C:\tensorflow1\models\research\object_detection\export_tflite_ssd_graph.py --pipeline_config_path=%CONFIG_FILE% --trained_checkpoint_prefix=%CHECKPOINT_PATH% --output_directory=%OUTPUT_DIR% --add_postprocessing_op=true
Generated files:
tflite_graph.pb
tflite_graph.pbtxt
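The exported tflite_graph.pb still needs to go through the TFLite converter to produce the final .tflite file. A sketch of that step in Python, assuming the tensor names that export_tflite_ssd_graph.py normally emits (normalized_input_image_tensor and the four TFLite_Detection_PostProcess outputs) and a 300x300 input; check tflite_graph.pbtxt and your pipeline.config and adjust if they differ:

```python
import os
import tensorflow as tf

def convert_ssd_graph(pb_path, out_path):
    """Convert an export_tflite_ssd_graph.py output to a .tflite file."""
    converter = tf.lite.TFLiteConverter.from_frozen_graph(
        pb_path,
        # Assumed tensor names; verify against tflite_graph.pbtxt.
        input_arrays=["normalized_input_image_tensor"],
        output_arrays=["TFLite_Detection_PostProcess",
                       "TFLite_Detection_PostProcess:1",
                       "TFLite_Detection_PostProcess:2",
                       "TFLite_Detection_PostProcess:3"],
        input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]})
    # The postprocessing op added by --add_postprocessing_op is a custom op,
    # not part of the standard TFLite op set.
    converter.allow_custom_ops = True
    with open(out_path, "wb") as f:
        f.write(converter.convert())

pb = r"C:\tmp\my_object_detection\inference_graph_ssdlite_mobilenet_v2_coco\tflite\tflite_graph.pb"
if os.path.exists(pb):
    convert_ssd_graph(pb, "detect.tflite")
```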