Questions tagged [tfx]

TensorFlow Extended (TFX) is an end-to-end platform for deploying production ML pipelines

TFX is Google's production-scale machine learning platform based on TensorFlow. It provides a configuration framework and shared libraries to integrate the common components needed to define, launch, and monitor your machine learning system.

185 questions
9 votes · 2 answers

Data stored in MLMD in TensorFlow TFX

As far as I understand, TensorFlow uses MLMD to record and retrieve metadata associated with workflows. This may include: results of pipeline components, metadata about artifacts generated through the components of the pipelines, metadata about…
Josh
6 votes · 3 answers

How do I get a dataframe or database write from TFX BulkInferrer?

I'm very new to TFX, but have an apparently-working ML Pipeline which is to be used via BulkInferrer. That seems to produce output exclusively in Protobuf format, but since I'm running bulk inference I want to pipe the results to a database instead.…
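One common pattern for this: BulkInferrer writes PredictionLog protobufs, so the bridge to a database is to decode those records (e.g. with the tensorflow_serving.apis protos) and bulk-insert the decoded rows. A minimal sketch of the database half, assuming the decoding step has already produced plain Python dicts (the table name and fields here are hypothetical):

```python
import sqlite3

# Hypothetical decoded predictions; in practice these would come from
# parsing the PredictionLog protobufs that BulkInferrer emits.
predictions = [
    {"example_id": "a1", "score": 0.91},
    {"example_id": "b2", "score": 0.17},
]

conn = sqlite3.connect(":memory:")  # swap for your real database connection
conn.execute("CREATE TABLE predictions (example_id TEXT, score REAL)")
# executemany with named placeholders maps each dict to one row
conn.executemany(
    "INSERT INTO predictions VALUES (:example_id, :score)", predictions
)
conn.commit()
rows = conn.execute("SELECT example_id, score FROM predictions").fetchall()
```

The same insert loop works against any DB-API driver (Postgres, MySQL, …); only the connection line and placeholder style change.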
Sarah Messer
5 votes · 0 answers

Running TensorFlow Extended (TFX) on AWS

I was wondering whether it is possible, and how easy it would be, to implement a TFX pipeline in AWS (on a real 100+ GB dataset, not a tutorial with a small dataset). For orchestration, I might use Kubeflow. But I suppose the major issue…
5 votes · 1 answer

Is it possible to mix Kubeflow components with TensorFlow Extended components?

It looks like Kubeflow has deprecated all of its TFX components. I currently have some custom Kubeflow components that help launch some of my data pipelines, and I was hoping I could use some TFX components in the same Kubeflow pipeline. Is there a…
sleepyowl
5 votes · 1 answer

TFX IndexError on Evaluator component

I'm trying to make an Evaluator for my model. Until now every other component has been fine, but when I try this config: eval_config = tfma.EvalConfig( model_specs=[ tfma.ModelSpec(label_key='Category'), ], …
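For context, an EvalConfig of this shape normally also carries slicing_specs and metrics_specs alongside the model_specs shown in the excerpt. A hedged sketch of a complete config, where the metric choice is an assumption and only the label_key comes from the question:

```python
import tensorflow_model_analysis as tfma

# Sketch only: 'Category' is taken from the excerpt above; the slice and
# metric choices are illustrative assumptions, not a known-good config.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='Category')],
    slicing_specs=[tfma.SlicingSpec()],  # empty spec = evaluate overall
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(class_name='SparseCategoricalAccuracy'),
        ]),
    ],
)
```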
Alexandre Pieroux
5 votes · 1 answer

deploy TFX with existing frozen_interface_graph.pb and label_map.pbtxt

I have trained an object detection model with a Faster R-CNN network and have the frozen_interface_graph.pb and label_map.pbtxt after training. I wanted to deploy it as a REST API server so that it can be called from systems that do not have…
Sreekiran A R
5 votes · 1 answer

Installing TensorFlow Extended with Python 3 on Windows

I tried to pip install tfx==0.13.0 on a Windows 10 machine and I get this error. Has anyone been able to pip install tfx==0.13.0? Could not find a version that satisfies the requirement ml-metadata<0.14,>=0.13.2 (from tfx==0.13.0) (from versions:…
opalbert
4 votes · 1 answer

Specify signature name on Vertex AI Predict

I've deployed a TensorFlow model on the Vertex AI platform using TFX Pipelines. The model has custom serving signatures, but I'm struggling to specify the signature when predicting. I have the exact same model deployed in GCP AI Platform and I'm able…
4 votes · 2 answers

Generate instances or inputs for TensorFlow Serving REST API

I'm ready to try out my TensorFlow Serving REST API based on a saved model, and was wondering if there was an easy way to generate the JSON instances (row-based) or inputs (columnar) I need to send with my request. I have several thousand features…
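The two payload shapes the question mentions are both plain JSON, so they can be generated from per-example feature dicts with the standard library alone. A small sketch (the feature names and values here are made up for illustration):

```python
import json

def to_instances(records):
    """Row-based payload: one object per example, the shape the
    TF Serving REST predict endpoint accepts under "instances"."""
    return json.dumps({"instances": records})

def to_inputs(records):
    """Columnar payload: one list per feature name, under "inputs"."""
    keys = records[0].keys()
    return json.dumps({"inputs": {k: [r[k] for r in records] for k in keys}})

# Hypothetical examples with two features each
records = [
    {"age": 39, "workclass": "Private"},
    {"age": 52, "workclass": "Self-emp"},
]
row_payload = to_instances(records)
col_payload = to_inputs(records)
```

Either string can then be POSTed to /v1/models/<model>:predict; with thousands of features the columnar form is usually more compact because each feature name appears only once.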
Max Power
3 votes · 0 answers

TFTransform ValueError: "Split does not exist over all example artifacts"

I'm attempting to construct a TFX pipeline, but keep running into an error during the TFTransform component step. After diving into the error message and its source code on GitHub, it appears to have something to do with a function, get_split_uris().…
John Sukup
3 votes · 0 answers

Tuner Component: Is it possible to tune the batch size?

I'm currently using TFX to build a pipeline on the Google AI platform with the Kubeflow engine. I have a model where the batch size is an important hyper-parameter to tune. I would like to search this hyper-parameter in the Tuner component. Is it…
JimZer
3 votes · 2 answers

Is it possible to configure the Beam portable runner with Spark configurations?

TL;DR: Is it possible to configure the Beam portable runner with Spark configurations? More precisely, is it possible to configure spark.driver.host in the PortableRunner? Motivation: currently, we have Airflow implemented in a Kubernetes…
3 votes · 2 answers

TensorFlow Extended: Is it possible to use a PyTorch training loop in a TensorFlow Extended flow?

I have trained an image classification model using PyTorch. Now I want to move it from research to a production pipeline, and I am thinking of using TensorFlow Extended. My beginner question is whether I'll be able to use my PyTorch-trained model in…
Beginner
3 votes · 1 answer

TFX - REST API Without Serializing the Data Input to Get Prediction

I'm new to TFX. I've been following the Keras tutorial and have successfully created the TFX pipeline using my data. While learning to serve my model through Docker with TF Serving, my data input has to be serialized as follows to…
LLTeng
3 votes · 1 answer

What is best practice for mapping a TF2 Keras model's SignatureDef to the TF Serving classify/predict/regress APIs in a TFX pipeline?

We are building an automated TFX pipeline on Airflow and have based our model on the Keras tutorial. We save the Keras model as follows: model.save(fn_args.serving_model_dir, save_format='tf', signatures=signatures, …