Enabling Inference Store

Enabling inference storage is simple, but it offers powerful configuration options that control which inferences are stored, when they are stored, and how long they are kept.

Enabling Storage

You can enable inference storage in just a few clicks in Chariot: navigate to the model you wish to enable inference storage for, open the Storage tab in Model Version Settings, and turn on inference storage. You can also specify how long inference and input data should be retained in the system. See Retention Policies for more details.

Retention Policies

Retention policies manage the storage footprint of specific models. They let users set a maximum length of time to keep data and specify whether to clean up the inference input data (i.e., the image inferenced on), the inference output (i.e., the prediction), or both.

Select whether you want to:

  • Delete both inference and input data after a specified time period. The retention period of inference data must be equal to or longer than the input retention period.
  • Delete only input data after a specified time period and keep inference indefinitely.
  • Keep both inference and input data indefinitely.
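The three options above can be sketched as a small validation model. This is an illustrative sketch only, not the Chariot API: the class and field names (`RetentionPolicy`, `inference_retention_days`, `input_retention_days`) are hypothetical, chosen to show the constraint that inference data must be retained at least as long as input data.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RetentionPolicy:
    """Hypothetical model of a retention policy.

    A value of None means "keep indefinitely"; integers are days.
    """
    inference_retention_days: Optional[int] = None
    input_retention_days: Optional[int] = None

    def __post_init__(self):
        # Constraint from the docs: the retention period of inference
        # data must be equal to or longer than the input retention
        # period. Keeping inputs indefinitely therefore requires
        # keeping inferences indefinitely as well.
        if self.input_retention_days is None and self.inference_retention_days is not None:
            raise ValueError(
                "inference retention must be indefinite when input retention is indefinite"
            )
        if (self.inference_retention_days is not None
                and self.input_retention_days is not None
                and self.inference_retention_days < self.input_retention_days):
            raise ValueError("inference retention must be >= input retention")


# Delete both: inputs after 30 days, inferences after 90 days.
RetentionPolicy(inference_retention_days=90, input_retention_days=30)
# Delete only inputs after 30 days; keep inferences indefinitely.
RetentionPolicy(input_retention_days=30)
# Keep both indefinitely (the defaults).
RetentionPolicy()
```

A policy such as `RetentionPolicy(inference_retention_days=30, input_retention_days=90)` would be rejected, since the inference data would be deleted before its input data.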
note

Each model has only one retention policy, and any change to that policy applies immediately to both past and future inputs and inferences.
