# Deleting Inferences
There are two primary ways to manage inference storage in Chariot: automated retention policies and manual deletion of specific inferences. This page covers manual deletion options.
## When to Use Manual Deletion vs. Retention Policies
**Use retention policies for:**
- Automated cleanup based on age
- Consistent storage management across all inferences
- Regular housekeeping of old data
**Use manual deletion for:**
- Removing specific problematic inferences
- Deleting test data or experimental runs
- Cleaning up inferences from specific datasets or time periods
- Immediate removal of sensitive or incorrect data
> **Warning — deletion is permanent:** Once inferences are deleted, both the inference results and the associated input data are permanently removed and cannot be recovered.
## Delete a Single Inference
Remove a specific inference and all its associated data:

```python
from chariot.inference_store import inference

deleted_id = inference.delete_inference(model_id=model_id, inference_id=inference_id)

if deleted_id:
    print(f"Successfully deleted inference: {deleted_id}")
else:
    print("Inference not found or deletion failed")
```
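When several specific inferences need to go at once, the single-delete call can be wrapped in a loop that separates successes from failures. The sketch below is illustrative, not part of the Chariot SDK: `delete_fn` stands in for `inference.delete_inference`, and the stub merely simulates its return value (the deleted ID on success, `None` otherwise).

```python
def delete_many(delete_fn, model_id, inference_ids):
    """Delete each inference ID, splitting results into deleted and failed lists."""
    deleted, failed = [], []
    for inf_id in inference_ids:
        result = delete_fn(model_id=model_id, inference_id=inf_id)
        (deleted if result else failed).append(inf_id)
    return deleted, failed


# Stand-in for inference.delete_inference: succeeds only for two known IDs.
known = {"inf-1", "inf-3"}

def stub_delete(model_id, inference_id):
    return inference_id if inference_id in known else None


deleted, failed = delete_many(stub_delete, "model-a", ["inf-1", "inf-2", "inf-3"])
print(deleted)  # ['inf-1', 'inf-3']
print(failed)   # ['inf-2']
```

Collecting the failed IDs lets you retry or report them instead of silently losing track of which deletions did not go through.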
## Bulk Delete by Filter
Delete multiple inferences that match specific criteria. This is useful for cleaning up targeted subsets of data.

```python
from chariot.inference_store import models, inference

# Delete all inferences from the validation split of a specific dataset snapshot
delete_filter = models.BaseInferenceFilter(
    metadata_filter=[
        models.MetadataFilter(
            key="dataset_snapshot_id",
            type=models.MetadataFilterType.STRING,
            operator=models.MetadataFilterOperator.EQUAL,
            value="snapshot-123",
        ),
        models.MetadataFilter(
            key="dataset_split",
            type=models.MetadataFilterType.STRING,
            operator=models.MetadataFilterOperator.EQUAL,
            value="validation",
        ),
    ]
)

deleted_inference_ids = inference.bulk_delete_inferences(
    model_id=model_id,
    filter_=delete_filter,
)

print(f"Deleted {len(deleted_inference_ids)} inferences")
```
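Because bulk deletion is irreversible, it can help to gate the call behind an explicit dry-run flag so a misconfigured filter is caught before anything is removed. The helper below is a sketch, not part of the Chariot SDK: `bulk_delete_fn` stands in for `inference.bulk_delete_inferences`, and the stub merely simulates its return value.

```python
def guarded_bulk_delete(bulk_delete_fn, model_id, filter_, dry_run=True):
    """Perform the bulk delete only when dry_run is explicitly disabled."""
    if dry_run:
        print(f"[dry run] no inferences deleted for model {model_id}")
        return []
    return bulk_delete_fn(model_id=model_id, filter_=filter_)


# Stub simulating inference.bulk_delete_inferences returning two deleted IDs.
def stub_bulk_delete(model_id, filter_):
    return ["inf-1", "inf-2"]


preview = guarded_bulk_delete(stub_bulk_delete, "model-a", None, dry_run=True)
deleted = guarded_bulk_delete(stub_bulk_delete, "model-a", None, dry_run=False)
print(f"Deleted {len(deleted)} inferences")  # Deleted 2 inferences
```

Defaulting `dry_run` to `True` means the destructive path requires a deliberate opt-in, which is a common safeguard for scripts that delete data in bulk.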