# Ray Data Source (contrib)

> **⚠️ Contrib Plugin:**
> `RaySource` is a contributed plugin shipped alongside the [Ray offline store](../offline-stores/ray.md). It may not be as stable or fully supported as core data sources.

`RaySource` is a pure-metadata descriptor that tells Feast **how** to load a
[Ray Dataset](https://docs.ray.io/en/latest/data/api/dataset.html) from any
source that Ray Data supports natively — Parquet, CSV, JSON, HuggingFace
Datasets, MongoDB, binary files, images, TFRecords, and more.

It is the recommended data source when using the
[Ray offline store](../offline-stores/ray.md) and replaces `FileSource`
for all non-Parquet and non-file-based data.

---

## When to use RaySource vs FileSource

| Scenario | Recommended source |
|---|---|
| Parquet files on disk / S3 / GCS (existing setup) | `FileSource` (backward compatible) |
| Parquet via Ray reader (pipelines, remote auth) | `RaySource(reader_type="parquet")` |
| CSV, JSON, text, images via Ray | `RaySource` |
| HuggingFace `datasets` library | `RaySource(reader_type="huggingface")` |
| MongoDB, SQL, TFRecords, WebDataset | `RaySource` |

---

## Installation

`RaySource` is bundled with the Ray offline store contrib package:

```bash
pip install 'feast[ray]'
```

---

## Supported `reader_type` values

| `reader_type` | Underlying Ray API | Notes |
|---|---|---|
| `parquet` | `ray.data.read_parquet` | S3, GCS, HDFS, local |
| `csv` | `ray.data.read_csv` | |
| `json` | `ray.data.read_json` | |
| `text` | `ray.data.read_text` | |
| `images` | `ray.data.read_images` | |
| `binary_files` | `ray.data.read_binary_files` | |
| `tfrecords` | `ray.data.read_tfrecords` | |
| `webdataset` | `ray.data.read_webdataset` | |
| `huggingface` | `ray.data.from_huggingface` | Wraps `datasets.load_dataset` |
| `mongo` | `ray.data.read_mongo` | |
| `sql` | `ray.data.read_sql` | Pass `connection_url` in `reader_options` |

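Each `reader_type` in the table selects the corresponding Ray Data API and forwards `path` (for file-based readers) plus any `reader_options` as keyword arguments. The dispatch can be pictured with the sketch below — `describe_read_call` is an illustrative helper, not part of Feast, and the `paths` keyword assumes the signature of Ray's file-based readers:

```python
# Illustrative sketch: map each reader_type to the Ray Data API it invokes
# and collect the keyword arguments a RaySource would forward to it.
READER_DISPATCH = {
    "parquet": "ray.data.read_parquet",
    "csv": "ray.data.read_csv",
    "json": "ray.data.read_json",
    "text": "ray.data.read_text",
    "images": "ray.data.read_images",
    "binary_files": "ray.data.read_binary_files",
    "tfrecords": "ray.data.read_tfrecords",
    "webdataset": "ray.data.read_webdataset",
    "huggingface": "ray.data.from_huggingface",
    "mongo": "ray.data.read_mongo",
    "sql": "ray.data.read_sql",
}

def describe_read_call(reader_type, path=None, reader_options=None):
    """Return the Ray API name and the kwargs that would be passed to it."""
    if reader_type not in READER_DISPATCH:
        raise ValueError(f"Unsupported reader_type: {reader_type!r}")
    kwargs = dict(reader_options or {})
    if path is not None:
        kwargs["paths"] = path  # Ray's file-based readers take a `paths` argument
    return READER_DISPATCH[reader_type], kwargs

api, kwargs = describe_read_call("parquet", path="s3://my-bucket/driver_stats/")
print(api, kwargs)  # ray.data.read_parquet {'paths': 's3://my-bucket/driver_stats/'}
```
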
---

## Configuration

### Parameters

| Parameter | Type | Required | Description |
|---|---|---|---|
| `name` | `str` | Yes | Unique name for this data source |
| `reader_type` | `str` | Yes | One of the supported reader types above |
| `path` | `str` | No | File or directory path (required for file-based readers) |
| `reader_options` | `dict` | No | Extra keyword arguments forwarded to the Ray reader |
| `timestamp_field` | `str` | No | Column containing event timestamps |
| `created_timestamp_column` | `str` | No | Column containing row creation timestamps |
| `tags` | `dict` | No | Arbitrary key-value metadata |
| `description` | `str` | No | Human-readable description |
| `owner` | `str` | No | Owning team or contact |

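The table implies a few consistency rules: file-based readers need a `path`, and SQL sources need a `connection_url` in `reader_options`. A small checker for those rules can be sketched as follows — `validate_ray_source_args` is a hypothetical helper for illustration, not Feast's own validation:

```python
# Illustrative check of the parameter rules described in the table above.
FILE_BASED_READERS = {"parquet", "csv", "json", "text", "images",
                      "binary_files", "tfrecords", "webdataset"}

def validate_ray_source_args(name, reader_type, path=None, reader_options=None):
    """Raise ValueError if the argument combination is inconsistent."""
    if not name:
        raise ValueError("`name` is required")
    if reader_type in FILE_BASED_READERS and path is None:
        raise ValueError(f"reader_type={reader_type!r} requires a `path`")
    if reader_type == "sql" and "connection_url" not in (reader_options or {}):
        raise ValueError("reader_type='sql' requires `connection_url` in reader_options")

# A file-based reader without a path fails fast:
validate_ray_source_args("sensor_readings_csv", "csv", path="/data/sensors/")  # ok
```
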
---

## Usage examples

### Parquet on S3

```python
from feast.infra.offline_stores.contrib.ray_offline_store.ray_source import RaySource

driver_stats = RaySource(
    name="driver_stats_parquet",
    reader_type="parquet",
    path="s3://my-bucket/driver_stats/",
    timestamp_field="event_timestamp",
)
```

### CSV

```python
sensor_readings = RaySource(
    name="sensor_readings_csv",
    reader_type="csv",
    path="/data/sensors/",
    timestamp_field="ts",
)
```

### HuggingFace dataset

Load a dataset from the [HuggingFace Hub](https://huggingface.co/datasets)
directly into Feast.

```python
from feast.infra.offline_stores.contrib.ray_offline_store.ray_source import RaySource

cheque_images = RaySource(
    name="cheque_images_hf",
    reader_type="huggingface",
    reader_options={
        "dataset_name": "cheques_sample_data",
        "split": "train",
    },
    timestamp_field="event_timestamp",
)
```

### MongoDB

```python
transaction_log = RaySource(
    name="transactions_mongo",
    reader_type="mongo",
    reader_options={
        "uri": "mongodb://localhost:27017",
        "database": "featuredb",
        "collection": "transactions",
    },
    timestamp_field="created_at",
)
```

### SQL (via connection URL)

```python
user_features = RaySource(
    name="user_features_sql",
    reader_type="sql",
    reader_options={
        "connection_url": "postgresql+psycopg2://user:password@host:5432/db",  # pragma: allowlist secret
        "query": "SELECT * FROM user_features",
    },
    timestamp_field="event_timestamp",
)
```

---

## Using RaySource in a BatchFeatureView

```python
from datetime import timedelta
from feast import BatchFeatureView, Entity, Field
from feast.types import Int64, String
from feast.infra.offline_stores.contrib.ray_offline_store.ray_source import RaySource

cheque = Entity(name="cheque_id", description="Unique cheque identifier")

cheque_source = RaySource(
    name="cheque_images_hf",
    reader_type="huggingface",
    reader_options={
        "dataset_name": "cheques_sample_data",
        "split": "train",
    },
    timestamp_field="event_timestamp",
)

cheque_ocr_fv = BatchFeatureView(
    name="cheque_ocr_features",
    entities=[cheque],
    ttl=timedelta(days=365),
    schema=[
        Field(name="cheque_id", dtype=Int64),
        Field(name="payee_name", dtype=String),
        Field(name="amount", dtype=String),
        Field(name="bank_name", dtype=String),
        Field(name="raw_text", dtype=String),
    ],
    source=cheque_source,
)
```

---

## Retrieving data as a Ray Dataset

Once the feature view is materialised, you can retrieve the offline features
directly as a Ray Dataset using the first-class `to_ray_dataset()` method:

```python
from feast import FeatureStore

store = FeatureStore(".")

# Chain directly on the retrieval job — to_ray_dataset() is a first-class
# method on every RetrievalJob. entity_df is a pandas DataFrame containing
# the entity keys and event timestamps to join against.
ds = store.get_historical_features(
    features=["cheque_ocr_features:payee_name", "cheque_ocr_features:amount"],
    entity_df=entity_df,
).to_ray_dataset()

# Use the dataset downstream in Ray or ML pipelines
ds.show(3)
```

---

## Proto serialisation

`RaySource` is fully serialisable to Feast's protobuf registry format. The
`reader_type`, `path`, and `reader_options` dict are all persisted and can be
round-tripped via `to_proto()` / `from_proto()`.

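The round-trip guarantee can be stated as an invariant: deserialising a serialised source yields an equal source. The sketch below illustrates that pattern with a plain-Python stand-in — `RaySourceSpec` and its `to_dict()`/`from_dict()` are hypothetical analogues, not Feast's actual protobuf classes:

```python
from dataclasses import dataclass, field, asdict
from typing import Optional

# Plain-Python stand-in for the proto round-trip: every field that
# to_proto() persists must survive from_proto() unchanged.
@dataclass(frozen=True)
class RaySourceSpec:
    name: str
    reader_type: str
    path: Optional[str] = None
    reader_options: dict = field(default_factory=dict)

    def to_dict(self) -> dict:  # analogue of to_proto()
        return asdict(self)

    @classmethod
    def from_dict(cls, d: dict) -> "RaySourceSpec":  # analogue of from_proto()
        return cls(**d)

src = RaySourceSpec("driver_stats", "parquet", path="s3://bucket/stats/")
assert RaySourceSpec.from_dict(src.to_dict()) == src  # round-trip invariant
```
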
---

## Limitations

* The Ray offline store (and therefore `RaySource`) requires `feast[ray]`.
* `reader_type="sql"` requires a serialisable `connection_url`; raw
  `sqlalchemy.engine.Engine` objects cannot be pickled across Ray workers.
* Streaming sources (Kafka, Kinesis) are not supported via `RaySource`; use
  the dedicated [Kafka](kafka.md) or [Kinesis](kinesis.md) data sources.

---

## Related pages

* [Ray Offline Store](../offline-stores/ray.md)
* [Ray Compute Engine](../compute-engine/ray.md)
* [Feature Retrieval](../../getting-started/concepts/feature-retrieval.md)