Commit

Cleaned up some comments, edited cell metadata, and commented out some code using better materialization. (#2781)

* Cleaned up some comments, edited cell metadata, and commented out some code using better materialization.

* Fixed code formatting errors.

* Updated code snippet for attaching Redis instance to feature store as online store.
ynpandey authored Nov 1, 2023
1 parent 2d93926 commit df21bb4
Showing 3 changed files with 8 additions and 11 deletions.
Changed file 1 of 3:
@@ -647,7 +647,7 @@
"outputs_hidden": false,
"source_hidden": false
},
"name": "gen-training-data-locally",
"name": "load-obs-data",
"nteract": {
"transient": {
"deleting": false
@@ -668,6 +668,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"name": "gen-training-data-locally",
"tags": [
"active-ipynb"
]
Changed file 2 of 3:
@@ -473,11 +473,6 @@
"\n",
"online_store = MaterializationStore(type=\"redis\", target=redis_arm_id)\n",
"\n",
"materialization_identity1 = ManagedIdentityConfiguration(\n",
" client_id=uai_client_id, principal_id=uai_principal_id, resource_id=uai_arm_id\n",
")\n",
"\n",
"\n",
"ml_client = MLClient(\n",
" AzureMLOnBehalfOfCredential(),\n",
" subscription_id=featurestore_subscription_id,\n",
@@ -487,7 +482,6 @@
"fs = FeatureStore(\n",
" name=featurestore_name,\n",
" online_store=online_store,\n",
" materialization_identity=materialization_identity1,\n",
")\n",
"\n",
"fs_poller = ml_client.feature_stores.begin_create(fs)\n",
@@ -594,9 +588,10 @@
" version=\"1\",\n",
" feature_window_start_time=st,\n",
" feature_window_end_time=ed,\n",
" data_status=[\"None\"],\n",
" # data_status=[\"None\"],\n",
")\n",
"print(poller.result().job_ids)"
"# print(poller.result().job_ids)\n",
"print(poller.result().job_id)"
]
},
{
@@ -627,7 +622,8 @@
"# Get the job URL, and stream the job logs.\n",
"# With PREMIUM Redis SKU, SKU family \"P\", and cache capacity 2,\n",
"# it takes approximately 10 minutes to complete.\n",
"fs_client.jobs.stream(poller.result().job_ids[0])"
"# fs_client.jobs.stream(poller.result().job_ids[0])\n",
"fs_client.jobs.stream(poller.result().job_id)"
]
},
{
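Taken together, the changes in this second file attach the Redis instance as the online store without the explicit managed-identity configuration, and track the backfill through a single `job_id` instead of a `job_ids` list. The following is a minimal sketch of the resulting flow, not the notebook's exact code: the import paths, the `begin_backfill` call (which sits above the hunk shown), the `fs_client` construction, the feature set name `transactions`, and all placeholder IDs and dates are assumptions.

```python
from datetime import datetime

from azure.ai.ml import MLClient
from azure.ai.ml.entities import FeatureStore, MaterializationStore
from azure.ai.ml.identity import AzureMLOnBehalfOfCredential

# Placeholder identifiers -- substitute real values.
featurestore_subscription_id = "<subscription-id>"
featurestore_resource_group_name = "<resource-group>"
featurestore_name = "<feature-store-name>"
redis_arm_id = "<ARM-resource-ID-of-the-Azure-Cache-for-Redis-instance>"

# Attach the Redis cache as the online store; after this commit, no separate
# materialization identity is passed when creating the feature store.
online_store = MaterializationStore(type="redis", target=redis_arm_id)

ml_client = MLClient(
    AzureMLOnBehalfOfCredential(),
    subscription_id=featurestore_subscription_id,
    resource_group_name=featurestore_resource_group_name,
)

fs = FeatureStore(
    name=featurestore_name,
    online_store=online_store,
)
fs_poller = ml_client.feature_stores.begin_create(fs)
print(fs_poller.result())

# A second client scoped to the feature store workspace (assumed), used for
# the materialization job below.
fs_client = MLClient(
    AzureMLOnBehalfOfCredential(),
    featurestore_subscription_id,
    featurestore_resource_group_name,
    featurestore_name,
)

# Backfill a feature window; the data_status filter is commented out in the
# diff, and the poller result now exposes a single job_id.
st = datetime(2023, 1, 1)
ed = datetime(2023, 6, 30)
poller = fs_client.feature_sets.begin_backfill(
    name="transactions",  # assumed feature set name
    version="1",
    feature_window_start_time=st,
    feature_window_end_time=ed,
)
print(poller.result().job_id)

# Stream the materialization job logs (roughly 10 minutes with a Premium
# P-family Redis cache of capacity 2, per the notebook's comment).
fs_client.jobs.stream(poller.result().job_id)
```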
Changed file 3 of 3:
@@ -188,7 +188,7 @@
"metadata": {},
"source": [
"## Custom source definition\n",
"Custom source definition enables you to define your own source loading logic from any data storage. For using this feature, implement a a source processor user-defined function (UDF) class (`CustomerTransactionsTransformer` in this tutorial). This class should define an `__init__(self, **kwargs)` function and a `process(self, start_time, end_time, **kwargs)` function. The `kwargs` dictionary is supplied as a part of the feature set specification definition, which is passed to the UDF. The `start_time` and `end_time` parameters are calculated and passed to the UDF function.\n",
"Custom source definition enables you to define your own source loading logic from any data storage. For using this feature, implement a a source processor user-defined function (UDF) class (`CustomSourceTransformer` in this tutorial). This class should define an `__init__(self, **kwargs)` function and a `process(self, start_time, end_time, **kwargs)` function. The `kwargs` dictionary is supplied as a part of the feature set specification definition, which is passed to the UDF. The `start_time` and `end_time` parameters are calculated and passed to the UDF function.\n",
"\n",
"Below is a sample code of source processor UDF class:\n",
"\n",
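The changed sentence refers to a `CustomSourceTransformer` class whose body falls outside the hunk shown above. Below is a minimal sketch of what such a source processor UDF might look like, following only the `__init__(self, **kwargs)` / `process(self, start_time, end_time, **kwargs)` contract described in the text; the `source_path` kwarg, the `timestamp` column, and the parquet format are illustrative assumptions, not the tutorial's actual code.

```python
import pyspark.sql.functions as F
from pyspark.sql import DataFrame, SparkSession


class CustomSourceTransformer:
    def __init__(self, **kwargs):
        # kwargs is supplied from the source definition in the feature set
        # specification; the key name used here is an assumption.
        self.source_path = kwargs.get("source_path")
        if not self.source_path:
            raise ValueError("kwargs must include 'source_path'")

    def process(self, start_time, end_time, **kwargs) -> DataFrame:
        # Load raw data from the custom storage location, then trim it to the
        # feature window [start_time, end_time) that the system passes in.
        spark = SparkSession.builder.getOrCreate()
        df = spark.read.parquet(self.source_path)

        if start_time:
            df = df.filter(F.col("timestamp") >= F.lit(start_time))
        if end_time:
            df = df.filter(F.col("timestamp") < F.lit(end_time))

        return df
```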
