diff --git a/examples/indexd_demo.ipynb b/examples/indexd_demo.ipynb new file mode 100644 index 00000000..9970a9f3 --- /dev/null +++ b/examples/indexd_demo.ipynb @@ -0,0 +1,759 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# indexd Demo" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## About indexd" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indexd, in a nutshell, is a microservice which maintains URLs as pointers to stored data files. Indexd adds a layer of abstraction over stored data files: the data can move between or live in multiple locations, while the unique identifier for each file, kept in indexd, allows us to obtain the URLs (and some miscellaneous metadata) for the same stored data. Additionally, indexd tracks revisions of the same data file." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Python Demo" + ] + }, + { + "cell_type": "code", + "execution_count": 223, + "metadata": {}, + "outputs": [], + "source": [ + "import json\n", + "from urllib.parse import urljoin\n", + "\n", + "import requests" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setup" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To start, run indexd on `localhost:8080`. Probably the easiest way is with a docker container:\n", + "```bash\n", + "# Start from indexd directory\n", + "# Build the docker image if you don't have it yet\n", + "docker build -t indexd .\n", + "# Now run the image, and set it to forward to port 8080.\n", + "docker run -d --name indexd -p 8080:80 indexd\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to use endpoints requiring admin authorization, set up a username and password in the indexd docker image:\n", + "```bash\n", + "docker exec indexd python /indexd/bin/index_admin.py create --username test --password test\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "(Here we set up a bit of code just to make the API calls more concise and readable.)" + ] + }, + { + "cell_type": "code", + "execution_count": 224, + "metadata": {}, + "outputs": [], + "source": [ + "base = 'http://localhost:8080'\n", + "\n", + "# NOTE\n", + "# Fill in the auth with whatever username/password you set before.\n", + "request_auth = requests.auth.HTTPBasicAuth('test', 'test')\n", + "\n", + "indexd = lambda path: urljoin(base, path)\n", + "\n", + "def print_response(response):\n", + " print(response)\n", + " try:\n", + " print(json.dumps(response.json(), indent=4))\n", + " except ValueError:\n", + " print(response.text)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Just for the purposes of re-using this demo with the same indexd instance, we'll clear out all the records from indexd. 
(For the sake of the tutorial, this shouldn't make sense yet—so ignore this, and move along!)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 225,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def wipe_indexd():\n",
+    "    \"\"\"\n",
+    "    Delete all records from indexd.\n",
+    "    \"\"\"\n",
+    "    records = requests.get(indexd('/index/')).json()['records']\n",
+    "    for record in records:\n",
+    "        path = indexd('/index/{}'.format(record['did']))\n",
+    "        params = {'rev': record['rev']}\n",
+    "        response = requests.delete(path, auth=request_auth, params=params)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 226,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# WARNING: don't do this if you want to keep your existing records!\n",
+    "wipe_indexd()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Let's check that indexd is alive, using the status endpoint."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 227,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "\n",
+      "Healthy\n"
+     ]
+    }
+   ],
+   "source": [
+    "print_response(requests.get(indexd('/_status')))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "So far so good. Let's get the list of records stored in indexd right now, by sending a `GET` to `/index/`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 228,
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "\n",
+      "{\n",
+      "    \"version\": null,\n",
+      "    \"metadata\": {},\n",
+      "    \"urls\": [],\n",
+      "    \"start\": null,\n",
+      "    \"size\": null,\n",
+      "    \"limit\": 100,\n",
+      "    \"records\": [],\n",
+      "    \"ids\": null,\n",
+      "    \"acl\": [],\n",
+      "    \"hashes\": null,\n",
+      "    \"file_name\": null\n",
+      "}\n"
+     ]
+    }
+   ],
+   "source": [
+    "print_response(requests.get(indexd('/index/')))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "There are no records registered yet...let's create one."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Creating a Record"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Just below is some example data for a record. We `POST` this to the `/index/` endpoint on indexd to register the record.\n",
+    "\n",
+    "The minimum information necessary to supply to indexd is the file size, the hash (in any of several common formats), a list of URLs pointing to where the data file is stored (which can be left empty),\n",
+    "and the `form`, which for a single stored file like ours is `'object'`. For this example we'll also give our imaginary file a name, and add `'*'` to the ACL list."
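+    ,
+    "\n",
+    "\n",
+    "(An aside, not part of the original demo: our file here is imaginary, so we just made up a size and an md5. If you were registering a real local file, a minimal sketch for computing those two values with only the standard library might look like the following; the helper name is made up.)\n",
+    "\n",
+    "```python\n",
+    "import hashlib\n",
+    "import os\n",
+    "\n",
+    "def size_and_md5(path):\n",
+    "    # Stream the file in chunks so large files don't need to fit in memory.\n",
+    "    md5 = hashlib.md5()\n",
+    "    with open(path, 'rb') as f:\n",
+    "        for chunk in iter(lambda: f.read(8192), b''):\n",
+    "            md5.update(chunk)\n",
+    "    return os.path.getsize(path), md5.hexdigest()\n",
+    "\n",
+    "# Hypothetical usage:\n",
+    "# size, md5_hex = size_and_md5('example_file')\n",
+    "# data = {'size': size, 'hashes': {'md5': md5_hex}, 'urls': [], 'form': 'object'}\n",
+    "```"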
+ ] + }, + { + "cell_type": "code", + "execution_count": 229, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"rev\": \"be8c395f\",\n", + " \"did\": \"testprefix:760c371d-1efa-44e0-8a0e-83b797e738dc\",\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\"\n", + "}\n" + ] + } + ], + "source": [ + "data = {\n", + " 'size': 8,\n", + " 'hashes': {'md5': 'e561f9248d7563d15dd93457b02ebbb6'},\n", + " 'urls': [],\n", + " 'form': 'object',\n", + " 'file_name': 'example_file',\n", + " 'acl': ['*'],\n", + "}\n", + "response = requests.post(indexd('/index/'), json=data, auth=request_auth)\n", + "print_response(response)\n", + "# Save this stuff, we'll need to use it later.\n", + "v_0_did = response.json()['did']\n", + "v_0_baseid = response.json()['baseid']\n", + "v_0_rev = response.json()['rev']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Success!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Retrieving Records" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now the list of records returned from indexd should have our new entry—let's check, again using a `GET` to the `/index/` endpoint." + ] + }, + { + "cell_type": "code", + "execution_count": 230, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"version\": null,\n", + " \"metadata\": {},\n", + " \"urls\": [],\n", + " \"start\": null,\n", + " \"size\": null,\n", + " \"limit\": 100,\n", + " \"records\": [\n", + " {\n", + " \"version\": null,\n", + " \"did\": \"testprefix:760c371d-1efa-44e0-8a0e-83b797e738dc\",\n", + " \"urls_metadata\": {},\n", + " \"urls\": [],\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n", + " \"created_date\": \"2018-08-07T22:19:03.068052\",\n", + " \"size\": 8,\n", + " \"acl\": [\n", + " \"*\"\n", + " ],\n", + " \"metadata\": {},\n", + " \"hashes\": {\n", + " \"md5\": \"e561f9248d7563d15dd93457b02ebbb6\"\n", + " },\n", + " \"rev\": \"be8c395f\",\n", + " \"form\": \"object\",\n", + " \"updated_date\": \"2018-08-07T22:19:03.068062\",\n", + " \"file_name\": \"example_file\"\n", + " }\n", + " ],\n", + " \"ids\": null,\n", + " \"acl\": [],\n", + " \"hashes\": null,\n", + " \"file_name\": null\n", + "}\n" + ] + } + ], + "source": [ + "print_response(requests.get(indexd('/index/')))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also look up this specific record using `GET` `/index/{UUID}`, where the UUID is the DID that indexd returned before when we created this record." 
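+    ,
+    "\n",
+    "\n",
+    "(Also not in the original notebook: as a sketch, this lookup can be wrapped in a tiny helper that reuses the `indexd` shortcut from the setup above and fails loudly on HTTP errors; `raise_for_status()` is the standard `requests` way to do that, and the helper name is made up.)\n",
+    "\n",
+    "```python\n",
+    "def get_record(did):\n",
+    "    # Look up a single record by its DID (or, as we'll see later, by its baseid).\n",
+    "    response = requests.get(indexd('/index/{}'.format(did)))\n",
+    "    # Raise an exception on an error response instead of returning garbage.\n",
+    "    response.raise_for_status()\n",
+    "    return response.json()\n",
+    "\n",
+    "# e.g. get_record(v_0_did)\n",
+    "```"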
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 231,
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "\n",
+      "{\n",
+      "    \"version\": null,\n",
+      "    \"did\": \"testprefix:760c371d-1efa-44e0-8a0e-83b797e738dc\",\n",
+      "    \"urls_metadata\": {},\n",
+      "    \"urls\": [],\n",
+      "    \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n",
+      "    \"created_date\": \"2018-08-07T22:19:03.068052\",\n",
+      "    \"size\": 8,\n",
+      "    \"acl\": [\n",
+      "        \"*\"\n",
+      "    ],\n",
+      "    \"metadata\": {},\n",
+      "    \"hashes\": {\n",
+      "        \"md5\": \"e561f9248d7563d15dd93457b02ebbb6\"\n",
+      "    },\n",
+      "    \"rev\": \"be8c395f\",\n",
+      "    \"form\": \"object\",\n",
+      "    \"updated_date\": \"2018-08-07T22:19:03.068062\",\n",
+      "    \"file_name\": \"example_file\"\n",
+      "}\n"
+     ]
+    }
+   ],
+   "source": [
+    "path = indexd('/index/{}'.format(v_0_did))\n",
+    "print_response(requests.get(path))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Great, so we made a new record! But what does all of that actually mean? Let's break it down."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### About Records in indexd"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "A single record in indexd contains several fields; let's go through each field and explain what it's for.\n",
+    "\n",
+    "#### `did` (\"digital identifier\")\n",
+    "\n",
+    "A unique identifier (UUID4) for the file; indexd will make these for new records automatically. Notice that the one that indexd generated for us looks like this:\n",
+    "```\n",
+    "<prefix>:<UUID>\n",
+    "```\n",
+    "indexd can be configured with a prefix (here, `testprefix`) which it prepends to the UUIDs it generates for new records.\n",
+    "\n",
+    "#### `baseid`\n",
+    "\n",
+    "The `baseid` is a common identifier for all versions of one file, across revisions.\n",
+    "\n",
+    "#### `rev`\n",
+    "\n",
+    "The `rev` is a short revision token for this particular record. Operations that modify or delete a record must supply the current `rev` (as the `wipe_indexd` helper above does), so that two clients can't unknowingly overwrite each other's changes.\n",
+    "\n",
+    "#### `form`\n",
+    "\n",
+    "The `form` describes what kind of thing the record points to; for a single stored data file, as in this demo, it is `'object'`.\n",
+    "\n",
+    "#### `size`\n",
+    "\n",
+    "This is just the filesize that we gave indexd originally for this file.\n",
+    "\n",
+    "#### `file_name`\n",
+    "\n",
+    "Optional field recording the filename of the indexed file.\n",
+    "\n",
+    "#### `metadata`\n",
+    "\n",
+    "An optional mapping of arbitrary key-value metadata stored with the record (empty in our example).\n",
+    "\n",
+    "#### `urls_metadata`\n",
+    "\n",
+    "Like `metadata`, but attached to the individual entries in `urls` (also empty here, since we registered no URLs).\n",
+    "\n",
+    "#### `version`\n",
+    "\n",
+    "An optional version label for the record; it is `null` here because we didn't set one.\n",
+    "\n",
+    "#### `urls`\n",
+    "\n",
+    "As we mentioned above, this is the list of URLs which point to the real location of the stored data.\n",
+    "\n",
+    "#### `acl`\n",
+    "\n",
+    "The access control list for the record, describing who is allowed to access the indexed data; in this example we just used `'*'`.\n",
+    "\n",
+    "#### `hashes`\n",
+    "\n",
+    "`hashes` is an object storing one or more hashes for the file itself. These can be any of:\n",
+    "- MD5\n",
+    "- SHA\n",
+    "- SHA256\n",
+    "- SHA512\n",
+    "- CRC\n",
+    "- ETag"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Record Versions"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now that we've created a record, let's look at the process of updating this record with a new version. We're going to change the contents—and thus the size and the hash—of our imaginary file. Let's update indexd with the new information. To add a new version, we `POST` to `/index/{UUID}`, where the UUID is the DID of the existing file."
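+    ,
+    "\n",
+    "\n",
+    "(As a sketch, not part of the original demo, reusing the `indexd` shortcut and `request_auth` from the setup above: the version-creation call can be wrapped like this. The behavior described in the comments is exactly what the next few cells demonstrate.)\n",
+    "\n",
+    "```python\n",
+    "def add_version(did, data, auth=request_auth):\n",
+    "    # POST the updated metadata to /index/{did} for an existing record.\n",
+    "    # indexd gives the new version its own did and rev, but keeps the same baseid.\n",
+    "    response = requests.post(indexd('/index/{}'.format(did)), json=data, auth=auth)\n",
+    "    response.raise_for_status()\n",
+    "    return response.json()\n",
+    "```"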
+ ] + }, + { + "cell_type": "code", + "execution_count": 232, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"rev\": \"2d09fa8d\",\n", + " \"did\": \"88bca605-42f9-40b1-a0e3-41e632276125\",\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\"\n", + "}\n" + ] + } + ], + "source": [ + "# Here's the new data for the \"file\".\n", + "data['size'] = 10\n", + "data['hashes'] = {'md5': 'f7952a9483fae0af6d41370d9333020b'}\n", + "\n", + "# We saved the DID for this file before.\n", + "path = indexd('/index/{}'.format(v_0_did))\n", + "response = requests.post(path, json=data, auth=request_auth)\n", + "v_1_baseid = response.json()['baseid']\n", + "v_1_did = response.json()['did']\n", + "v_1_rev = response.json()['rev']\n", + "print_response(response)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, if we compare this `baseid` to the `baseid` that indexd returned when we created the record for the original file, we see that this `baseid` remains the same." + ] + }, + { + "cell_type": "code", + "execution_count": 233, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(v_0_baseid == v_1_baseid)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, this record has a different `rev` and a different `did` than the original." + ] + }, + { + "cell_type": "code", + "execution_count": 234, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "False\n", + "False\n" + ] + } + ], + "source": [ + "print(v_0_did == v_1_did)\n", + "print(v_0_rev == v_1_rev)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Having created the new version for this file, let's again make a request `GET` `/index/{UUID}`, using the shared `baseid`." + ] + }, + { + "cell_type": "code", + "execution_count": 235, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"version\": \"2.0\",\n", + " \"did\": \"88bca605-42f9-40b1-a0e3-41e632276125\",\n", + " \"urls_metadata\": {},\n", + " \"urls\": [],\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n", + " \"created_date\": \"2018-08-07T22:19:03.147919\",\n", + " \"size\": 10,\n", + " \"acl\": [\n", + " \"*\"\n", + " ],\n", + " \"metadata\": {},\n", + " \"hashes\": {\n", + " \"md5\": \"f7952a9483fae0af6d41370d9333020b\"\n", + " },\n", + " \"rev\": \"2d09fa8d\",\n", + " \"form\": \"object\",\n", + " \"updated_date\": \"2018-08-07T22:19:03.147927\",\n", + " \"file_name\": \"example_file\"\n", + "}\n" + ] + } + ], + "source": [ + "path = indexd('/index/{}'.format(v_0_baseid))\n", + "response = requests.get(path)\n", + "print_response(response)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The information for this record reflects the new changes to the file." + ] + }, + { + "cell_type": "code", + "execution_count": 236, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True\n" + ] + } + ], + "source": [ + "print(response.json()['did'] == v_1_did)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "However, the original information still exists. We can make a request again using the DID of the original file, and see that this revision hasn't changed." 
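+    ,
+    "\n",
+    "\n",
+    "(To make this concrete, here is a small check, not in the original notebook, that reuses the variables we saved above and compares the two revisions side by side.)\n",
+    "\n",
+    "```python\n",
+    "old = requests.get(indexd('/index/{}'.format(v_0_did))).json()\n",
+    "new = requests.get(indexd('/index/{}'.format(v_1_did))).json()\n",
+    "\n",
+    "# Both revisions hang off the same baseid...\n",
+    "assert old['baseid'] == new['baseid']\n",
+    "# ...but each keeps its own size and hash.\n",
+    "assert old['size'] == 8 and new['size'] == 10\n",
+    "assert old['hashes'] != new['hashes']\n",
+    "```"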
+ ] + }, + { + "cell_type": "code", + "execution_count": 237, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"version\": null,\n", + " \"did\": \"testprefix:760c371d-1efa-44e0-8a0e-83b797e738dc\",\n", + " \"urls_metadata\": {},\n", + " \"urls\": [],\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n", + " \"created_date\": \"2018-08-07T22:19:03.068052\",\n", + " \"size\": 8,\n", + " \"acl\": [\n", + " \"*\"\n", + " ],\n", + " \"metadata\": {},\n", + " \"hashes\": {\n", + " \"md5\": \"e561f9248d7563d15dd93457b02ebbb6\"\n", + " },\n", + " \"rev\": \"be8c395f\",\n", + " \"form\": \"object\",\n", + " \"updated_date\": \"2018-08-07T22:19:03.068062\",\n", + " \"file_name\": \"example_file\"\n", + "}\n" + ] + } + ], + "source": [ + "path = indexd('/index/{}'.format(v_0_did))\n", + "print_response(requests.get(path))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we can look at the whole list of versions for a single file, with `GET` `/index/{UUID}/versions`. The object in the response will contain the records for every version of this file as key-value pairs, where the keys are just numeric indexes (in string form) and the values are the records." + ] + }, + { + "cell_type": "code", + "execution_count": 266, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "{\n", + " \"1\": {\n", + " \"version\": \"2.0\",\n", + " \"did\": \"88bca605-42f9-40b1-a0e3-41e632276125\",\n", + " \"urls_metadata\": {},\n", + " \"urls\": [],\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n", + " \"created_date\": \"2018-08-07T22:19:03.147919\",\n", + " \"size\": 10,\n", + " \"acl\": [\n", + " \"*\"\n", + " ],\n", + " \"metadata\": {},\n", + " \"hashes\": {\n", + " \"md5\": \"f7952a9483fae0af6d41370d9333020b\"\n", + " },\n", + " \"rev\": \"2d09fa8d\",\n", + " \"form\": \"object\",\n", + " \"updated_date\": \"2018-08-07T22:19:03.147927\",\n", + " \"file_name\": \"example_file\"\n", + " },\n", + " \"0\": {\n", + " \"version\": null,\n", + " \"did\": \"testprefix:760c371d-1efa-44e0-8a0e-83b797e738dc\",\n", + " \"urls_metadata\": {},\n", + " \"urls\": [],\n", + " \"baseid\": \"cef3e517-a7e9-4381-9687-0ba11fc177b1\",\n", + " \"created_date\": \"2018-08-07T22:19:03.068052\",\n", + " \"size\": 8,\n", + " \"acl\": [\n", + " \"*\"\n", + " ],\n", + " \"metadata\": {},\n", + " \"hashes\": {\n", + " \"md5\": \"e561f9248d7563d15dd93457b02ebbb6\"\n", + " },\n", + " \"rev\": \"be8c395f\",\n", + " \"form\": \"object\",\n", + " \"updated_date\": \"2018-08-07T22:19:03.068062\",\n", + " \"file_name\": \"example_file\"\n", + " }\n", + "}\n" + ] + } + ], + "source": [ + "path = indexd('/index/{}/versions'.format(v_0_baseid))\n", + "print_response(requests.get(path))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Record Aliases" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +}