# feat: store pins in datastore instead of a DAG (#2771)
Adds a `.pins` datastore to `ipfs-repo` and uses that to store pins as cbor binary keyed by multihash.

### Format

As stored in the datastore, each pin has several fields:

```javascript
{
  codec: // optional Number - the codec from the CID this multihash was pinned with; if omitted, treated as 'dag-pb'
  version: // optional Number - the CID version this multihash was pinned with; if omitted, treated as v0
  depth: // Number - Infinity = recursive pin, 0 = direct, 1+ = pinned to that depth
  comments: // optional String - a user-friendly description of the pin
  metadata: // optional Object - user-defined data for the pin
}
```

Notes:

`.codec` and `.version` are stored so we can recreate the original CID when listing pins.
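
As a sketch of that round-trip (the helper names, the `cbor` module and the datastore key format here are illustrative assumptions, not necessarily what `ipfs-repo` does):

```javascript
const cbor = require('cbor')
const CID = require('cids')
const { Key } = require('interface-datastore')
const uint8ArrayToString = require('uint8arrays/to-string')

// Store a pin record in the .pins datastore, keyed by the CID's multihash
// (base32 key encoding is an assumption for this sketch)
async function writePin (pins, cid, depth) {
  const record = { depth }

  // only store codec/version when they differ from the dag-pb/v0 defaults
  if (cid.codec !== 'dag-pb') {
    record.codec = cid.codec
  }
  if (cid.version !== 0) {
    record.version = cid.version
  }

  const key = new Key('/' + uint8ArrayToString(cid.multihash, 'base32'))
  await pins.put(key, cbor.encode(record))
}

// Recreate the original CID from a stored pin record and its multihash
function pinToCid (multihash, pin) {
  return new CID(pin.version || 0, pin.codec || 'dag-pb', multihash)
}
```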

### Metadata

The intention is that we can add extra fields with technical meaning at the root of the object, while users store application-specific data in the `metadata` field.

### CLI

```console
$ ipfs pin add bafyfoo --metadata key1=value1,key2=value2
$ ipfs pin add bafyfoo --metadata-format=json --metadata '{"key1":"value1","key2":"value2"}'

$ ipfs pin list
bafyfoo

$ ipfs pin list -l
CID      Name    Type       Metadata
bafyfoo  My pin  Recursive  {"key1":"value1","key2":"value2"}

$ ipfs pin metadata Qmfoo --format=json
{"key1":"value1","key2":"value2"}
```

### HTTP API

* The `/api/v0/pin/add` route adds a new `metadata` argument, which accepts a JSON string
* The `/api/v0/pin/metadata` route returns metadata as JSON
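
For example, a hypothetical call against a local daemon might look like this (the URL-encoded JSON and the default port are assumptions for illustration, not taken from this commit):

```console
$ curl -X POST "http://127.0.0.1:5001/api/v0/pin/add?arg=bafyfoo&metadata=%7B%22key1%22%3A%22value1%22%7D"
```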

### Core API

* `ipfs.pin.addAll` accepts and returns an async iterator
* `ipfs.pin.rmAll` accepts and returns an async iterator

```javascript
// pass a CID or IPFS path with options
const { cid } = await ipfs.pin.add('/ipfs/Qmfoo', {
  recursive: false,
  metadata: {
    key: 'value'
  },
  timeout: 2000
})

// pass an iterable of CIDs
const [{ cid: cid1 }, { cid: cid2 }] = await all(ipfs.pin.addAll([
  new CID('Qmfoo'),
  new CID('Qmbar')
], { timeout: '2s' }))

// pass an iterable of objects with options
const [{ cid: cid1 }, { cid: cid2 }] = await all(ipfs.pin.addAll([
  { cid: new CID('Qmfoo'), recursive: true, comments: 'A recursive pin' },
  { cid: new CID('Qmbar'), recursive: false, comments: 'A direct pin' }
], { timeout: '2s' }))
```

* `ipfs.pin.rmAll` accepts and returns an async generator (other input types are available):

```javascript
// pass an IPFS path or CID
const { cid } = await ipfs.pin.rm('/ipfs/Qmfoo/file.txt')

// pass options
const { cid } = await ipfs.pin.rm(new CID('Qmfoo'), { recursive: true })

// pass an iterable of CIDs or objects with options
const [{ cid }] = await all(ipfs.pin.rmAll([{ cid: new CID('Qmfoo'), recursive: true }]))
```

Bonus: this lets us pipe the output of one command into another:

```javascript
await pipe(
  ipfs.pin.ls({ type: 'recursive' }),
  (source) => ipfs.pin.rmAll(source)
)

// or
await all(ipfs.pin.rmAll(ipfs.pin.ls({ type: 'recursive' })))
```

BREAKING CHANGES:

* pins are now stored in a datastore, a repo migration will occur on startup
* All deps of this module now use Uint8Arrays in place of node Buffers
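
Note: the `uint8ArrayConcat` helper that appears in the migrated docs below is assumed to come from the new `uint8arrays` utility module, e.g.:

```javascript
// Assumed per-function entry points of the uint8arrays module,
// used in place of the node Buffer API in the examples below
const uint8ArrayConcat = require('uint8arrays/concat')
const uint8ArrayToString = require('uint8arrays/to-string')
```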
achingbrain committed Aug 25, 2020
1 parent 84cfa55 commit 64b7fe4

Showing 318 changed files with 2,811 additions and 2,810 deletions.
### docs/BROWSERS.md

2 changes: 1 addition & 1 deletion

````diff
@@ -66,7 +66,7 @@ document.addEventListener('DOMContentLoaded', async () => {
   const cid = results[0].hash
   console.log('CID created via ipfs.add:', cid)
   const data = await node.cat(cid)
-  console.log('Data read back via ipfs.cat:', data.toString())
+  console.log('Data read back via ipfs.cat:', new TextDecoder().decode(data))
 })
 </script>
 ```
````
### docs/MIGRATION-TO-ASYNC-AWAIT.md

89 changes: 54 additions & 35 deletions

````diff
@@ -171,9 +171,10 @@ e.g.
 
 ```js
 const readable = ipfs.catReadableStream('QmHash')
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
````
````diff
@@ -185,9 +186,10 @@ Becomes:
 
 ```js
 const source = ipfs.cat('QmHash')
+const decoder = new TextDecoder()
 
 for await (const chunk of source) {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 }
 
 console.log('done')
````
````diff
@@ -201,9 +203,10 @@ e.g.
 
 ```js
 const readable = ipfs.catReadableStream('QmHash')
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
````
````diff
@@ -216,9 +219,10 @@ Becomes:
 ```js
 const toStream = require('it-to-stream')
 const readable = toStream.readable(ipfs.cat('QmHash'))
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
````
````diff
@@ -238,11 +242,12 @@ e.g.
 
 ```js
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
````
````diff
@@ -251,7 +256,7 @@ pipeline(
   ipfs.catReadableStream('QmHash'),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
````
````diff
@@ -260,11 +265,12 @@ Becomes:
 
 ```js
 const pipe = require('it-pipe')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = async source => {
   for await (const chunk of source) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
   }
 }
 
````
````diff
@@ -273,15 +279,16 @@ const data = await pipe(
   concat
 )
 
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 ...which, by the way, could more succinctly be written as:
 
 ```js
 const toBuffer = require('it-to-buffer')
+const decoder = new TextDecoder()
 const data = await toBuffer(ipfs.cat('QmHash'))
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 **Impact 🍏**
````
````diff
@@ -292,11 +299,12 @@ e.g.
 
 ```js
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
````
````diff
@@ -305,7 +313,7 @@ pipeline(
   ipfs.catReadableStream('QmHash'),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
````
````diff
@@ -315,11 +323,12 @@ Becomes:
 ```js
 const toStream = require('it-to-stream')
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
@@ -328,7 +337,7 @@ pipeline(
   toStream.readable(ipfs.cat('QmHash')),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
````
````diff
@@ -472,10 +481,12 @@ Use a [for/await](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refere
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
````
````diff
@@ -486,8 +497,10 @@ pull(
 Becomes:
 
 ```js
+const decoder = new TextDecoder()
+
 for await (const chunk of ipfs.cat('QmHash')) {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 }
 
 console.log('done')
````
````diff
@@ -500,10 +513,12 @@ Convert the async iterable to a pull stream.
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
````
````diff
@@ -515,11 +530,12 @@ Becomes:
 
 ```js
 const toPull = require('async-iterator-to-pull-stream')
+const decoder = new TextDecoder()
 
 pull(
   toPull.source(ipfs.cat('QmHash')),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
````
````diff
@@ -538,10 +554,12 @@ Use `it-pipe` and `it-concat` concat data from an async iterable.
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.collect((err, chunks) => {
-    console.log(Buffer.concat(chunks).toString())
+    console.log(decoder.decode(uint8ArrayConcat(chunks)))
   })
 )
 ```
````
````diff
@@ -551,13 +569,14 @@ Becomes:
 ```js
 const pipe = require('it-pipe')
 const concat = require('it-concat')
+const decoder = new TextDecoder()
 
 const data = await pipe(
   ipfs.cat('QmHash'),
   concat
 )
 
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 #### Transform Pull Streams
````
````diff
@@ -640,8 +659,8 @@ e.g.
 
 ```js
 const results = await ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 // Note that ALL files have already been added to IPFS
````
````diff
@@ -654,8 +673,8 @@ Becomes:
 
 ```js
 const addSource = ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 for await (const file of addSource) {
````
````diff
@@ -669,8 +688,8 @@ Alternatively you can buffer up the results using the `it-all` utility:
 const all = require('it-all')
 
 const results = await all(ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ]))
 
 results.forEach(file => {
````
````diff
@@ -682,8 +701,8 @@ Often you just want the last item (the root directory entry) when adding multipl
 
 ```js
 const results = await ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 const lastResult = results[results.length - 1]
````
````diff
@@ -695,8 +714,8 @@ Becomes:
 
 ```js
 const addSource = ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 let lastResult
````
````diff
@@ -711,8 +730,8 @@ Alternatively you can use the `it-last` utility:
 
 ```js
 const lastResult = await last(ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ]))
 
 console.log(lastResult)
````
### docs/core-api/BLOCK.md

9 changes: 5 additions & 4 deletions

````diff
@@ -92,11 +92,12 @@ An optional object which may have the following keys:
 
 ```JavaScript
 // Defaults
-const buf = Buffer.from('a serialized object')
+const buf = new TextEncoder().encode('a serialized object')
+const decoder = new TextDecoder()
 
 const block = await ipfs.block.put(buf)
 
-console.log(block.data.toString())
+console.log(decoder.decode(block.data))
 // Logs:
 // a serialized object
 console.log(block.cid.toString())
````
````diff
@@ -105,12 +106,12 @@ console.log(block.cid.toString())
 
 // With custom format and hashtype through CID
 const CID = require('cids')
-const buf = Buffer.from('another serialized object')
+const buf = new TextEncoder().encode('another serialized object')
 const cid = new CID(1, 'dag-pb', multihash)
 
 const block = await ipfs.block.put(blob, cid)
 
-console.log(block.data.toString())
+console.log(decoder.decode(block.data))
 // Logs:
 // a serialized object
 console.log(block.cid.toString())
````