This repository was archived by the owner on Feb 12, 2024. It is now read-only.

Commit 64b7fe4

feat: store pins in datastore instead of a DAG (#2771)
Adds a `.pins` datastore to `ipfs-repo` and uses that to store pins as CBOR binary keyed by multihash.

### Format

As stored in the datastore, each pin has several fields:

```javascript
{
  codec: // optional Number: the codec from the CID this multihash was pinned with; if omitted, treated as 'dag-pb'
  version: // optional Number: the version from the CID this multihash was pinned with; if omitted, treated as v0
  depth: // Number: Infinity = recursive pin, 0 = direct, 1+ = pinned to that depth
  comments: // optional String: user-friendly description of the pin
  metadata: // optional Object: user-defined data for the pin
}
```

Notes: `.codec` and `.version` are stored so we can recreate the original CID when listing pins.

### Metadata

The intention is for us to be able to add extra fields that have technical meaning to the root of the object, while the user can store application-specific data in the `metadata` field.

### CLI

```console
$ ipfs pin add bafyfoo --metadata key1=value1,key2=value2
$ ipfs pin add bafyfoo --metadata-format=json --metadata '{"key1":"value1","key2":"value2"}'
$ ipfs pin list
bafyfoo
$ ipfs pin list -l
CID      Name    Type       Metadata
bafyfoo  My pin  Recursive  {"key1":"value1","key2":"value2"}
$ ipfs pin metadata Qmfoo --format=json
{"key1":"value1","key2":"value2"}
```

### HTTP API

* `/api/v0/pin/add` route adds a new `metadata` argument, which accepts a JSON string
* `/api/v0/pin/metadata` returns metadata as JSON

### Core API

* `ipfs.pin.addAll` accepts and returns an async iterator

```javascript
// pass a CID or IPFS path with options
const { cid } = await ipfs.pin.add(new CID('/ipfs/Qmfoo'), {
  recursive: false,
  metadata: { key: 'value' },
  timeout: 2000
})

// pass an iterable of CIDs
const [{ cid: cid1 }, { cid: cid2 }] = await all(ipfs.pin.addAll([
  new CID('/ipfs/Qmfoo'),
  new CID('/ipfs/Qmbar')
], { timeout: '2s' }))

// pass an iterable of objects with options
const [{ cid: cid1 }, { cid: cid2 }] = await all(ipfs.pin.addAll([
  { cid: new CID('/ipfs/Qmfoo'), recursive: true, comments: 'A recursive pin' },
  { cid: new CID('/ipfs/Qmbar'), recursive: false, comments: 'A direct pin' }
], { timeout: '2s' }))
```

* `ipfs.pin.rmAll` accepts and returns an async iterator (other input types are available)

```javascript
// pass an IPFS path or CID
const { cid } = await ipfs.pin.rm(new CID('/ipfs/Qmfoo/file.txt'))

// pass options
const { cid } = await ipfs.pin.rm(new CID('/ipfs/Qmfoo'), { recursive: true })

// pass an iterable of CIDs or objects with options
const [{ cid }] = await all(ipfs.pin.rmAll([{ cid: new CID('/ipfs/Qmfoo'), recursive: true }]))
```

Bonus: lets us pipe the output of one command into another:

```javascript
await pipe(
  ipfs.pin.ls({ type: 'recursive' }),
  (source) => ipfs.pin.rmAll(source)
)

// or
await all(ipfs.pin.rmAll(ipfs.pin.ls({ type: 'recursive' })))
```

BREAKING CHANGES:

* pins are now stored in a datastore; a repo migration will occur on startup
* all deps of this module now use Uint8Arrays in place of Node Buffers
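The `depth` field described above is what distinguishes recursive, direct, and depth-limited pins. As an illustrative sketch (the `pinType` helper is hypothetical, not part of the js-ipfs API), the mapping could be expressed as:

```javascript
// Hypothetical helper showing the depth semantics of a stored pin record:
// Infinity means a recursive pin, 0 a direct pin, and any other number
// means the DAG was pinned to that depth.
function pinType (pin) {
  if (pin.depth === Infinity) return 'recursive'
  if (pin.depth === 0) return 'direct'
  return `depth-${pin.depth}`
}

console.log(pinType({ depth: Infinity })) // 'recursive'
console.log(pinType({ depth: 0 }))        // 'direct'
console.log(pinType({ depth: 2 }))        // 'depth-2'
```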
1 parent 84cfa55 commit 64b7fe4

File tree

318 files changed: +2811 −2810 lines changed


Diff for: docs/BROWSERS.md (+1 −1)

````diff
@@ -66,7 +66,7 @@ document.addEventListener('DOMContentLoaded', async () => {
   const cid = results[0].hash
   console.log('CID created via ipfs.add:', cid)
   const data = await node.cat(cid)
-  console.log('Data read back via ipfs.cat:', data.toString())
+  console.log('Data read back via ipfs.cat:', new TextDecoder().decode(data))
 })
 </script>
 ```
````
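The change above swaps Node's `Buffer#toString()` for the WHATWG `TextDecoder`, which works in browsers without a Buffer polyfill. A minimal sketch of the round trip:

```javascript
// TextEncoder/TextDecoder are globals in modern browsers and recent Node.js.
const bytes = new TextEncoder().encode('hello ipfs') // string -> Uint8Array
const text = new TextDecoder().decode(bytes)         // Uint8Array -> string
console.log(text) // 'hello ipfs'
```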

Diff for: docs/MIGRATION-TO-ASYNC-AWAIT.md (+54 −35)

````diff
@@ -171,9 +171,10 @@ e.g.
 
 ```js
 const readable = ipfs.catReadableStream('QmHash')
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
@@ -185,9 +186,10 @@ Becomes:
 
 ```js
 const source = ipfs.cat('QmHash')
+const decoder = new TextDecoder()
 
 for await (const chunk of source) {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 }
 
 console.log('done')
@@ -201,9 +203,10 @@ e.g.
 
 ```js
 const readable = ipfs.catReadableStream('QmHash')
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
@@ -216,9 +219,10 @@ Becomes:
 ```js
 const toStream = require('it-to-stream')
 const readable = toStream.readable(ipfs.cat('QmHash'))
+const decoder = new TextDecoder()
 
 readable.on('data', chunk => {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 })
 
 readable.on('end', () => {
@@ -238,11 +242,12 @@ e.g.
 
 ```js
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
@@ -251,7 +256,7 @@ pipeline(
   ipfs.catReadableStream('QmHash'),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
@@ -260,11 +265,12 @@ Becomes:
 
 ```js
 const pipe = require('it-pipe')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = async source => {
   for await (const chunk of source) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
   }
 }
 
@@ -273,15 +279,16 @@ const data = await pipe(
   concat
 )
 
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 ...which, by the way, could more succinctly be written as:
 
 ```js
 const toBuffer = require('it-to-buffer')
+const decoder = new TextDecoder()
 const data = await toBuffer(ipfs.cat('QmHash'))
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 **Impact 🍏**
@@ -292,11 +299,12 @@ e.g.
 
 ```js
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
@@ -305,7 +313,7 @@ pipeline(
   ipfs.catReadableStream('QmHash'),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
@@ -315,11 +323,12 @@ Becomes:
 ```js
 const toStream = require('it-to-stream')
 const { pipeline, Writable } = require('stream')
+const decoder = new TextDecoder()
 
-let data = Buffer.alloc(0)
+let data = new Uint8Array(0)
 const concat = new Writable({
   write (chunk, enc, cb) {
-    data = Buffer.concat([data, chunk])
+    data = uint8ArrayConcat([data, chunk])
     cb()
   }
 })
@@ -328,7 +337,7 @@ pipeline(
   toStream.readable(ipfs.cat('QmHash')),
   concat,
   err => {
-    console.log(data.toString())
+    console.log(decoder.decode(data))
   }
 )
 ```
@@ -472,10 +481,12 @@ Use a [for/await](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refere
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
@@ -486,8 +497,10 @@ pull(
 Becomes:
 
 ```js
+const decoder = new TextDecoder()
+
 for await (const chunk of ipfs.cat('QmHash')) {
-  console.log(chunk.toString())
+  console.log(decoder.decode(chunk))
 }
 
 console.log('done')
@@ -500,10 +513,12 @@ Convert the async iterable to a pull stream.
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
@@ -515,11 +530,12 @@ Becomes:
 
 ```js
 const toPull = require('async-iterator-to-pull-stream')
+const decoder = new TextDecoder()
 
 pull(
   toPull.source(ipfs.cat('QmHash')),
   pull.through(chunk => {
-    console.log(chunk.toString())
+    console.log(decoder.decode(chunk))
   }),
   pull.onEnd(err => {
     console.log('done')
@@ -538,10 +554,12 @@ Use `it-pipe` and `it-concat` concat data from an async iterable.
 e.g.
 
 ```js
+const decoder = new TextDecoder()
+
 pull(
   ipfs.catPullStream('QmHash'),
   pull.collect((err, chunks) => {
-    console.log(Buffer.concat(chunks).toString())
+    console.log(decoder.decode(uint8ArrayConcat(chunks)))
   })
 )
 ```
@@ -551,13 +569,14 @@ Becomes:
 ```js
 const pipe = require('it-pipe')
 const concat = require('it-concat')
+const decoder = new TextDecoder()
 
 const data = await pipe(
   ipfs.cat('QmHash'),
   concat
 )
 
-console.log(data.toString())
+console.log(decoder.decode(data))
 ```
 
 #### Transform Pull Streams
@@ -640,8 +659,8 @@ e.g.
 
 ```js
 const results = await ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 // Note that ALL files have already been added to IPFS
@@ -654,8 +673,8 @@ Becomes:
 
 ```js
 const addSource = ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 for await (const file of addSource) {
@@ -669,8 +688,8 @@ Alternatively you can buffer up the results using the `it-all` utility:
 const all = require('it-all')
 
 const results = await all(ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ]))
 
 results.forEach(file => {
@@ -682,8 +701,8 @@ Often you just want the last item (the root directory entry) when adding multipl
 
 ```js
 const results = await ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 const lastResult = results[results.length - 1]
@@ -695,8 +714,8 @@ Becomes:
 
 ```js
 const addSource = ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ])
 
 let lastResult
@@ -711,8 +730,8 @@ Alternatively you can use the `it-last` utility:
 
 ```js
 const lastResult = await last(ipfs.addAll([
-  { path: 'root/1.txt', content: Buffer.from('one') },
-  { path: 'root/2.txt', content: Buffer.from('two') }
+  { path: 'root/1.txt', content: 'one' },
+  { path: 'root/2.txt', content: 'two' }
 ]))
 
 console.log(lastResult)
````
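The migrated examples above call `uint8ArrayConcat` without showing where it comes from; in js-ipfs it is provided by the `uint8arrays` package. A minimal stand-in with the same shape, assuming every chunk is already a `Uint8Array`:

```javascript
// Minimal stand-in for the concat helper used in the migrated examples:
// allocate one buffer of the total length, then copy each chunk in.
function uint8ArrayConcat (chunks) {
  const total = chunks.reduce((len, chunk) => len + chunk.length, 0)
  const out = new Uint8Array(total)
  let offset = 0
  for (const chunk of chunks) {
    out.set(chunk, offset)
    offset += chunk.length
  }
  return out
}

const enc = new TextEncoder()
const joined = uint8ArrayConcat([enc.encode('foo'), enc.encode('bar')])
console.log(new TextDecoder().decode(joined)) // 'foobar'
```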

Diff for: docs/core-api/BLOCK.md (+5 −4)

````diff
@@ -92,11 +92,12 @@ An optional object which may have the following keys:
 
 ```JavaScript
 // Defaults
-const buf = Buffer.from('a serialized object')
+const buf = new TextEncoder().encode('a serialized object')
+const decoder = new TextDecoder()
 
 const block = await ipfs.block.put(buf)
 
-console.log(block.data.toString())
+console.log(decoder.decode(block.data))
 // Logs:
 // a serialized object
 console.log(block.cid.toString())
@@ -105,12 +106,12 @@ console.log(block.cid.toString())
 
 // With custom format and hashtype through CID
 const CID = require('cids')
-const buf = Buffer.from('another serialized object')
+const buf = new TextEncoder().encode('another serialized object')
 const cid = new CID(1, 'dag-pb', multihash)
 
 const block = await ipfs.block.put(blob, cid)
 
-console.log(block.data.toString())
+console.log(decoder.decode(block.data))
 // Logs:
 // a serialized object
 console.log(block.cid.toString())
````
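One caveat with the chunk-by-chunk decoding shown throughout these diffs: decoding each chunk independently can mangle a multi-byte UTF-8 character that straddles a chunk boundary. `TextDecoder` handles this with its `stream` option, which buffers partial sequences between calls. A sketch (the split-up '€' is a contrived example):

```javascript
// '€' is three UTF-8 bytes (0xE2 0x82 0xAC); split it across two chunks.
const chunks = [new Uint8Array([0xe2, 0x82]), new Uint8Array([0xac])]

const decoder = new TextDecoder()
let out = ''
for (const chunk of chunks) {
  // stream: true tells the decoder a partial sequence may continue later
  out += decoder.decode(chunk, { stream: true })
}
out += decoder.decode() // flush any remaining buffered bytes

console.log(out) // '€'
```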

0 commit comments
