This repository has been archived by the owner on Jan 18, 2024. It is now read-only.


⚠️ Deprecated - use npm.im/ipfs-unixfs-exporter.

This fork provided parallel fetching of blocks, but this feature has been merged upstream in the following PRs:


fast-unixfs-exporter

Exports UnixFS and other DAGs from IPFS...but quickly.

This is a fork of ipfs-unixfs-exporter that reads blocks from the blockstore concurrently.

Install

> npm install @web3-storage/fast-unixfs-exporter

Usage

Example

// import a file and export it again
import { importer } from 'ipfs-unixfs-importer'
import { exporter } from '@web3-storage/fast-unixfs-exporter'
import { MemoryBlockstore } from 'blockstore-core/memory'

// Should contain the blocks we are trying to export
const blockstore = new MemoryBlockstore()
const files = []

for await (const file of importer([{
  path: '/foo/bar.txt',
  content: new Uint8Array([0, 1, 2, 3])
}], blockstore)) {
  files.push(file)
}

console.info(files[0].cid) // Qmbaz

const entry = await exporter(files[0].cid, blockstore)

console.info(entry.cid) // Qmqux
console.info(entry.path) // Qmbaz/foo/bar.txt
console.info(entry.name) // bar.txt
console.info(entry.unixfs.fileSize()) // 4

// stream content from unixfs node
const size = entry.unixfs.fileSize()
const bytes = new Uint8Array(size)
let offset = 0

for await (const buf of entry.content()) {
  bytes.set(buf, offset)
  offset += buf.length
}

console.info(bytes) // 0, 1, 2, 3

API

import { exporter } from '@web3-storage/fast-unixfs-exporter'

exporter(cid, blockstore, options)

Uses the given blockstore instance to fetch an IPFS node by its CID.

Returns a Promise which resolves to a UnixFSEntry.

options is an optional object argument that might include the following keys:

  • signal (AbortSignal): Used to cancel any network requests that are initiated as a result of this export
  • blockReadConcurrency: Number of blocks to concurrently read from the blockstore. Default: Infinity.
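
For example, a minimal sketch (reusing blockstore and files from the example above, and assuming you create the AbortController yourself) that limits concurrent block reads:

import { exporter } from '@web3-storage/fast-unixfs-exporter'

const controller = new AbortController()

// read at most 10 blocks from the blockstore at a time,
// and allow the export to be cancelled via controller.abort()
const entry = await exporter(files[0].cid, blockstore, {
  signal: controller.signal,
  blockReadConcurrency: 10
})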

UnixFSEntry

{
  type: 'file', // or 'directory'
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  content: function, // returns an async iterator
  unixfs: UnixFS // see https://github.com/web3-storage/fast-unixfs-exporter
}

If the entry is a file, entry.content() returns an async iterator that yields one or more Uint8Arrays containing the file content:

if (entry.type === 'file') {
  for await (const chunk of entry.content()) {
    // chunk is a Uint8Array
  }
}

If the entry is a directory, entry.content() returns further entry objects:

if (entry.type === 'directory') {
  for await (const child of entry.content()) {
    console.info(child.name)
  }
}

Raw entries

Entries with a raw codec CID return raw entries:

{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Buffer, // see https://nodejs.org/api/buffer.html
  content: function, // returns an async iterator
}

entry.content() returns an async iterator that yields a buffer containing the node content:

for await (const chunk of entry.content()) {
  // chunk is a Buffer
}

Unless you pass an options object containing offset and length keys as an argument to entry.content(), chunk will be equal to entry.node.
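
For instance, a minimal sketch (assuming entry is a raw entry) that collects the chunks yielded when no options are passed:

const chunks = []

for await (const chunk of entry.content()) {
  chunks.push(chunk)
}

// with no offset/length given, expect a single chunk equal to entry.node
console.info(chunks.length) // 1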

CBOR entries

Entries with a dag-cbor codec CID return JavaScript object entries:

{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Uint8Array,
  content: function // returns an async iterator that yields a single object - see https://github.com/ipld/js-ipld-dag-cbor
}

Note that for a CBOR node, entry.content() yields the decoded object rather than raw bytes; the encoded block is available as entry.node.
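
A minimal sketch that reads the decoded value from a dag-cbor entry:

for await (const obj of entry.content()) {
  // obj is the single decoded dag-cbor value
  console.info(obj)
}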

entry.content({ offset, length })

When entry is a file or a raw node, offset and/or length arguments can be passed to entry.content() to return slices of data:

const length = 5
const data = new Uint8Array(length)
let offset = 0

for await (const chunk of entry.content({
  offset: 0,
  length
})) {
  data.set(chunk, offset)
  offset += chunk.length
}

// `data` contains the first 5 bytes of the file
return data

If entry is a directory, passing offset and/or length to entry.content() will limit the number of files returned from the directory.

const entries = []

for await (const entry of dir.content({
  offset: 0,
  length: 5
})) {
  entries.push(entry)
}

// `entries` contains the first 5 files/directories in the directory

walkPath(cid, blockstore)

walkPath will return an async iterator that yields entries for all segments in a path:

import { walkPath } from '@web3-storage/fast-unixfs-exporter'

const entries = []

for await (const entry of walkPath('Qmfoo/foo/bar/baz.txt', blockstore)) {
  entries.push(entry)
}

// entries contains 4x `entry` objects

recursive(cid, blockstore)

recursive will return an async iterator that yields all entries beneath a given CID or IPFS path, as well as the containing directory.

import { recursive } from '@web3-storage/fast-unixfs-exporter'

const entries = []

for await (const child of recursive('Qmfoo/foo/bar', blockstore)) {
  entries.push(child)
}

// entries contains all children of the `Qmfoo/foo/bar` directory and its children

Contribute

Feel free to join in. All welcome. Open an issue!

This repository falls under the IPFS Code of Conduct.

License

MIT
