Hyperfeed is a self-archiving P2P live feed. You can convert any RSS/ATOM/RDF feed into a P2P live-update publishing network.
- Self-archiving: Each item and its linked page will be archived within hyperfeed.
- Decentralized: Feed contents can still be distributed among readers even if the original host is down.
- Live: No need to poll the original feed. Updates will be pushed to you.
```
npm install hyperfeed
```
Publish your RSS feed through hyperfeed:
```js
const request = require('request')
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')
const swarm = require('hyperdiscovery')

const url = 'https://medium.com/feed/google-developers'

const archive = hyperdrive('./feed')
const feed = hyperfeed(archive)

feed.ready(() => {
  swarm(archive) // start sharing the archive on the p2p network
  console.log(feed.key.toString('hex')) // give this key to your readers

  feed.update(request(url), (err) => {
    if (err) throw err
    console.log('feed imported')
  })
})
```
Now you can replicate the hyperfeed through a p2p network:
```js
const hyperfeed = require('hyperfeed')
const swarm = require('hyperdiscovery')
const hyperdrive = require('hyperdrive')

// pass the key printed by the publisher above
const archive = hyperdrive('./anotherFeed', '<KEY FROM ABOVE>')
const feed = hyperfeed(archive)

swarm(archive) // load the feed from the p2p network

feed.list((err, entries) => {
  if (err) throw err
  console.log(entries) // all entries in the feed (including history entries)
})
```
Create a new Hyperfeed instance. `opts` includes:

```js
{
  scrapLink: true // set to false to stop archiving the linked page for each feed item
}
```
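A minimal sketch, assuming only that `hyperfeed` accepts the `opts` object above (the archive path is arbitrary), which creates a feed that skips archiving linked pages:

```js
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

// archive only the feed items themselves, not the pages they link to
const archive = hyperdrive('./feed-items-only')
const feed = hyperfeed(archive, { scrapLink: false })
```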
The public key identifying the feed.
A key derived from the public key that can be used to discover other peers sharing this feed.
The metadata of the feed.
Wait until the feed is fully ready and all of its properties have been populated.
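A short sketch that waits for the feed to be ready and then reads these properties. `feed.key` appears in the usage example above; `feed.discoveryKey` and `feed.meta` are assumed property names based on the descriptions in this section:

```js
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

const feed = hyperfeed(hyperdrive('./feed'))

feed.ready(() => {
  console.log('public key:   ', feed.key.toString('hex'))
  // the two property names below are assumptions, not confirmed by this README
  console.log('discovery key:', feed.discoveryKey.toString('hex'))
  console.log('metadata:     ', feed.meta)
})
```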
Import an RSS feed into `feed`. Accepts a stream.
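Because the update method only needs a readable stream, you can import from a local file just as well as from HTTP. A sketch, where `./backup.xml` is a hypothetical path to a saved RSS/ATOM/RDF document:

```js
const fs = require('fs')
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

const feed = hyperfeed(hyperdrive('./feed'))

feed.ready(() => {
  // './backup.xml' is a placeholder; any readable stream of feed XML works
  feed.update(fs.createReadStream('./backup.xml'), (err) => {
    if (err) throw err
    console.log('feed imported from local file')
  })
})
```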
Set the feed's metadata.
List archived items in the feed.
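For example, a sketch that prints one line per archived entry. The `title` field is an assumption based on the item format of the feed module linked below:

```js
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

const feed = hyperfeed(hyperdrive('./feed'))

feed.ready(() => {
  feed.list((err, entries) => {
    if (err) throw err
    // entries include historical items as well
    entries.forEach((entry) => console.log(entry.title))
  })
})
```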
Save a new feed item. Check https://github.com/jpmonette/feed for the item format. If you already have scraped data for the given item, you can pass it as `scrappedData` to avoid redundant requests.
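A minimal sketch, assuming the method is called as `feed.save(item, callback)` (the exact signature is not shown in this README) and using item fields from https://github.com/jpmonette/feed:

```js
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

const feed = hyperfeed(hyperdrive('./feed'))

feed.ready(() => {
  // feed.save(item, cb) is an assumed call pattern, not confirmed by this README
  feed.save({
    title: 'Hello Hyperfeed',
    link: 'https://example.com/hello',
    date: new Date()
  }, (err) => {
    if (err) throw err
    console.log('item saved')
  })
})
```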
Export an RSS 2.0 feed containing the latest `count` items.
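A sketch, assuming the export is exposed as `feed.xml(count, callback)`; the method name and signature are assumptions, not confirmed by this README:

```js
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')

const feed = hyperfeed(hyperdrive('./feed'))

feed.ready(() => {
  // feed.xml(count, cb) is an assumed method name and signature
  feed.xml(20, (err, xml) => {
    if (err) throw err
    console.log(xml) // an RSS 2.0 document with the 20 most recent items
  })
})
```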
The MIT License