A fast, flexible CD+Graphics (CD+G) renderer.
- Designed for requestAnimationFrame
- Audio synchronization when used with currentTime
- Optional background keying (transparency)
- Reports background RGBA and content bounds for each frame
- Supports all modern browsers
- No dependencies
$ npm i cdgraphics
Instantiates a new parser/renderer.
import CDGraphics from 'cdgraphics'
const cdg = new CDGraphics()
Loads a CD+G file from an ArrayBuffer, which can be obtained via the Response of a fetch(). You must load() a CD+G file before calling render().
fetch(cdgFileUrl)
  .then(response => response.arrayBuffer())
  .then(buffer => cdg.load(buffer))
- time: Number (in seconds) of the frame to render. Should usually be the currentTime of an <audio> element.
- options: Object with one or more of the following:
  - forceKey: Boolean forcing the background to be transparent, even if the CD+G title did not explicitly specify it. Defaults to false.
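For example, a frame can be rendered at the audio element's current playback position, optionally forcing a transparent background (this sketch assumes audio refers to an <audio> element, and the forceKey value is illustrative):

const frame = cdg.render(audio.currentTime, { forceKey: true })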
Returns an object with the following properties:
- imageData: ImageData object containing the rendered frame's pixel data.
- isChanged: Boolean indicating whether the frame changed since the last render. Useful for skipping unnecessary re-paints to a canvas.
- backgroundRGBA: Array containing the frame's background color in the form [r, g, b, a], with alpha being 0 or 1. The reported alpha includes the effect of the forceKey option, if enabled.
- contentBounds: Array containing the coordinates of a bounding box that fits the frame's non-transparent pixels, in the form [x1, y1, x2, y2]. Typically only useful when the forceKey option is enabled.
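As a rough sketch of how these properties might be used together when forceKey is enabled (the page-styling and cropping logic below is illustrative, not part of the library; audio and ctx are assumed to be an <audio> element and a canvas 2D context):

const frame = cdg.render(audio.currentTime, { forceKey: true })

// match the page background to the title's background color when it's opaque
const [r, g, b, a] = frame.backgroundRGBA
document.body.style.backgroundColor = a === 1 ? `rgb(${r}, ${g}, ${b})` : 'transparent'

// draw only the region that contains non-transparent pixels
const [x1, y1, x2, y2] = frame.contentBounds
if (x2 > x1 && y2 > y1) {
  createImageBitmap(frame.imageData, x1, y1, x2 - x1, y2 - y1)
    .then(bitmap => ctx.drawImage(bitmap, x1, y1))
}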
The following excerpt demonstrates an audio-synced render loop that draws to a canvas. See the demo code for a more complete example.
const audio = document.getElementById('audio')
const canvas = document.getElementById('canvas')
const ctx = canvas.getContext('2d')
let frameId
const doRender = time => {
  const frame = cdg.render(time)
  if (!frame.isChanged) return

  createImageBitmap(frame.imageData)
    .then(bitmap => {
      ctx.clearRect(0, 0, canvas.clientWidth, canvas.clientHeight)
      ctx.drawImage(bitmap, 0, 0, canvas.clientWidth, canvas.clientHeight)
    })
}
// render loop
const pause = () => cancelAnimationFrame(frameId)
const play = () => {
  frameId = requestAnimationFrame(play)
  doRender(audio.currentTime)
}
// follow <audio> events
audio.addEventListener('play', play)
audio.addEventListener('pause', pause)
audio.addEventListener('ended', pause)
audio.addEventListener('seeked', () => doRender(audio.currentTime))
To run the demo and see how it all comes together:
- Clone the repo
- Place your audio and .cdg files in the demo folder
- Update lines 1 and 2 of demo/demo.js with those filenames
- Run:

$ npm i
$ npm run demo
- Originally based on the player by Luke Tucker, with improvements from Keith McKnight's fork
- Jim Bumgardner's CD+G Revealed document/specification