Releases: node-webrtc/node-webrtc
v0.3.6
New Features
Programmatic Audio
This release of node-webrtc adds non-standard, programmatic audio APIs in the form of RTCAudioSource and RTCAudioSink. These APIs are similar to the previously added RTCVideoSource and RTCVideoSink APIs. With these APIs, you can
- Pass audio samples to RTCAudioSource via the onData method, then use the RTCAudioSource's createTrack method to create a local audio MediaStreamTrack.
- Construct an RTCAudioSink from a local or remote audio MediaStreamTrack. The RTCAudioSink will emit a "data" event every time audio samples are received. When you're finished, stop the RTCAudioSink by calling stop.
Because these APIs are non-standard, they are exposed via a nonstandard property on node-webrtc's exports object. For example,
const { RTCAudioSource, RTCAudioSink } = require('wrtc').nonstandard;
const source = new RTCAudioSource();
const track = source.createTrack();
const sink = new RTCAudioSink(track);
const sampleRate = 8000;
const samples = new Int16Array(sampleRate / 100); // 10 ms of 16-bit mono audio
const data = {
  samples,
  sampleRate
};
const interval = setInterval(() => {
  // Update data in some way before sending.
  source.onData(data);
}, 10); // push the next 10 ms of audio every 10 ms
sink.ondata = data => {
  // Do something with the received audio samples.
};
setTimeout(() => {
  clearInterval(interval);
  track.stop();
  sink.stop();
}, 10000);
RTCAudioSource
[constructor]
interface RTCAudioSource {
  MediaStreamTrack createTrack();
  void onData(RTCAudioData data);
};
dictionary RTCAudioData {
  required Int16Array samples;
  required unsigned short sampleRate;
  octet bitsPerSample = 16;
  octet channelCount = 1;
  unsigned short numberOfFrames;
};
- Calling createTrack will return a local audio MediaStreamTrack whose source is the RTCAudioSource.
- Calling onData with RTCAudioData pushes new audio samples to every non-stopped local audio MediaStreamTrack created with createTrack.
- RTCAudioData should represent 10 ms worth of 16-bit audio samples.
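For illustration only, here is a minimal sketch of pushing 10 ms of stereo audio. It assumes samples are interleaved and that numberOfFrames equals sampleRate / 100 for a 10 ms chunk; neither assumption is spelled out above.
const { RTCAudioSource } = require('wrtc').nonstandard;
const source = new RTCAudioSource();
const track = source.createTrack();
// Assumption: samples are interleaved (L, R, L, R, ...) and
// numberOfFrames = sampleRate / 100 for a 10 ms chunk.
const sampleRate = 48000;
const channelCount = 2;
const numberOfFrames = sampleRate / 100; // 480 frames per 10 ms
const samples = new Int16Array(numberOfFrames * channelCount);
source.onData({
  samples,
  sampleRate,
  bitsPerSample: 16,
  channelCount,
  numberOfFrames
});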
RTCAudioSink
[constructor(MediaStreamTrack track)]
interface RTCAudioSink {
  void stop();
  readonly attribute boolean stopped;
  attribute EventHandler ondata;
};
- RTCAudioSink's constructor accepts a local or remote audio MediaStreamTrack.
- As long as neither the RTCAudioSink nor the RTCAudioSink's MediaStreamTrack is stopped, the RTCAudioSink will raise a "data" event any time RTCAudioData is received.
- The "data" event has all the properties of RTCAudioData.
- RTCAudioSink must be stopped by calling stop.
RTCVideoFrame rotation
The RTCVideoFrame raised in RTCVideoSink's "frame" event now includes a rotation property, which indicates the rotation of the RTCVideoFrame. Possible values are 0, 90, 180, and 270.
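For example, a frame handler might read the new property like this. This is only a sketch: it assumes a video MediaStreamTrack named track already exists and that the values are degrees of rotation to apply before rendering.
const { RTCVideoSink } = require('wrtc').nonstandard;
const sink = new RTCVideoSink(track); // `track` is assumed to be a local or remote video MediaStreamTrack
sink.onframe = ({ frame }) => {
  // frame.rotation is 0, 90, 180, or 270.
  if (frame.rotation !== 0) {
    // Rotate the frame before rendering, e.g. when drawing it to a canvas.
  }
};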
EventListener handleEvent
EventListener instances now support handleEvent.
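For example, you can now pass an object with a handleEvent method, rather than a function, to addEventListener:
const { RTCPeerConnection } = require('wrtc');
const pc = new RTCPeerConnection();
// An EventListener object: its handleEvent method is called for each event.
const listener = {
  handleEvent(event) {
    console.log(event.type, pc.iceConnectionState);
  }
};
pc.addEventListener('iceconnectionstatechange', listener);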
v0.3.5
New Features
Programmatic Video
This release of node-webrtc adds non-standard, programmatic video APIs in the form of RTCVideoSource and RTCVideoSink. With these APIs, you can
- Pass I420 frames to RTCVideoSource via the onFrame method, then use RTCVideoSource's createTrack method to create a local video MediaStreamTrack.
- Construct an RTCVideoSink from a local or remote video MediaStreamTrack. The RTCVideoSink will emit a "frame" event every time an I420 frame is received. When you're finished, stop the RTCVideoSink by calling stop.
Because these APIs are non-standard, they are exposed via a nonstandard property on node-webrtc's exports object. For example,
const { RTCVideoSource, RTCVideoSink } = require('wrtc').nonstandard;
const source = new RTCVideoSource();
const track = source.createTrack();
const sink = new RTCVideoSink(track);
const width = 320;
const height = 240;
const data = new Uint8ClampedArray(width * height * 1.5);
const frame = { width, height, data };
const interval = setInterval(() => {
  // Update the frame in some way before sending.
  source.onFrame(frame);
}, 1000 / 30); // push a new frame roughly 30 times per second
sink.onframe = ({ frame }) => {
  // Do something with the received frame.
};
setTimeout(() => {
  clearInterval(interval);
  track.stop();
  sink.stop();
}, 10000);
This release also adds bindings to some libyuv functions for handling I420 frames. These can be useful when converting to and from RGBA.
RTCVideoSource
[constructor(optional RTCVideoSourceInit init)]
interface RTCVideoSource {
  readonly attribute boolean isScreencast;
  readonly attribute boolean? needsDenoising;
  MediaStreamTrack createTrack();
  void onFrame(RTCVideoFrame frame);
};
dictionary RTCVideoSourceInit {
  boolean isScreencast = false;
  boolean needsDenoising;
};
dictionary RTCVideoFrame {
  required unsigned long width;
  required unsigned long height;
  required Uint8ClampedArray data;
};
- Calling createTrack will return a local video MediaStreamTrack whose source is the RTCVideoSource.
- Calling onFrame with an RTCVideoFrame pushes a new video frame to every non-stopped local video MediaStreamTrack created with createTrack.
- An RTCVideoFrame represents an I420 frame.
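For reference, an I420 frame is planar YUV 4:2:0: a full-resolution Y plane followed by quarter-resolution U and V planes, which is why data holds width * height * 1.5 bytes. Below is a rough sketch of filling one; the mid-gray values are only illustrative.
const width = 320;
const height = 240;
const data = new Uint8ClampedArray(width * height * 1.5);
// Assumed plane layout: Y, then U, then V.
const yPlane = data.subarray(0, width * height);
const uPlane = data.subarray(width * height, width * height * 1.25);
const vPlane = data.subarray(width * height * 1.25);
// Y = 128 with U = V = 128 yields a mid-gray frame.
yPlane.fill(128);
uPlane.fill(128);
vPlane.fill(128);
const frame = { width, height, data };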
RTCVideoSink
[constructor(MediaStreamTrack track)]
interface RTCVideoSink {
  void stop();
  readonly attribute boolean stopped;
  attribute EventHandler onframe;
};
- RTCVideoSink's constructor accepts a local or remote video MediaStreamTrack.
- As long as neither the RTCVideoSink nor the RTCVideoSink's MediaStreamTrack is stopped, the RTCVideoSink will raise a "frame" event any time an RTCVideoFrame is received.
- The "frame" event has a property,
frame
, of type RTCVideoFrame. - RTCVideoSink must be stopped by calling
stop
.
i420ToRgba and rgbaToI420
These two functions are bindings to libyuv that provide conversions between I420 and RGBA frames. WebRTC expects I420, whereas APIs like the Canvas API expect RGBA, so these functions are useful for converting between the two. For example,
const { i420ToRgba, rgbaToI420 } = require('wrtc').nonstandard;
const width = 640;
const height = 480;
const i420Data = new Uint8ClampedArray(width * height * 1.5);
const rgbaData = new Uint8ClampedArray(width * height * 4);
const i420Frame = { width, height, data: i420Data };
const rgbaFrame = { width, height, data: rgbaData };
i420ToRgba(i420Frame, rgbaFrame);
rgbaToI420(rgbaFrame, i420Frame);
MediaStreamTrack
- Added support for setting MediaStreamTrack's enabled property (#475).
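For example, a minimal sketch using the programmatic RTCVideoSource described above to toggle a track:
const { RTCVideoSource } = require('wrtc').nonstandard;
const source = new RTCVideoSource();
const track = source.createTrack();
track.enabled = false; // temporarily disable (mute) the track
track.enabled = true;  // re-enable it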
v0.3.4
New Features
- Updated to WebRTC M71.
- Relay remote audio MediaStreamTracks on Windows (0.1.5 initially introduced this feature for Linux and macOS; now, Windows supports it, too).
- Added support for pkg (#404).
Bug Fixes
- Calling certain methods, like addTrack, removeTrack, etc., with objects that were not instances of MediaStreamTrack, RTCRtpSender, etc., could lead to segfaults. This was because we did not properly validate objects before attempting to unwrap them. (#448)
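As a rough illustration of the fix (the exact error type and message are not specified here), a call like the following should now fail with an error instead of crashing the process:
const { MediaStream, RTCPeerConnection } = require('wrtc');
const pc = new RTCPeerConnection();
try {
  // Hypothetical illustration: not a real MediaStreamTrack, so the argument
  // should now be rejected up front rather than unwrapped blindly.
  pc.addTrack({ kind: 'audio' }, new MediaStream());
} catch (error) {
  console.log('Invalid argument rejected:', error.message);
}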
v0.3.3
New Features
- Experimental support for armv7l and arm64. Binaries built for these architectures have been tested with QEMU but not on real devices. Please test them out. If you install node-webrtc directly on an ARM device, node-pre-gyp should pull the correct binaries automatically. Otherwise, you may need to set the TARGET_ARCH environment variable to "arm" (armv7l) or "arm64". For example, TARGET_ARCH=arm64 npm install
- Set DEBUG=true to install debug binaries (Linux- and macOS-only). For example, DEBUG=true npm install
v0.3.2
New Features
- Support for Node 11 on Windows.
v0.3.1
This release adds a number of new features and brings us closer to spec compliance, thanks to the tests at web-platform-tests/wpt.
New Features
getUserMedia
This release adds limited getUserMedia support. You can create audio and video MediaStreamTracks; however, the resulting MediaStreamTracks do not capture any media. You can add these MediaStreamTracks to an RTCPeerConnection; however, no media will be transmitted. You can confirm by checking bytesSent and bytesReceived in getStats.
const { getUserMedia } = require('wrtc');
getUserMedia({
  audio: true,
  video: true
}).then(stream => {
  stream.getTracks().forEach(track => track.stop());
});
Although we will parse and validate some members of the MediaStreamConstraints and related dictionaries, we do not use their values at this time.
getStats
This release adds limited standards-compliant getStats support. Previous node-webrtc releases exposed the legacy, callback-based getStats API. This release preserves that API but adds the Promise-based API. Neither the MediaStreamTrack selector argument nor the RTCRtpSender- and RTCRtpReceiver-level getStats APIs are implemented at this time.
// Legacy API
pc.getStats(
  response => { /* ... */ },
  console.error
);
// Standards-compliant API
pc.getStats().then(
  report => { /* ... */ },
  console.error
);
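The Promise-based API resolves with an RTCStatsReport, which is map-like. A sketch of iterating it follows; the 'outbound-rtp' type shown is just one of the stats types WebRTC reports, and the exact fields may vary.
pc.getStats().then(report => {
  report.forEach(stats => {
    if (stats.type === 'outbound-rtp') {
      console.log(stats.id, stats.bytesSent);
    }
  });
});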
Unified Plan and sdpSemantics
This release adds support for RTCRtpTransceivers and Unified Plan SDP via
- A non-standard RTCConfiguration option, sdpSemantics, and
- An environment variable, SDP_SEMANTICS.
Construct an RTCPeerConnection with sdpSemantics set to "unified-plan" or launch your application with SDP_SEMANTICS=unified-plan to enable RTCRtpTransceiver support; otherwise, "plan-b" is the default.
const { RTCPeerConnection } = require('wrtc');
const pc = new RTCPeerConnection({
  sdpSemantics: 'unified-plan' // default is "plan-b"
});
SDP_SEMANTICS=unified-plan node app.js
RTCRtpTransceiver
You can use RTCRtpTransceivers and related APIs when Unified Plan is enabled. This includes the following RTCPeerConnection methods:
- addTransceiver
- getTransceivers
and the following RTCTrackEvent properties:
- transceiver
The following RTCRtpTransceiver methods are supported:
- stop
as well as the following RTCRtpTransceiver properties:
- mid
- sender
- receiver
- stopped
- direction
- currentDirection
setCodecPreferences is not yet implemented. When calling addTransceiver, only the following RTCRtpTransceiverInit dictionary members are supported:
- direction
- streams
const assert = require('assert');
const { MediaStream, RTCPeerConnection, RTCRtpTransceiver } = require('wrtc');
const pc = new RTCPeerConnection({
  sdpSemantics: 'unified-plan'
});
const t1 = pc.addTransceiver('audio', {
  direction: 'recvonly'
});
const t2 = pc.addTransceiver(t1.receiver.track, {
  direction: 'sendonly',
  streams: [new MediaStream()]
});
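Continuing the example above, a rough sketch of the supported RTCRtpTransceiver members in use; the logged values assume no negotiation has taken place yet:
console.log(t1.direction);        // 'recvonly'
console.log(t1.mid);              // null until negotiation completes
console.log(t1.currentDirection); // null until negotiation completes
console.log(t2.sender.track.id === t1.receiver.track.id); // true
t1.stop();
console.log(t1.stopped); // true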
MediaStreamTrack
Added limited support for the muted property (it always returns false).
Miscellaneous
- APIs that should throw DOMExceptions, such as addTrack, will use domexception to construct those DOMExceptions, if installed.
Bug Fixes
- Calling addTrack twice with the same MediaStreamTrack should throw an InvalidAccessError (#442); see the sketch after this list.
- MediaStream's getTrackById did not work for video MediaStreamTracks.
- MediaStream's clone method did not clone MediaStreamTracks.
- MediaStreamTrack's readyState was not updated when stop was called.
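For example, a sketch of the now-expected addTrack behavior; the error.name shown assumes the domexception package mentioned above is installed:
const { RTCPeerConnection, getUserMedia } = require('wrtc');
getUserMedia({ audio: true }).then(stream => {
  const pc = new RTCPeerConnection();
  const [track] = stream.getAudioTracks();
  pc.addTrack(track, stream);
  try {
    pc.addTrack(track, stream); // adding the same track twice
  } catch (error) {
    console.log(error.name); // expected: 'InvalidAccessError'
  }
});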
v0.3.0
New Features
- Support for Node 11. Binaries are available for Linux and macOS. Windows binaries will become available in a subsequent release once AppVeyor gains support for Node 11.
- Updated to WebRTC M70. This release no longer uses mayeut/libwebrtc; instead, WebRTC is built from source.
Breaking Changes
- Dropped support for Node 9
- Minimum CMake version bumped to 3.12
- Minimum GCC version bumped to 5.4
- Minimum Microsoft Visual Studio version bumped to 2017
Bug Fixes
- Updating to WebRTC M70 fixes an RTCDataChannel-related interop bug with recent Firefox releases (#444).