I am using:
- Node v7.2.0
- PhantomJS v2.1.1
- Chrome v54
- ffmpeg v3.2
- OSX v10.12.1
This setup allows you to continuously render a web page in PhantomJS and stream it into an A-Frame VR scene, where it is drawn onto a canvas. It then forwards user events (e.g. click, keypress) back to PhantomJS, allowing the user to interact with the "browser". Not surprisingly, the performance is shit. This is just a proof of concept. Originally my idea was that I could port existing 2D code editing or text editing web applications into A-Frame.
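To give a flavour of the event-forwarding half, here is a hypothetical sketch of the PhantomJS side. It assumes events arrive as JSON over PhantomJS's built-in `webserver` module; the port, payload shape, and target URL are made up for illustration, and the actual glue in this repo may differ.

```javascript
// Hypothetical sketch: replay user events sent from the A-Frame scene
// inside the rendered page, using PhantomJS's sendEvent API.
var webpage = require('webpage');
var server = require('webserver').create();

var page = webpage.create();
page.viewportSize = { width: 1024, height: 768 };
page.open('http://example.com'); // the 2D app being mirrored into VR

// Listen for events POSTed as JSON, e.g. {"type":"click","x":10,"y":20}.
server.listen(8090, function (request, response) {
  var event = JSON.parse(request.post);
  if (event.type === 'click') {
    page.sendEvent('click', event.x, event.y);
  } else if (event.type === 'keypress') {
    page.sendEvent('keypress', event.key);
  }
  response.statusCode = 200;
  response.close();
});
```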
I drew a lot of help from Stef van den Ham's blog post on Recording a Website With PhantomJS and FFMPEG and Dominic Szablewski's post on HTML5 Live Video Streaming via WebSockets.
Going forward, I would like to look into SlimerJS instead of PhantomJS, and even into using virtual machines such as VirtualBox.
First install PhantomJS with `npm install phantomjs-prebuilt -g` or `yarn global add phantomjs-prebuilt`. You can check that it's installed with `phantomjs -v`.
For Mac users, install ffmpeg with `brew install ffmpeg`. You can check that it's installed with `ffmpeg -version`. I'm not sure how you'd install ffmpeg on Windows or Linux, so you are on your own.
Then run `npm install` or `yarn install`. This will automatically run `browserify public/packages.js > public/packages.combined.js` after it installs all of the Node dependencies - see `package.json`.
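The hook that makes the browserify step run automatically is presumably an npm `postinstall` script. As a sketch (the exact contents of `package.json` here are an assumption, not a copy):

```json
{
  "scripts": {
    "start": "node app.js",
    "postinstall": "browserify public/packages.js > public/packages.combined.js"
  }
}
```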
This is a little complex to run. You'll need to run:

1. A static file server to host the A-Frame scene. You can run this using `npm start` or `yarn start`; these are just shortcuts for `node app.js`. This will run on port 3000.
2. The WebSocket server that will stream our mpeg data to the browser, adapted from jsmpeg. I've put the command for this into a script, so you can just run `sh scripts/start-streaming-server.sh`. This listens for data from PhantomJS on port 8082 and then allows the browser to connect via a WebSocket on port 8084. Once the browser connects, it will send all of the mpeg data to the browser. (A sketch of such a relay is shown after this list.)
3. A script that runs PhantomJS, pipes the rendered PNG output to ffmpeg, and then streams the mpeg output from that to the server mentioned in step 2. I've put the command for this into a script, so you can just run `sh scripts/phantom-ffmpeg-stream.sh`. (An equivalent of this pipeline written out in Node also appears below.)
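To make step 2 concrete, here is a minimal sketch of a relay in the spirit of jsmpeg's streaming server. This is an assumption about the shape of the server behind `scripts/start-streaming-server.sh`, not a copy of it: it uses the `ws` package, accepts the raw mpeg stream over HTTP on port 8082, and fans every chunk out to WebSocket clients on port 8084.

```javascript
// relay-sketch.js - hypothetical stand-in for the streaming server.
const http = require('http');
const WebSocket = require('ws');

// Browsers connect here to receive the stream.
const wss = new WebSocket.Server({ port: 8084 });

// ffmpeg (fed by PhantomJS) pushes the mpeg stream here.
http.createServer(function (req, res) {
  req.on('data', function (chunk) {
    // Rebroadcast each chunk to every connected browser.
    wss.clients.forEach(function (client) {
      if (client.readyState === WebSocket.OPEN) {
        client.send(chunk);
      }
    });
  });
  req.on('end', function () {
    res.end();
  });
}).listen(8082);
```

On the browser side, jsmpeg consumes the stream from `ws://localhost:8084` and decodes it onto the canvas that the A-Frame scene displays.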
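Step 3's shell script pipes PhantomJS's stdout straight into ffmpeg and then on to the relay. The sketch below shows an equivalent wiring in Node so the data flow is explicit; the PhantomJS script name (`render.js`) and the ffmpeg flags are illustrative assumptions, not the repository's actual values.

```javascript
// pipeline-sketch.js - hypothetical equivalent of scripts/phantom-ffmpeg-stream.sh.
const { spawn } = require('child_process');
const http = require('http');

// PhantomJS writes raw PNG frames to stdout (hence the file-based logging below).
const phantom = spawn('phantomjs', ['render.js']);

// ffmpeg reads PNG frames from stdin and emits an mpeg stream on stdout.
const ffmpeg = spawn('ffmpeg', [
  '-f', 'image2pipe', '-vcodec', 'png', '-i', '-',
  '-f', 'mpeg1video', '-b:v', '800k', '-r', '30', '-',
]);

phantom.stdout.pipe(ffmpeg.stdin);

// Forward the encoded stream to the relay's ingest port from step 2.
const ingest = http.request({ host: 'localhost', port: 8082, method: 'POST' });
ffmpeg.stdout.pipe(ingest);
```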
Navigate to `http://localhost:3000`. It will probably take a few seconds for streaming to start. If the 3D web page doesn't show up, try refreshing.
The PhantomJS script will output to two logfiles:

- `logs/main`, which will show general PhantomJS log messages
- `logs/page_errors`, which will show any JavaScript errors on web pages that you load

I'd recommend tailing these logs as you work with `tail -f logs/main` and `tail -f logs/page_errors`. The reason for doing this instead of using the usual `console.log` is that we can't output anything to `stdout` except for the raw PNG data, since we are piping it into ffmpeg.
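For reference, here is a minimal sketch of how that file-based logging can be wired up inside a PhantomJS script, using PhantomJS's `fs` module and the page's `onConsoleMessage`/`onError` callbacks; the exact handling in this repo may differ.

```javascript
// Hypothetical sketch: keep stdout clean for PNG frames by logging to files.
var fs = require('fs');
var page = require('webpage').create();

function log(file, message) {
  // 'a' opens the file in append mode (PhantomJS fs API).
  fs.write(file, new Date().toISOString() + ' ' + message + '\n', 'a');
}

// General console messages from pages go to logs/main.
page.onConsoleMessage = function (msg) {
  log('logs/main', msg);
};

// JavaScript errors on loaded pages go to logs/page_errors.
page.onError = function (msg, trace) {
  log('logs/page_errors', msg + ' ' + JSON.stringify(trace));
};
```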
- [jsmpeg](https://github.com/phoboslab/jsmpeg)
- [Recording a Website With PhantomJS and FFMPEG](http://mindthecode.com/recording-a-website-with-phantomjs-and-ffmpeg/)
- [Best approach to real time http streaming to HTML5 video client (Stack Overflow)](http://stackoverflow.com/questions/21921790/best-approach-to-real-time-http-streaming-to-html5-video-client)
- [HTML5 Live Video Streaming via WebSockets](http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets)