New Docs section: Embedding videos in Remotion #4516

Merged
merged 2 commits on Nov 19, 2024
@@ -1,6 +1,7 @@
---
image: /generated/articles-docs-miscellaneous-snippets-accelerated-video.png
title: 'Change the speed of a video over time'
sidebar_label: Changing speed over time
crumb: 'Snippets'
---

@@ -11,10 +12,13 @@ To speed up a video over time - for example to start with regular speed and then
It is not as easy as interpolating the [`playbackRate`](/docs/video#playbackrate):

```tsx twoslash title="❌ Does not work"
import {interpolate, Video} from 'remotion';
import {interpolate, OffthreadVideo} from 'remotion';
let frame = 0;
// ---cut---
<Video playbackRate={interpolate(frame, [0, 100], [1, 5])} />;
<OffthreadVideo
playbackRate={interpolate(frame, [0, 100], [1, 5])}
src="https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4#disable"
/>;
```

This is because Remotion evaluates each frame independently of the others. If `frame` is 100, the `playbackRate` evaluates to 5 and Remotion will render the 500th frame of the video, which is undesired because it does not take into account that the speed has only gradually built up to 5.
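
A common fix is to accumulate the per-frame speed and seek the video to the accumulated position on every frame. A condensed sketch of that idea (not the exact snippet from this page; the speed ramp and URL are just examples):

```tsx
import React from 'react';
import {interpolate, OffthreadVideo, Sequence, useCurrentFrame} from 'remotion';

// Speed ramps from 1x to 5x over the first 100 frames, then stays at 5x.
const speed = (f: number) =>
  interpolate(f, [0, 100], [1, 5], {extrapolateRight: 'clamp'});

export const AcceleratedVideo: React.FC = () => {
  const frame = useCurrentFrame();

  // Sum the speed of every previous frame to find out how far into
  // the source video we should be right now.
  let accumulated = 0;
  for (let i = 0; i < frame; i++) {
    accumulated += speed(i);
  }

  return (
    // <Sequence from={frame}> resets the video's local time to 0 at the
    // current frame, so startFrom seeks it to the accumulated position.
    <Sequence from={frame}>
      <OffthreadVideo
        startFrom={Math.round(accumulated)}
        playbackRate={speed(frame)}
        src="https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"
      />
    </Sequence>
  );
};
```

Because the accumulation runs again on every frame, keep the speed function cheap (or memoize partial sums) for long compositions.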
38 changes: 19 additions & 19 deletions packages/docs/docs/miscellaneous/snippets/align-duration.mdx
@@ -1,32 +1,32 @@
---
image: /generated/articles-docs-miscellaneous-snippets-align-duration.png
title: How do I make the composition the same duration as my video?
sidebar_label: Same composition duration as video
crumb: "FAQ"
sidebar_label: Align timeline duration to video
crumb: 'FAQ'
---

If you have a component rendering a video:

```tsx twoslash title="MyComp.tsx"
import React from "react";
import { OffthreadVideo, staticFile } from "remotion";
import React from 'react';
import {OffthreadVideo, staticFile} from 'remotion';

export const MyComp: React.FC = () => {
return <OffthreadVideo src={staticFile("video.mp4")} />;
return <OffthreadVideo src={staticFile('video.mp4')} />;
};
```

and you want to make the composition the same duration as the video, first make the video source a React prop:

```tsx twoslash title="MyComp.tsx"
import React from "react";
import { OffthreadVideo, staticFile } from "remotion";
import React from 'react';
import {OffthreadVideo, staticFile} from 'remotion';

type MyCompProps = {
src: string;
};

export const MyComp: React.FC<MyCompProps> = ({ src }) => {
export const MyComp: React.FC<MyCompProps> = ({src}) => {
return <OffthreadVideo src={src} />;
};
```
@@ -41,12 +41,12 @@ type MyCompProps = {

// ---cut---

import { CalculateMetadataFunction } from "remotion";
import { getVideoMetadata } from "@remotion/media-utils";
import {CalculateMetadataFunction} from 'remotion';
import {getVideoMetadata} from '@remotion/media-utils';

export const calculateMetadata: CalculateMetadataFunction<
MyCompProps
> = async ({ props }) => {
> = async ({props}) => {
const data = await getVideoMetadata(props.src);
const fps = 30;

@@ -61,9 +61,9 @@ Finally, pass the `calculateMetadata` function to the `Composition` component an

```tsx twoslash title="Root.tsx"
// @filename: MyComp.tsx
import React from "react";
import { CalculateMetadataFunction } from "remotion";
import { getVideoMetadata } from "@remotion/media-utils";
import React from 'react';
import {CalculateMetadataFunction} from 'remotion';
import {getVideoMetadata} from '@remotion/media-utils';

export const MyComp: React.FC<MyCompProps> = () => {
return null;
@@ -74,7 +74,7 @@ type MyCompProps = {

export const calculateMetadata: CalculateMetadataFunction<
MyCompProps
> = async ({ props }) => {
> = async ({props}) => {
const data = await getVideoMetadata(props.src);
const fps = 30;

@@ -87,9 +87,9 @@ export const calculateMetadata: CalculateMetadataFunction<
// @filename: Root.tsx
// ---cut---

import React from "react";
import { Composition } from "remotion";
import { MyComp, calculateMetadata } from "./MyComp";
import React from 'react';
import {Composition} from 'remotion';
import {MyComp, calculateMetadata} from './MyComp';

export const Root: React.FC = () => {
return (
@@ -101,7 +101,7 @@ export const Root: React.FC = () => {
width={1920}
height={1080}
defaultProps={{
src: "https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4",
src: 'https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4',
}}
calculateMetadata={calculateMetadata}
/>
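
The return statement of `calculateMetadata` is hidden by the collapsed hunks above. It presumably derives the composition length from the probed metadata, roughly like this (a sketch, assuming the fixed 30 fps shown above):

```tsx
import {CalculateMetadataFunction} from 'remotion';
import {getVideoMetadata} from '@remotion/media-utils';

type MyCompProps = {
  src: string;
};

export const calculateMetadata: CalculateMetadataFunction<MyCompProps> = async ({
  props,
}) => {
  const data = await getVideoMetadata(props.src);
  const fps = 30;

  // Derive the composition duration from the probed video duration.
  return {
    durationInFrames: Math.floor(data.durationInSeconds * fps),
    fps,
  };
};
```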
29 changes: 19 additions & 10 deletions packages/docs/docs/miscellaneous/snippets/hls.mdx
@@ -1,8 +1,9 @@
---
image: /generated/articles-docs-miscellaneous-snippets-hls.png
title: "HLS support (HTTP Live Streaming)"
title: 'HLS support (HTTP Live Streaming)'
sidebar_label: 'HTTP Live Streaming'
id: hls
crumb: "Video"
crumb: 'Video'
---

## No native support
@@ -21,21 +22,29 @@ You can play HLS videos during preview in the [`<Player>`](/docs/player) and in

Note the following caveats:

<Step>1</Step> This code only shows how to connect the video tag to the HLS stream; it has not been tested on a real project. <br/>
<Step>2</Step> Audio will not work when rendering a video to an MP4 using this code. Use an alternative source during rendering. See <a href="/docs/miscellaneous/snippets/offthread-video-while-rendering">&lt;OffthreadVideo&gt; while rendering
</a> and <a href="/docs/get-remotion-environment"><code>getRemotionEnvironment()</code></a> for how to use different components based on whether you are rendering or previewing.<br/><br/>
<Step>1</Step> This code only shows how to connect the video tag to the HLS
stream; it has not been tested on a real project. <br />
<Step>2</Step> Audio will not work when rendering a video to an MP4 using this
code. Use an alternative source during rendering. See <a href="/docs/miscellaneous/snippets/offthread-video-while-rendering">
&lt;OffthreadVideo&gt; while rendering
</a> and <a href="/docs/get-remotion-environment">
<code>getRemotionEnvironment()</code>
</a> for how to use different components based on whether you are rendering or
previewing.
<br />
<br />

```tsx twoslash title="HlsDemo.tsx"
import Hls from "hls.js";
import React, { useEffect, useRef } from "react";
import { AbsoluteFill, RemotionVideoProps, Video } from "remotion";
import Hls from 'hls.js';
import React, {useEffect, useRef} from 'react';
import {AbsoluteFill, RemotionVideoProps, Video} from 'remotion';

const HlsVideo: React.FC<RemotionVideoProps> = ({ src }) => {
const HlsVideo: React.FC<RemotionVideoProps> = ({src}) => {
const videoRef = useRef<HTMLVideoElement>(null);

useEffect(() => {
if (!src) {
throw new Error("src is required");
throw new Error('src is required');
}

const startFrom = 0;
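  // The collapsed remainder of this effect presumably wires up hls.js.
  // A rough sketch (simplified, no error handling), not the PR's exact code:
  if (!videoRef.current) {
    return;
  }
  const hls = new Hls({
    startPosition: startFrom, // position in seconds to start the stream from
  });
  hls.loadSource(src);
  hls.attachMedia(videoRef.current);

  return () => {
    hls.destroy();
  };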
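
Putting the second caveat into practice could look roughly like the following sketch. It assumes the `HlsVideo` component above is exported from `HlsDemo.tsx` and that an MP4 rendition of the same content exists; both URLs are placeholders:

```tsx
import React from 'react';
import {getRemotionEnvironment, OffthreadVideo} from 'remotion';
import {HlsVideo} from './HlsDemo'; // hypothetical export of the component above

// Placeholder URLs - substitute your own stream and its MP4 rendition.
const hlsSrc = 'https://example.com/stream/master.m3u8';
const mp4Src = 'https://example.com/stream/progressive.mp4';

export const SmartVideo: React.FC = () => {
  const {isRendering} = getRemotionEnvironment();

  // While rendering, use a progressive MP4 so the audio makes it into the output;
  // during preview, play the HLS stream through hls.js.
  if (isRendering) {
    return <OffthreadVideo src={mp4Src} />;
  }

  return <HlsVideo src={hlsSrc} />;
};
```

The same pattern works for any source that plays in the browser but cannot be decoded reliably during rendering.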
4 changes: 4 additions & 0 deletions packages/docs/docs/miscellaneous/snippets/jumpcuts.mdx
@@ -1,9 +1,13 @@
---
image: /generated/articles-docs-miscellaneous-snippets-jumpcuts.png
title: 'Jump Cutting'
sidebar_label: Jump Cuts
crumb: 'Snippets'
---

You might want to use a "jump cut" to skip parts of a video.
Use the following snippet to skip certain sections of a video without re-mounting it.

```tsx twoslash
import React, {useMemo} from 'react';
import {
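
A simplified sketch of the frame-mapping idea behind a jump cut (not necessarily the approach the collapsed snippet takes; the section boundaries and URL are made up):

```tsx
import React, {useMemo} from 'react';
import {OffthreadVideo, Sequence, useCurrentFrame} from 'remotion';

// Hypothetical sections of the source video to keep, in frames.
const sections = [
  {startFrom: 0, endAt: 120},
  {startFrom: 300, endAt: 420},
];

export const JumpCuts: React.FC = () => {
  const frame = useCurrentFrame();

  // Map the timeline frame to a source frame, skipping the gaps between sections.
  const sourceFrame = useMemo(() => {
    let remaining = frame;
    for (const section of sections) {
      const duration = section.endAt - section.startFrom;
      if (remaining < duration) {
        return section.startFrom + remaining;
      }
      remaining -= duration;
    }
    return null; // the timeline frame is past the last kept section
  }, [frame]);

  if (sourceFrame === null) {
    return null;
  }

  return (
    // Reset local time to 0 at the current frame so that startFrom
    // seeks the video exactly to the mapped source frame.
    <Sequence from={frame}>
      <OffthreadVideo
        startFrom={sourceFrame}
        src="https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"
      />
    </Sequence>
  );
};
```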
67 changes: 35 additions & 32 deletions packages/docs/docs/video-manipulation.mdx
@@ -2,10 +2,11 @@
image: /generated/articles-docs-video-manipulation.png
id: video-manipulation
title: Video manipulation
crumb: "How To"
sidebar_label: Manipulating pixels
crumb: 'How To'
---

import { VideoCanvasExamples } from "../components/GreenscreenExamples/index";
import {VideoCanvasExamples} from '../components/GreenscreenExamples/index';

You can draw frames of an [`<OffthreadVideo>`](/docs/offthreadvideo) or a [`<Video>`](/docs/video) onto a `<canvas>` element using the [`drawImage()`](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/drawImage) API.

@@ -19,39 +20,42 @@ Browser support: Firefox 130 (August 2024), Chrome 83, Safari 15.4.
In this example, an [`<OffthreadVideo>`](/docs/offthreadvideo) is rendered and made invisible.
Every frame that is emitted is drawn to a Canvas and a grayscale [`filter`](https://developer.mozilla.org/en-US/docs/Web/API/CanvasRenderingContext2D/filter) is applied.

<VideoCanvasExamples type="base"/>
<br/>
<VideoCanvasExamples type="base" />
<br />

```tsx twoslash
import React, { useCallback, useEffect, useRef } from "react";
import { AbsoluteFill, useVideoConfig, OffthreadVideo } from "remotion";
import React, {useCallback, useEffect, useRef} from 'react';
import {AbsoluteFill, useVideoConfig, OffthreadVideo} from 'remotion';
// ---cut---
export const VideoOnCanvas: React.FC = () => {
const video = useRef<HTMLVideoElement>(null);
const canvas = useRef<HTMLCanvasElement>(null);
const { width, height } = useVideoConfig();
const {width, height} = useVideoConfig();

// Process a frame
const onVideoFrame = useCallback((frame: CanvasImageSource) => {
if (!canvas.current ) {
return;
}
const context = canvas.current.getContext("2d");
const onVideoFrame = useCallback(
(frame: CanvasImageSource) => {
if (!canvas.current) {
return;
}
const context = canvas.current.getContext('2d');

if (!context) {
return;
}
if (!context) {
return;
}

context.filter = "grayscale(100%)";
context.drawImage(frame, 0, 0, width, height);
}, [height, width]);
context.filter = 'grayscale(100%)';
context.drawImage(frame, 0, 0, width, height);
},
[height, width],
);

return (
<AbsoluteFill>
<AbsoluteFill>
<OffthreadVideo
// Hide the original video tag
style={{ opacity: 0 }}
style={{opacity: 0}}
onVideoFrame={onVideoFrame}
src="http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"
/>
@@ -68,8 +72,8 @@ export const VideoOnCanvas: React.FC = () => {

In this example, we loop over each pixel in the image buffer and if it's green, we transparentize it. Drag the slider below to turn the video transparent.

<VideoCanvasExamples type="greenscreen"/>
<br/>
<VideoCanvasExamples type="greenscreen" />
<br />

```tsx twoslash
declare global {
@@ -88,36 +92,36 @@ declare global {
type VideoFrameRequestCallbackId = number;
interface HTMLVideoElement extends HTMLMediaElement {
requestVideoFrameCallback(
callback: (now: DOMHighResTimeStamp, metadata: VideoFrameMetadata) => any
callback: (now: DOMHighResTimeStamp, metadata: VideoFrameMetadata) => any,
): VideoFrameRequestCallbackId;
cancelVideoFrameCallback(handle: VideoFrameRequestCallbackId): void;
}
}
import React, { useCallback, useEffect, useRef } from "react";
import { AbsoluteFill, useVideoConfig, OffthreadVideo } from "remotion";
import React, {useCallback, useEffect, useRef} from 'react';
import {AbsoluteFill, useVideoConfig, OffthreadVideo} from 'remotion';

// ---cut---
export const Greenscreen: React.FC<{
opacity: number;
}> = ({ opacity }) => {
}> = ({opacity}) => {
const canvas = useRef<HTMLCanvasElement>(null);
const { width, height } = useVideoConfig();
const {width, height} = useVideoConfig();

// Process a frame
const onVideoFrame = useCallback(
(frame: CanvasImageSource) => {
if (!canvas.current ) {
if (!canvas.current) {
return;
}
const context = canvas.current.getContext("2d");
const context = canvas.current.getContext('2d');

if (!context) {
return;
}

context.drawImage(frame, 0, 0, width, height);
const imageFrame = context.getImageData(0, 0, width, height);
const { length } = imageFrame.data;
const {length} = imageFrame.data;

// If the pixel is very green, reduce the alpha channel
for (let i = 0; i < length; i += 4) {
@@ -130,15 +134,14 @@
}
context.putImageData(imageFrame, 0, 0);
},
[height, width]
[height, width],
);


return (
<AbsoluteFill>
<AbsoluteFill>
<OffthreadVideo
style={{ opacity: 0 }}
style={{opacity: 0}}
onVideoFrame={onVideoFrame}
src="https://remotion-assets.s3.eu-central-1.amazonaws.com/just-do-it-short.mp4"
/>
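
For reference, the loop body hidden by the collapsed hunk amounts to a per-pixel green test. A sketch of that check follows; the thresholds here are assumptions, not the values used in the docs:

```tsx
// Zero out the alpha channel of pixels that are predominantly green.
const makeGreenTransparent = (data: Uint8ClampedArray) => {
  // RGBA layout: 4 bytes per pixel
  for (let i = 0; i < data.length; i += 4) {
    const red = data[i];
    const green = data[i + 1];
    const blue = data[i + 2];
    if (green > 100 && red < 100 && blue < 100) {
      data[i + 3] = 0;
    }
  }
};
```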
9 changes: 4 additions & 5 deletions packages/docs/docs/video-vs-offthreadvideo.mdx
@@ -1,8 +1,8 @@
---
image: /generated/articles-docs-video-vs-offthreadvideo.png
id: video-vs-offthreadvideo
title: "<Video> vs. <OffthreadVideo>"
crumb: "Comparison"
title: '<OffthreadVideo> vs. <Video>'
crumb: 'Comparison'
---

We offer two components for including other videos in your video: [`<Video />`](/docs/video) and [`<OffthreadVideo />`](/docs/offthreadvideo).
@@ -40,13 +40,12 @@ Is based on the native HTML5 `<video>` element and therefore behaves similar to
**Pros**

✅ &nbsp; Can render a video without having to download it fully (if you don't pass the [`muted`](/docs/video/#muted) prop, the video will still be downloaded fully to extract its audio).
✅ &nbsp; You can attach a ref to the `<video>` element.
✅ &nbsp; You can attach a ref to the `<video>` element.

**Cons**

⛔ &nbsp; Fewer codecs are supported.
⛔ &nbsp; Chrome may throttle video tags if the page is heavy.
⛔ &nbsp; If too many video tags get rendered simultaneously, a timeout may occur.
⛔ &nbsp; If the input video framerate does not match with the output framerate, some duplicate frames may occur in the output.
⛔ &nbsp; A Google Chrome build with proprietary codecs is required.

⛔ &nbsp; A Google Chrome build with proprietary codecs is required.