Adaptive Video Streaming With Dash.js in React

I was recently assigned the task of creating video reels that needed to play smoothly under a slow network or on low-end devices. I started with the native HTML5 <video> tag but quickly hit a wall: it just doesn’t cut it when connections are slow or devices are underpowered.

After some research, I found that adaptive bitrate streaming was the solution I needed. But here’s the frustrating part: finding a comprehensive, beginner-friendly guide was difficult. The resources on MDN and other sites were useful but lacked the end-to-end tutorial I was looking for.

That’s why I’m writing this article: to give you the step-by-step guide I wish I had found. I’ll bridge the gap between writing FFmpeg scripts, transcoding video files, and implementing a DASH-compatible video player (dash.js), with code examples you can follow.

Going Beyond the Native HTML5 <video> Tag

You might be wondering why you can’t simply rely on the HTML <video> element. There’s a good reason for that. Let’s compare the difference between the native <video> element and adaptive video streaming in browsers.

Progressive Download

With progressive downloading, your browser downloads the video file linearly from the server over HTTP and starts playback as soon as it has buffered enough data. This is the default behavior of the <video> element.

When you play the video, check your browser’s network tab, and you will see multiple requests with a 206 Partial Content status code.


It uses HTTP 206 Partial Content range requests to fetch the video file in chunks. The server sends specific byte ranges of the video to your browser. When you seek, the browser makes new range requests asking for different byte ranges (e.g., “Give me bytes 1,000,000-2,000,000”).

In other words, it doesn’t fetch the entire file at once. Instead, it serves partial byte ranges from the single MP4 video file as needed. This is still considered a progressive download because only a single file is being fetched over HTTP; there is no bandwidth or quality adaptation.

If the server or browser does not support range requests, the entire video file is downloaded in a single request that returns a 200 OK status code. In that case, the video can only begin playing once the entire file has finished downloading.
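To see a range request in action, you can issue one manually with fetch. Here’s a minimal sketch; the video URL is a placeholder, and it assumes you run it somewhere top-level await is allowed, such as the browser console:

// Ask for only the first ~1 MB of a video via an HTTP range request.
// The URL below is a placeholder for any server that supports range requests.
const response = await fetch('https://example.com/videos/input_video.mp4', {
  headers: { Range: 'bytes=0-999999' },
})

console.log(response.status) // 206 when partial content is served, 200 otherwise
console.log(response.headers.get('Content-Range')) // e.g., "bytes 0-999999/31491130"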

The problem? If you’re on a slow connection trying to watch high-resolution video, you’ll wait a long time before playback starts.

Adaptive Bitrate Streaming

Instead of serving one single video file, adaptive bitrate (ABR) streaming splits the video into multiple segments at different bitrates and resolutions. During playback, the ABR algorithm automatically selects the highest-quality segment that can be downloaded in time for smooth playback based on your network connectivity, bandwidth, and other device capabilities. It continuously adjusts throughout playback to adapt to changing conditions.

This magic happens through two key browser technologies:

  • Media Source Extensions (MSE)
    It provides the ability to pass a MediaSource object to the src attribute of <video>, enabling the appending of multiple SourceBuffer objects that represent segments of the video.
  • Media Capabilities API
    It provides information on your device’s video decoding and encoding abilities, enabling the ABR algorithm to make informed decisions about which resolution to deliver.

Together, they enable the core functionality of ABR, serving video chunks optimized for your specific device constraints in real time.
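To make this concrete, here is a minimal sketch of both APIs. dash.js performs this kind of work internally, so you won’t write this yourself; the codec string, rendition values, and segment URL are illustrative placeholders:

// 1. Media Source Extensions: manually feed video segments into a <video> element.
const video = document.querySelector('video')
const mediaSource = new MediaSource()
video.src = URL.createObjectURL(mediaSource)

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9"')
  const segment = await fetch('/segments/video_360p_000.webm') // placeholder URL
  sourceBuffer.appendBuffer(await segment.arrayBuffer()) // a real player keeps appending segments
})

// 2. Media Capabilities API: ask whether a given rendition will decode smoothly.
async function canPlaySmoothly() {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: 'media-source',
    video: {
      contentType: 'video/webm; codecs="vp9"',
      width: 576,
      height: 1024,
      bitrate: 1500000,
      framerate: 30,
    },
  })
  return info.supported && info.smooth
}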

Streaming Protocols: MPEG-DASH vs. HLS

As mentioned above, to stream media adaptively, a video is split into chunks at different quality levels across various time points. We need to facilitate the process of switching between these segments adaptively in real time. To achieve this, ABR streaming relies on specific protocols. The two most common ABR protocols are:

  • MPEG-DASH,
  • HTTP Live Streaming (HLS).

Both of these protocols use HTTP to transmit the video files, so they are compatible with standard HTTP web servers.

This article focuses on MPEG-DASH. However, it’s worth noting that DASH is not supported by Apple devices or browsers, as mentioned in Mux’s article.
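Because browser DASH playback relies on Media Source Extensions (covered above), you can feature-detect support up front and fall back to a plain MP4 source where it is missing. A minimal sketch:

// MSE is required for DASH playback in the browser; it is notably unavailable
// on iOS Safari, which is why a plain MP4 fallback matters.
const supportsDash = typeof window.MediaSource !== 'undefined'

if (!supportsDash) {
  // Skip initializing dash.js and let the <video> element play the MP4 fallback.
}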

MPEG-DASH

MPEG-DASH enables adaptive streaming through:

  • A Media Presentation Description (MPD) file
    This XML manifest file contains information on how to select and manage streams based on adaptive rules.
  • Segmented media files
    Video and audio files are divided into segments at different resolutions and durations using MPEG-DASH-compatible codecs and formats.

On the client side, a DASH-compatible video player reads the MPD file and continuously monitors network bandwidth. Based on the available bandwidth, the player selects the appropriate bitrate and requests the corresponding video segment. This process repeats throughout playback, ensuring smooth, optimal quality.
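As a simplified illustration (this is not dash.js’s actual algorithm, which also weighs buffer level and throughput history), the selection step might look like this, using the renditions we’ll generate later in this article:

// Simplified sketch of an ABR selection step.
function pickRendition(renditions, measuredBandwidthKbps) {
  const safetyFactor = 0.8 // leave headroom for bandwidth fluctuations
  const budget = measuredBandwidthKbps * safetyFactor
  const playable = renditions.filter((r) => r.bitrateKbps <= budget)

  // Highest bitrate that fits the budget, or the lowest rendition as a floor.
  return playable.length > 0
    ? playable.reduce((best, r) => (r.bitrateKbps > best.bitrateKbps ? r : best))
    : renditions.reduce((min, r) => (r.bitrateKbps < min.bitrateKbps ? r : min))
}

const renditions = [
  { name: '576x1024', bitrateKbps: 1500 },
  { name: '480x854', bitrateKbps: 1000 },
  { name: '360x640', bitrateKbps: 750 },
]

console.log(pickRendition(renditions, 1400).name) // "480x854"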

Now that you understand the basic elements, let’s build our adaptive video player!

Steps to Build an Adaptive Bitrate Streaming Video Player

Here’s the plan:

  1. Transcode the MP4 video into audio and video renditions at different resolutions and bitrates with FFmpeg.
  2. Generate an MPD file with FFmpeg.
  3. Serve the output files from the server.
  4. Build a DASH-compatible video player to play the video.

Install FFmpeg

For macOS users, install FFmpeg using Homebrew by running the following command in your terminal:

brew install ffmpeg

For other operating systems, refer to FFmpeg’s documentation.

Generate the Audio Rendition

Next, run the following script to extract the audio track and encode it in WebM format for DASH compatibility:

ffmpeg -i "input_video.mp4" -vn -acodec libvorbis -ab 128k "audio.webm"
  • -i "input_video.mp4": Specifies the input video.
  • -vn: Disables the video stream (audio-only output).
  • -acodec libvorbis: Uses the libvorbis codec to encode the audio.
  • -ab 128k: Sets the audio bitrate to 128 kbps.
  • "audio.webm": Specifies the output audio file in WebM format.

Generate the Video Renditions

Run this script to create three video renditions with different resolutions and bitrates. The largest resolution should match the dimensions of the input file. For example, if the input video is 576×1024 at 30 frames per second (fps), the script generates renditions optimized for vertical video playback.

ffmpeg -i "input_video.mp4" -c:v libvpx-vp9 -keyint_min 150 -g 150 \
-tile-columns 4 -frame-parallel 1 -f webm \
-an -vf scale=576:1024 -b:v 1500k "input_video_576x1024_1500k.webm" \
-an -vf scale=480:854 -b:v 1000k "input_video_480x854_1000k.webm" \
-an -vf scale=360:640 -b:v 750k "input_video_360x640_750k.webm"
  • -c:v libvpx-vp9: Uses libvpx-vp9 as the VP9 video encoder for WebM.
  • -keyint_min 150 and -g 150: Set a 150-frame keyframe interval (roughly every 5 seconds at 30 fps). This allows bitrate switching every 5 seconds.
  • -tile-columns 4 and -frame-parallel 1: Optimize encoding speed through parallel processing.
  • -f webm: Specifies the output format as WebM.

In each rendition:

  • -an: Excludes audio (video output only).
  • -vf scale=576:1024: Scales the video to a resolution of 576×1024 pixels.
  • -b:v 1500k: Sets the video bitrate to 1,500 kbps.

WebM is used as the output format because its files are smaller in size and well-optimized, yet widely compatible with most web browsers.

Generate MPD Manifest File

Combine the video renditions and audio track into a DASH-compatible MPD manifest file by running the following script:

ffmpeg \
  -f webm_dash_manifest -i "input_video_576x1024_1500k.webm" \
  -f webm_dash_manifest -i "input_video_480x854_1000k.webm" \
  -f webm_dash_manifest -i "input_video_360x640_750k.webm" \
  -f webm_dash_manifest -i "audio.webm" \
  -c copy \
  -map 0 -map 1 -map 2 -map 3 \
  -f webm_dash_manifest \
  -adaptation_sets "id=0,streams=0,1,2 id=1,streams=3" \
  "input_video_manifest.mpd"
  • -f webm_dash_manifest -i "…": Specifies the input files so the DASH video player can switch between them dynamically based on network conditions.
  • -map 0 -map 1 -map 2 -map 3: Includes all video (0, 1, 2) and audio (3) streams in the final manifest.
  • -adaptation_sets: Groups the streams into adaptation sets:
    • id=0,streams=0,1,2: Groups the video renditions into a single adaptation set.
    • id=1,streams=3: Assigns the audio track to a separate adaptation set.

The resulting MPD file (input_video_manifest.mpd) describes the streams and enables adaptive bitrate streaming in MPEG-DASH. Its structure looks roughly like this (attributes such as codecs, durations, bandwidths, and byte offsets are trimmed here with "…" because their exact values depend on your input video):

<?xml version="1.0" encoding="UTF-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static" …>
  <Period …>
    <!-- Video adaptation set containing the three renditions -->
    <AdaptationSet …>
      <Representation …>
        <BaseURL>input_video_576x1024_1500k.webm</BaseURL>
        <SegmentBase …>
          <Initialization … />
        </SegmentBase>
      </Representation>
      <Representation …>
        <BaseURL>input_video_480x854_1000k.webm</BaseURL>
        <SegmentBase …>
          <Initialization … />
        </SegmentBase>
      </Representation>
      <Representation …>
        <BaseURL>input_video_360x640_750k.webm</BaseURL>
        <SegmentBase …>
          <Initialization … />
        </SegmentBase>
      </Representation>
    </AdaptationSet>
    <!-- Audio adaptation set -->
    <AdaptationSet …>
      <Representation …>
        <BaseURL>audio.webm</BaseURL>
        <SegmentBase …>
          <Initialization … />
        </SegmentBase>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>

After completing these steps, you’ll have:

  1. Three video renditions (576x1024, 480x854, and 360x640),
  2. One audio track, and
  3. An MPD manifest file.
input_video.mp4
audio.webm
input_video_576x1024_1500k.webm
input_video_480x854_1000k.webm
input_video_360x640_750k.webm
input_video_manifest.mpd

The original video input_video.mp4 should also be kept, as it will serve as a fallback video source later.

Serve the Output Files

These output files can now be uploaded to cloud storage (e.g., AWS S3 or Cloudflare R2) for playback. While they can be served directly from a local folder, I highly recommend storing them in cloud storage and leveraging a CDN to cache the assets for better performance. Both AWS and Cloudflare support HTTP range requests out of the box.
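If you’d like to test locally before uploading, any static file server that supports range requests will do. Here’s a minimal sketch using Express; the dash-output folder name is hypothetical, and the CORS header lets a player on another origin fetch the files:

import express from 'express'

const app = express()

// Allow a player hosted on a different origin to fetch the manifest and segments.
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', '*')
  next()
})

// express.static supports HTTP range requests out of the box.
app.use(express.static('dash-output'))

app.listen(8080, () => console.log('Serving DASH output at http://localhost:8080'))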

Building the DASH-Compatible Video Player in React

There’s nothing like a real example to help understand how everything works. There are different ways we can implement a DASH-compatible video player, but I’ll focus on an approach that uses React.

First, install the dash.js npm package by running:

npm i dashjs

Then, create a component called DashVideoPlayer and initialize the dash.js MediaPlayer instance by pointing it to the MPD file when the component mounts.

The ref callback function runs on component mount, and within the callback, playerRef will refer to the actual dash.js MediaPlayer instance and be bound with event listeners. We also include the original MP4 URL as a <source> in the <video> element as a fallback if the browser does not support MPEG-DASH.

If you are using the Next.js App Router, remember to add the ‘use client’ directive to enable client-side hydration, as the video player is only initialized on the client side.

Here is the full example:

import dashjs from 'dashjs'
import { useCallback, useRef } from 'react'

export const DashVideoPlayer = () => {
  const playerRef = useRef()

  const callbackRef = useCallback((node) => {
    if (node !== null) {  
      playerRef.current = dashjs.MediaPlayer().create()

      playerRef.current.initialize(node, "https://example.com/uri/to/input_video_manifest.mpd", false)
  
      playerRef.current.on('canPlay', () => {
        // upon video is playable
      })
  
      playerRef.current.on('error', (e) => {
        // handle error
      })
  
      playerRef.current.on('playbackStarted', () => {
        // handle playback started
      })
  
      playerRef.current.on('playbackPaused', () => {
        // handle playback paused
      })
  
      playerRef.current.on('playbackWaiting', () => {
        // handle playback buffering
      })
    }
  },[])

  return (
    <video ref={callbackRef} controls playsInline>
      {/* Fallback for browsers without MPEG-DASH support */}
      <source src="https://example.com/uri/to/input_video.mp4" type="video/mp4" />
    </video>
  )
}

Result

Observe the changes in the video file when the network connection is throttled from Fast 4G to 3G using Chrome DevTools. It switches from 480p to 360p, showing how the experience is optimized for more or less available bandwidth.
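If you want to observe these switches programmatically rather than in the network tab, dash.js also emits quality-change events. A small sketch, added inside the ref callback from the example above:

playerRef.current.on('qualityChangeRendered', (e) => {
  // Logs each rendition switch while you throttle the network in DevTools.
  console.log(`Switched ${e.mediaType} stream to quality index ${e.newQuality}`)
})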


Conclusion

That’s it! We just implemented a functioning DASH-compatible video player in React to stream a video with adaptive bitrate streaming. Again, the benefits of this are rooted in performance. When we adopt ABR streaming, we request the video in smaller chunks, allowing for more immediate playback than we would get if we first needed to download the entire video file. And we’ve done it in a way that supports multiple versions of the same video, so we can serve the best format for the user’s device.
