H.265 Web Player Technology Implementation

Jarry
3 min read · Jun 23, 2020

With the development of video encoding technology, H.265 can deliver the same picture quality as H.264 at roughly half the file size and half the bandwidth, or finer picture detail at the same bitrate. However, web browsers do not natively support H.265 decoding, so this project combines a WebAssembly decoder (built from FFmpeg), a JavaScript demuxer, Canvas rendering, and AudioContext playback to play H.265 in the browser.

Here is an open-source H.265 player: https://github.com/goldvideo/h265player

1. Demo

https://omc3i.codesandbox.io/

Screenshot 1, H.265 web player
Screenshot 2, H.265 video player

2. Main Architecture

The architecture of H.265 player

The player is divided into four parts: UI, Loader, Data processing, and Data rendering, running across three threads. The main thread handles interface control, download control, data-flow control, and audio/video control; a data-loading thread requests metadata and data fragments; and a data-processing thread handles demuxing and decoding.

  • UI: the player consists of a screen and a controller. The screen shows video frames, popups, posters, etc.; the controller provides components such as the progress bar, play button, and volume control.
  • Loader: responsible for loading and parsing media data; currently only the HLS protocol is supported. Requests go through a web worker. Downloaded ts segments are stored in a queue up to a configured cache size, and loading pauses once that limit is reached. After the decoder takes a ts segment from the queue, the HLS loader releases it and resumes loading the next segment, again stopping at the cache limit.
  • Data processing: mainly demuxing and H.265 decoding. Demuxing is handled by the demuxer.js module; H.265 decoding runs in software via a wasm build of FFmpeg, so CPU usage is high.
  • Data rendering: covers video and audio rendering. Video rendering uses ImagePlayer to draw the decoded yuv data directly onto the canvas; audio uses AudioPlayer to decode and play the AAC data. Audio/video synchronization is driven by pts: audio serves as the reference clock, and the difference between the current video pts and the audio pts is used to adjust the video display time.
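The pts-based sync strategy described above can be sketched as a small pure function: audio is the master clock, and each video frame's display time is stretched or shortened by its distance from the current audio pts. The names and default values below (`frameDurationMs`, `syncThresholdMs`) are illustrative assumptions, not the player's actual API.

```javascript
// Returns how long (ms) to wait before presenting a video frame with
// timestamp videoPtsMs, given the current audio clock audioPtsMs.
// A minimal sketch of audio-master A/V sync, not h265player's real code.
function videoFrameDelay(videoPtsMs, audioPtsMs, frameDurationMs = 40, syncThresholdMs = 25) {
  const diff = videoPtsMs - audioPtsMs; // > 0: video ahead, < 0: video behind
  if (Math.abs(diff) <= syncThresholdMs) return frameDurationMs; // close enough: normal pacing
  if (diff > 0) return frameDurationMs + diff; // video ahead: hold the frame longer
  return Math.max(0, frameDurationMs + diff); // video behind: shorten the wait (or show immediately)
}
```

A real renderer would also drop frames when the video falls far behind, but the adjustment above is the core of the strategy.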

3. Decoding Process

Decoding process

Five steps:

First, use XMLHttpRequest to request the m3u8 playlist and get the list of segments.
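A minimal sketch of this step: fetch the playlist with XMLHttpRequest and extract the segment URIs, which in an m3u8 file are simply the non-empty lines that do not start with `#`. The real loader runs inside a web worker and handles variant playlists; this simplified version assumes a flat media playlist.

```javascript
// Extract segment URIs from m3u8 text: every non-blank line that is not a tag.
function parseM3u8(text) {
  return text
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line.length > 0 && !line.startsWith('#'));
}

// Browser-only: request the playlist and hand the segment list to a callback.
function loadPlaylist(url, onSegments) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onload = () => onSegments(parseM3u8(xhr.responseText));
  xhr.send();
}
```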

Second, use the demuxer to demux each ts segment into elementary video and audio streams.

Third, use WASM (the FFmpeg-based decoder) to decode the video bitstream.

Fourth, draw the decoded `yuv` image data onto the Canvas.
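Sketching this step: the decoder outputs I420 (yuv420p) planes, which must be converted to RGB before a 2D canvas can display them. Below is a per-pixel BT.601 conversion and a frame-blitting helper using `putImageData`; a production player would usually push the yuv planes to a WebGL shader instead of converting on the CPU, so treat this as an illustration of the idea rather than the project's rendering path.

```javascript
// Convert one yuv pixel (BT.601, video range) to [r, g, b], each 0..255.
function yuvToRgb(y, u, v) {
  const c = y - 16, d = u - 128, e = v - 128;
  const clamp = (x) => Math.max(0, Math.min(255, Math.round(x)));
  return [
    clamp(1.164 * c + 1.596 * e),
    clamp(1.164 * c - 0.392 * d - 0.813 * e),
    clamp(1.164 * c + 2.017 * d),
  ];
}

// Browser-only: paint a width×height I420 frame onto a canvas 2D context.
function drawFrame(ctx, yPlane, uPlane, vPlane, width, height) {
  const image = ctx.createImageData(width, height);
  for (let row = 0; row < height; row++) {
    for (let col = 0; col < width; col++) {
      const yIdx = row * width + col;
      const uvIdx = (row >> 1) * (width >> 1) + (col >> 1); // 4:2:0 chroma subsampling
      const [r, g, b] = yuvToRgb(yPlane[yIdx], uPlane[uvIdx], vPlane[uvIdx]);
      const p = yIdx * 4;
      image.data[p] = r;
      image.data[p + 1] = g;
      image.data[p + 2] = b;
      image.data[p + 3] = 255; // opaque
    }
  }
  ctx.putImageData(image, 0, 0);
}
```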

Fifth, use the browser’s `AudioContext` to play the decoded audio.
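A sketch of this last step, assuming the AAC has already been decoded to 16-bit PCM by the wasm decoder: the samples are scaled to the [-1, 1] floats an `AudioBuffer` expects, copied in, and scheduled for playback. The function names here are illustrative, not the AudioPlayer module's API.

```javascript
// Convert signed 16-bit PCM samples to the [-1, 1] floats AudioBuffer uses.
function int16ToFloat32(pcm) {
  const out = new Float32Array(pcm.length);
  for (let i = 0; i < pcm.length; i++) out[i] = pcm[i] / 32768;
  return out;
}

// Browser-only: queue one mono PCM chunk to start at startTime (seconds,
// on the AudioContext clock) — this clock also drives A/V sync.
function playPcm(audioCtx, floatSamples, sampleRate, startTime) {
  const buffer = audioCtx.createBuffer(1, floatSamples.length, sampleRate);
  buffer.getChannelData(0).set(floatSamples);
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start(startTime);
  return source;
}
```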

If you want to know more, please visit https://github.com/goldvideo/h265player.
