Youtube-like "Click to Unmute" Feature in NextJS


Tosin Moronfolu

Introduction

When you're about to watch a YouTube video on a mobile device, the video first plays on mute, and a button appears in the top-left corner for unmuting it.

This article will walk you through implementing that feature using video events in Next.js.

Sandbox

The completed project is on CodeSandbox. Fork it and run the code.

GitHub repository

https://github.com/folucode/click-to-unmute-nextjs-demo

Prerequisite

To follow along with this article, we need:

  • An understanding of JavaScript and Next.js.
  • A Cloudinary account. You can create a free one if you don't already have one.

Project Setup

Node and its package manager npm are required to initialize a new project.

Installing a package with npm means downloading it to our computer before we can use it, but npm also comes with an executor called npx.

npx stands for Node Package Execute. It allows us to execute any package we want from the npm registry without installing it.
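For example, we could run a CLI package straight from the registry without adding it to our project. The package used below (serve, a simple static file server) is only an illustration and isn't needed for this project:

npx serve .

npx fetches the package, runs it, and leaves our project's dependencies untouched.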

To install Node, we go to the Node.js website and follow the instructions. We verify the Node.js installation with the terminal command below:

node -v
v16.10.0 # the Node.js version installed

The result shows the version of Node.js we installed on our computer.

Create a Next.js application

We'll create our Next.js app using create-next-app, which automatically sets up a boilerplate Next.js app. To create a new project, run:

npx create-next-app@latest <app-name>
# or
yarn create next-app <app-name>

After the installation is complete, change directory into the app we just created:

cd <app-name>

Now we run npm run dev or yarn dev to start the development server on http://localhost:3000.

Installing Cloudinary

Cloudinary provides a rich media management experience enabling users to upload, store, manage, manipulate, and deliver images and videos for websites and applications.

Due to the performance benefits of serving media from an optimized content delivery network, we'll use a video stored on Cloudinary. We'll use the cloudinary-react package to render Cloudinary videos on a page. We install cloudinary-react using npm by running the following command in the project's root directory:

npm i cloudinary-react

With the installation complete, we start the application using the command below:

npm run dev

Once run, the command spins up a local development server which we can access on http://localhost:3000.

Rendering the parent video component

In the “index.js” file in the “pages” folder, we replace the boilerplate code with the code below which renders a Cloudinary video.

import { Video, CloudinaryContext } from 'cloudinary-react';

export default function IndexPage() {
  return (
    <div className='App'>
      <h1>Create youtube-like `click to unmute` feature in Next.js</h1>
      <div className='video-area'>
        <CloudinaryContext cloudName='chukwutosin'>
          <Video
            publicId='Dog_Barking'
            controls
            muted
            width='500px'
            autoPlay
          />
        </CloudinaryContext>
      </div>
    </div>
  );
}

In the code above, we first import the Video component, which renders the video onto the page, and CloudinaryContext, a wrapper component that makes any props we pass it available to its child Cloudinary components.

The cloudName prop is our Cloudinary cloud name which we get from our Cloudinary dashboard.
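If we'd rather not hard-code the cloud name, one option (not part of the original demo) is to keep it in a public Next.js environment variable, using a hypothetical variable name in a .env.local file:

NEXT_PUBLIC_CLOUDINARY_CLOUD_NAME=chukwutosin

and then read it in the component:

<CloudinaryContext cloudName={process.env.NEXT_PUBLIC_CLOUDINARY_CLOUD_NAME}>

Next.js exposes variables prefixed with NEXT_PUBLIC_ to the browser, which is what we need here since the video renders on the client.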

We use the Video component to render a 500px-wide video player that has controls and is muted on playback. We then pass a prop called publicId, which is the public ID of the video we want to render. A public ID is a unique identifier for a media asset stored on Cloudinary.
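Behind the scenes, the cloud name and public ID together determine the delivery URL Cloudinary serves the asset from, which looks roughly like this:

https://res.cloudinary.com/<cloud_name>/video/upload/<public_id>.mp4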

The rendered video should look like this:

Accessing the methods and properties of the video

We can seamlessly access the methods and properties of the video element using the innerRef prop provided by the cloudinary-react package.

By creating a reference with React's useRef() hook and assigning it to the innerRef prop, we can use the properties and methods of the underlying video element: everything the element exposes becomes available on the .current property of the ref.

We create a ref and assign it with:

import { useRef } from 'react'; // change 1
import { Video, CloudinaryContext } from 'cloudinary-react';

export default function IndexPage() {
  const videoRef = useRef(); // change 2

  return (
    <div className='App'>
      <h1>Create youtube-like `click to unmute` feature in Next.js</h1>
      <div className='video-area'>
        <CloudinaryContext cloudName='chukwutosin'>
          <Video
            publicId='Dog_Barking'
            controls
            muted
            width='500px'
            innerRef={videoRef} // change 3
            autoPlay
          />
        </CloudinaryContext>
      </div>
    </div>
  );
}

Managing video events

When playback starts, the video is muted; once the unmute button is clicked, we update the state of the video element.

Using the HTML video element's properties and events, we'll change the state of the video player.

We create a function to handle the event change with:

import { Video, CloudinaryContext } from 'cloudinary-react';
import { useRef } from 'react';

export default function IndexPage() {
  const videoRef = useRef();

  const unmute = () => {
    // grab the underlying video element and turn its sound back on
    const video = videoRef.current;
    video.muted = false;
  };

  return (
    // render component here
  );
}

First, we create a function named unmute to handle the button click.

We then get the current video element with the .current property of the ref and set its muted property to false.
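As a small variation (not part of the original demo), the same property can be flipped each time the button is clicked, turning the handler into a mute/unmute toggle:

const toggleMute = () => {
  const video = videoRef.current;
  if (video) {
    // flip the muted property on the underlying video element
    video.muted = !video.muted;
  }
};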

Attaching the unmute function to a button

We start by creating a regular HTML button element labeled unmute and then adding an onClick handler like so:

import { useRef } from 'react';
import { Video, CloudinaryContext } from 'cloudinary-react';

export default function IndexPage() {
  ...

  return (
    <div className='App'>
      ...
      <div className='button-area'>
        <button type='button' onClick={() => unmute()}>
          unmute
        </button>
      </div>
      ...
  );
}

The unmute function is wrapped in an arrow function so it is only called when the button is clicked, not immediately when the component renders.
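Since unmute takes no arguments, passing the function reference directly to onClick would work just as well:

<button type='button' onClick={unmute}>
  unmute
</button>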

We need to style our app to align the different elements, especially the unmute button, which should sit on top of the video element in its top-left corner.

We do this by deleting the boilerplate styles in the globals.css file in the styles folder and adding the following styles:

.App {
  font-family: Cambria, Cochin, Georgia, Times, "Times New Roman", serif;
  text-align: center;
  margin: 0;
  padding: 0;
  width: auto;
  height: 100vh;
}
h3 {
  color: rgb(46, 44, 44);
}
.video-area {
  position: relative;
  width: 500px;
  margin: auto;
  left: calc(50% - 200px);
}
.button-area {
  left: 50%;
  margin-left: -250px;
  width: 55px;
  height: 20px;
  padding: 5px;
  position: absolute;
  z-index: 1;
  background: rgba(0, 0, 0, 0.6);
}
button {
  text-align: center;
  border: solid 1px rgb(255, 255, 255);
  width: 100%;
  height: 20px;
}

Next.js applies these styles globally rather than to a single component.
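That's because create-next-app already imports globals.css in pages/_app.js, making the classes available to every page; the file should look roughly like this:

import '../styles/globals.css';

function MyApp({ Component, pageProps }) {
  return <Component {...pageProps} />;
}

export default MyApp;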

After completing that step, we should see our app look like this:

Now, when the video starts playing it is muted, and as soon as the unmute button is clicked, the sound comes on.
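To get even closer to YouTube's behaviour, we could also hide the button once the sound is on. Below is a rough sketch (not part of the original demo) that listens for the video element's volumechange event and keeps the muted state in React state; the snippets go inside IndexPage:

import { useEffect, useRef, useState } from 'react';

// inside IndexPage, alongside videoRef:
const [isMuted, setIsMuted] = useState(true);

useEffect(() => {
  const video = videoRef.current;
  if (!video) return;
  // volumechange fires whenever the volume or muted property changes
  const onVolumeChange = () => setIsMuted(video.muted);
  video.addEventListener('volumechange', onVolumeChange);
  return () => video.removeEventListener('volumechange', onVolumeChange);
}, []);

Then, in the JSX, the button area is rendered only while the video is still muted:

{isMuted && (
  <div className='button-area'>
    <button type='button' onClick={() => unmute()}>
      unmute
    </button>
  </div>
)}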

Conclusion

This article showed how to use the video element's properties and events to render a video that plays muted and to unmute it when a button is clicked.

