Audio Streaming with Web Audio API and Cloudinary

Christian Nwamba

Once upon a time, if you wanted to play audio on the web, you needed Flash, QuickTime, or another plugin. Then came the HTML5 audio tag, which let you play audio without any plugins and added a few other capabilities. Essentially, if you wanted to embed sound on a web page or stream audio using the MediaStream API, the HTML5 audio tag could handle it. However, as technology keeps advancing and web browsers and applications are expected to handle more complex functionality, the audio tag shows its limits. Hence, the Web Audio API.

What is the Web Audio API?

The Web Audio API provides a system for controlling audio on the web, allowing us to choose audio sources, add audio effects, create audio visualizations, apply spatial effects, and much more. Some of the features of this high-level web API are:

  • Processing of audio sources from an audio or video media element.
  • Processing live audio input using a MediaStream from getUserMedia().
  • Spatialized audio supporting a wide range of 3D games and immersive environments
  • A wide range of very high quality room effects including:
    • Small / large room
    • Cathedral
    • Concert hall
    • Cave
    • Tunnel, etc.
  • Modular routing for simple or complex mixing/effect architectures.

Now that we have an overview of the Web Audio API, let’s see how it’s used.

Usage

It’s important to know that the Web Audio API doesn’t replace the audio tag; rather, it complements it. Which one you reach for depends on the context you’re working in: if you just want to control audio playback (operations such as volume, play/pause, previous/next), the audio tag is recommended. On the other hand, if you want to perform complex audio operations such as adding audio effects, streaming live audio input, and handling playback, the Web Audio API is better suited. Let’s briefly illustrate how the API works:

  • Firstly, create an Audio context. The Audio context gives us full access to the features and functionalities of the Web Audio API.
//javascript
const audioContext = new AudioContext();
  • Next, we will create an audio source and pass it into the audio context.
//javascript
<audio src="newTrack.mp3" id="audio"></audio>
// get the audio element
const audioElement = document.getElementById('audio');

// pass it into the audio context
const track = audioContext.createMediaElementSource(audioElement);

The createMediaElementSource() call creates a new MediaElementAudioSourceNode object from an existing HTML <audio> or <video> element, whose audio can then be played and manipulated through the audio graph.

  • Next, we can add effect nodes to the audio, such as a reverb, panner, etc.
//javascript
const pannerOptions = { pan: 0 };
const panner = new StereoPannerNode(audioContext, pannerOptions);
  • Finally, we connect our audio source to a destination, typically our speakers, routing it through any effect nodes along the way.
//javascript
track.connect(panner).connect(audioContext.destination);
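
Putting the steps above together, here is a minimal sketch. It assumes the <audio> element with the id audio from earlier plus a hypothetical play button with the id play-btn, since browsers keep an AudioContext suspended until a user gesture:

//javascript: a minimal sketch combining the steps above
const audioContext = new AudioContext();

// grab the <audio src="newTrack.mp3" id="audio"> element and turn it into a source node
const audioElement = document.getElementById('audio');
const track = audioContext.createMediaElementSource(audioElement);

// an effect node: a stereo panner, centered by default
const panner = new StereoPannerNode(audioContext, { pan: 0 });

// route the graph: source -> panner -> speakers
track.connect(panner).connect(audioContext.destination);

// hypothetical play button; playback should start from a user gesture
document.getElementById('play-btn').addEventListener('click', async () => {
  if (audioContext.state === 'suspended') {
    await audioContext.resume();
  }
  audioElement.play();
  panner.pan.value = -0.5; // pan slightly to the left so the effect is audible
});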

Now that we understand how the Web Audio API works, let’s go ahead and build a web app that needs audio functionality.

Pre-requisites

  • Knowledge of JavaScript and React
  • Node >v14 installed
  • A Cloudinary account. Sign in here
  • A Supabase account. Create one here.
  • A code editor, preferably VS Code

The complete code for this tutorial is on Codesandbox.

Getting Started

Let’s start by setting up a React project with create-react-app. Run this command:

#bash
npx create-react-app music-app

When it’s done installing, change directory into music-app and open it in your code editor. Go ahead and install the following packages:

npm i @chakra-ui/react @emotion/react @emotion/styled framer-motion

This adds Chakra UI to your project dependencies. Next, update your index.js to look like this:

#src/index.js
import React from 'react';
import ReactDOM from 'react-dom/client';
import App from './App';
import { ChakraProvider } from '@chakra-ui/react';

const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(
  <ChakraProvider>
    <React.StrictMode>
      <App />
    </React.StrictMode>
  </ChakraProvider>
);

Let’s go ahead and create our components. Firstly, create a components folder in the src directory and add these files: Upload.js, MusicList.js, and MusicPlayer.js.

Adding the Cloudinary Upload widget

We will be using the Cloudinary upload widget to add songs to our music list, and we’ll use the CDN build to integrate the widget into our app. We will add this CDN script using React Helmet. Helmet is a reusable React component that manages all of your changes to the document head. Install it by running npm i react-helmet.

Go ahead and modify your App.js to look like this:

//javascript
import { Box } from "@chakra-ui/react";
import MusicList from "./components/MusicList";
import "./App.css";
import { Helmet } from "react-helmet";
import Upload from "./components/Upload";

export default function App() {
  return (
    <Box width="1200px" margin="auto" padding="2rem">
      <Helmet>
        <meta charSet="UTF-8" />
        <script
          src="https://widget.cloudinary.com/v2.0/global/all.js"
          type="text/javascript"
        ></script>
      </Helmet>
      <Upload />
      <MusicList />
    </Box>
  );
}

The widget requires your Cloudinary upload_preset and cloud_name. Head over to your Cloudinary Settings, click on the Upload tab, and scroll down to Upload presets to create one.

I have already created some for myself; you can create an upload preset by clicking on the Add upload preset link. Take note of the preset name, as we’ll be using it soon. You can find your Cloudinary cloud name on the Dashboard tab.

Now go ahead and update your Upload.js file with the following lines of code:

//src/components/Upload.js

import { Box, Button } from "@chakra-ui/react";
import { useState } from 'react';

export default function Upload() {
  const [title, setTitle] = useState("");
  const [url, setUrl] = useState("");

  const openWidget = () => {
    // create the widget
    const widget = window.cloudinary.createUploadWidget(
      {
        cloudName: "<your cloud name>",
        uploadPreset: "<your upload preset>"
      },
      (error, result) => {
        if (error) {
          console.log(error);
        } else if (result.event === "success" && result.info.is_audio === true) {
          console.log(result);
        }
      }
    );
    widget.open(); // open up the widget after creation
  };

  return (
    <Box p="3" px="4" mb="3">
      <Button
        onClick={() => openWidget()}
        variant="solid"
        width="10rem"
        padding=".5rem"
        colorScheme='teal'
      >
        Upload Song
      </Button>
    </Box>
  );
}

Here, we have a button that triggers our upload widget to open. Save and run the app. You should see something like this when you click the upload button.

Awesome. Right now we can use the upload widget, but the uploads aren’t yet wired into our app. Let’s go ahead and save the music we upload in a Supabase DB.

Adding Supabase DB

Supabase describes itself as an open source alternative to Firebase. It provides all the backend services you need to build a product. For the purposes of this article, we’ll be using the DB service only. Go ahead and install the JavaScript package with npm i @supabase/supabase-js. Ensure you’ve created a Supabase account, then follow these steps:

  • Create a new project
  • Copy the anon public key
  • Copy the URL

Create a src/client.js file and add these lines of code:

// src/client.js
import { createClient } from "@supabase/supabase-js";

const URL = process.env.REACT_APP_SUPABASE_URL;
const PUBLIC_SECRET = process.env.REACT_APP_SUPABASE_PUBLIC_SECRET;

export const supabase = createClient(URL, PUBLIC_SECRET);

Create a .env file and add the required values.
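As a sketch, the .env file would look something like this; the values below are placeholders, so use the URL and anon public key from your own Supabase project’s API settings, and restart the dev server afterwards so create-react-app picks up the new variables:

# .env (placeholder values)
REACT_APP_SUPABASE_URL=https://your-project-ref.supabase.co
REACT_APP_SUPABASE_PUBLIC_SECRET=your-anon-public-key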

Create a table in Supabase and call it any name of your choice; I called mine audio-player. Also create columns for url and title. You should have a screen that looks like this:

Now, we will go ahead and update our components/Upload.js file to look like this:

// src/components/Upload.js

import { Box, Button } from "@chakra-ui/react";
import { supabase } from "../client";
import { useState } from 'react';

export default function Upload() {
  const [title, setTitle] = useState("");
  const [url, setUrl] = useState("");

  const openWidget = () => {
    // create the widget
    const widget = window.cloudinary.createUploadWidget(
      {
        cloudName: "sammy365",
        uploadPreset: "tutorial"
      },
      (error, result) => {
        if (error) {
          console.log(error);
        } else if (result.event === "success" && result.info.is_audio === true) {
          setUrl(result.info.secure_url);
          setTitle(result.info.original_filename);
          console.log(result);
        }
      }
    );
    widget.open(); // open up the widget after creation
  };

  const createSong = async () => {
    await supabase
      .from("audio-player")
      .insert([
        {
          url, title
        }
      ]).single();
  };

  if (url && title) {
    createSong();
  }

  return (
    <Box p="3" px="4" mb="3">
      <Button
        onClick={() => openWidget()}
        variant="solid"
        width="10rem"
        padding=".5rem"
        colorScheme='teal'
      >
        Upload Song
      </Button>
    </Box>
  );
}

Here, we have a createSong() function that inserts the values of url and title into the table. Let’s now go ahead and create the component that retrieves and displays the songs we saved in our DB. Navigate to your MusicList.js and add these lines of code:

// src/components/MusicList.js

import { Box, Center, Heading } from "@chakra-ui/react";
import MusicPlayer from "./MusicPlayer";
import { supabase } from "../client";
import { useState, useEffect } from 'react';

export default function MusicList() {
  const [music, setMusic] = useState([]);

  const fetchSongs = async () => {
    const { data } = await supabase.from("audio-player").select();
    setMusic(data);
  };

  useEffect(() => {
    fetchSongs();
  }, []);

  return (
    <>
      <Heading as="h2"> Uploaded Songs </Heading>
      <Box display="grid" gridTemplateColumns='repeat(3, 1fr)' gridGap={4} mt="8">
        {music.map((m, key) => (
          <MusicPlayer key={key} music={m} index={key} />
        ))}
      </Box>
      <Center>
        {
          music.length === 0 ?
            <Box>
              <Heading as="h6" size="lg"> No song has been uploaded</Heading>
            </Box>
            : ""
        }
      </Center>
    </>
  );
}

Here, we are simply fetching music data from our Supabase table and passing it to a MusicPlayer component. Let’s go ahead and create the MusicPlayer component. It uses moment to format dates, so install it with npm i moment, then add the following lines of code in components/MusicPlayer.js:

// components/MusicPlayer.js

import moment from "moment";
import { Box, Heading, Spacer } from "@chakra-ui/react";

export default function MusicPlayer({ music, index }) {
  return (
    <Box key={index} border='1px' borderColor='gray.200' boxShadow='lg' p="6">
      <Box>
        <Spacer mt="6">
          {moment(music.created_at).format("MMMM Do YYYY")}
        </Spacer>
      </Box>
      <Box mt="8">
        <Heading as="h3" size="lg" mb="4"> {music.title} </Heading>
        <audio
          src={music.url}
          controls
        />
      </Box>
    </Box>
  );
}

Here, we are rendering the data we retrieved from our DB table. You may also notice that we use the HTML5 audio tag, rather than the Web Audio API, as our music player. You can now go ahead and save and run the app.
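
As an optional aside, if you wanted to route this player through the Web Audio API from the first part of the article (for example, to reuse the panner effect), a minimal sketch of MusicPlayer might look like the one below. This isn’t part of the tutorial’s code: the Chakra layout is trimmed for brevity, and the wired ref guards against wiring the same element twice, since createMediaElementSource() throws if it is called again on an element that is already connected.

// components/MusicPlayer.js: a hypothetical Web Audio API variant (sketch only)
import { useRef, useEffect } from "react";
import { Box, Heading } from "@chakra-ui/react";

export default function MusicPlayer({ music }) {
  const audioRef = useRef(null);
  const wired = useRef(false);

  useEffect(() => {
    const el = audioRef.current;
    if (!el || wired.current) return;
    wired.current = true;

    // element source -> panner -> speakers, as in the Usage section
    const audioContext = new AudioContext();
    const track = audioContext.createMediaElementSource(el);
    const panner = new StereoPannerNode(audioContext, { pan: 0 });
    track.connect(panner).connect(audioContext.destination);

    // the context may start suspended until a user gesture, so resume it when playback starts
    el.addEventListener("play", () => {
      if (audioContext.state === "suspended") audioContext.resume();
    });
  }, []);

  return (
    <Box border="1px" borderColor="gray.200" boxShadow="lg" p="6">
      <Heading as="h3" size="lg" mb="4">{music.title}</Heading>
      <audio ref={audioRef} src={music.url} controls />
    </Box>
  );
}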

https://www.dropbox.com/s/ablauh3ezea0rpx/screencast-nimbus-capture-2022.07.19-16_05_21.webm?dl=0

Awesome. If you look closely, there are two problems:

  1. On our Supabase table, 2 rows are created instead of one.
  2. When we click Done on the Cloudinary widget, our app should show the uploaded file right away; instead, we have to refresh manually to see the uploaded files.

I’ll encourage you to take a look at these problems and solve them.
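
If you would like a hint for the first problem: calling createSong() during render means it can run more than once for a single upload (React re-renders when state changes, and StrictMode renders components twice in development). One possible approach, shown here only as a sketch, is to do the insert inside the widget’s success callback so it runs exactly once per upload; the second problem can then be tackled by re-fetching the songs (or lifting fetchSongs out of MusicList) once the insert succeeds.

//javascript: one possible fix for the double insert (a sketch, not the only solution)
const openWidget = () => {
  const widget = window.cloudinary.createUploadWidget(
    {
      cloudName: "sammy365",
      uploadPreset: "tutorial"
    },
    async (error, result) => {
      if (error) {
        console.log(error);
        return;
      }
      if (result.event === "success" && result.info.is_audio === true) {
        // insert directly here instead of setting state and inserting during render
        await supabase
          .from("audio-player")
          .insert([{ url: result.info.secure_url, title: result.info.original_filename }])
          .single();
        // this is also a good place to trigger a re-fetch of the song list
      }
    }
  );
  widget.open(); // open up the widget after creation
};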

Conclusion

In this post, we learned about the Web Audio API and how it works. We then went on to build a music app with React, Cloudinary, and Supabase. I hope you’ve learned something valuable from this piece.

Happy Coding!

Christian Nwamba

Developer Advocate at AWS

A software engineer and developer advocate. I love to research and talk about web technologies and how to delight customers with them.