Building an Audio Player App with Next.js

Asharib Ali
8 min read · Oct 7, 2024


Hey everyone, I hope you’re all doing well and enjoying your coding journey. Today marks the 30th and final day of the 30 Days of 30 Projects challenge. Our 30th project is a mini Next.js application: an audio player. Please make sure to clap and comment on this blog post; it motivates me to create more content like this. Let’s get started right away.

Overview of the Mini Next.js Application

Our Audio Player application allows users to:

  • Upload and play audio files
  • Control playback with basic controls (play, pause, skip to the next or previous track)
  • See the current track’s title and playback time

Tech Stack Used:

  • Next.js: A React framework for building full-stack web applications.
  • React: A JavaScript library for building user interfaces.
  • Tailwind CSS: A utility-first CSS framework for styling.
  • Shadcn UI: Beautifully designed Tailwind CSS components that you can copy and paste into your application.
  • Vercel: For deploying the Next.js web application.

Initialize a Next.js Project

Start by following this guide to set up the new project.
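
If you haven’t set it up yet, the commands usually look roughly like this (the exact prompts and the shadcn CLI invocation may differ between versions, so follow the linked guide if anything doesn’t match):

npx create-next-app@latest audio-player
cd audio-player
npx shadcn@latest init
npx shadcn@latest add button card progress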

  • Go to the “components” folder.
  • Create a new file named “audio-player.tsx”.
  • This file will manage the entire functionality of the project.

We will go through the code step-by-step to make it easy to understand.

Component Breakdown

Import Statements

"use client";

import React, { useState, useRef, useEffect } from "react";
import { Button } from "@/components/ui/button";
import { Card, CardContent } from "@/components/ui/card";
import { Progress } from "@/components/ui/progress";
import {
  ForwardIcon,
  PlayIcon,
  RewindIcon,
  UploadIcon,
  PauseIcon,
} from "lucide-react";
import Image from "next/image";

These import statements include React hooks, various custom components from Shadcn UI, and icons from Lucide React for building the user interface.

Define Types and Component State

interface AudioPlayerComponentProps {}

interface Track {
  title: string;
  artist: string;
  src: string;
}

const AudioPlayerComponent: React.FC<AudioPlayerComponentProps> = () => {
  const [tracks, setTracks] = useState<Track[]>([]);
  const [currentTrackIndex, setCurrentTrackIndex] = useState<number>(0);
  const [isPlaying, setIsPlaying] = useState<boolean>(false);
  const [progress, setProgress] = useState<number>(0);
  const [currentTime, setCurrentTime] = useState<number>(0);
  const [duration, setDuration] = useState<number>(0);
  const audioRef = useRef<HTMLAudioElement | null>(null);

This component manages state for the list of tracks, the current track index, play/pause status, progress, current time, and duration of the audio track using React’s useState hook. It also uses a ref to manage the audio element.

Handle File Upload

const handleUpload = (event: React.ChangeEvent<HTMLInputElement>) => {
  const files = event.target.files;
  if (files) {
    const newTracks: Track[] = Array.from(files).map((file) => ({
      title: file.name,
      artist: "Unknown Artist",
      src: URL.createObjectURL(file),
    }));
    setTracks((prevTracks) => [...prevTracks, ...newTracks]);
  }
};

This function handles file uploads, creates new track objects from the uploaded files, and updates the state with the new tracks.
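
Each uploaded file is turned into a temporary blob: URL by URL.createObjectURL, and those URLs keep their memory alive until the page is unloaded. As an optional refinement (a sketch, not part of the component above; tracksRef is an added helper), you could revoke them when the component unmounts:

// Optional cleanup sketch: revoke blob URLs created for uploaded tracks.
// A ref mirrors the latest tracks so the unmount cleanup can see them.
const tracksRef = useRef<Track[]>([]);
tracksRef.current = tracks;

useEffect(() => {
  return () => {
    tracksRef.current.forEach((track) => URL.revokeObjectURL(track.src));
  };
}, []);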

Handle Play/Pause Toggle

const handlePlayPause = () => {
  if (isPlaying) {
    audioRef.current?.pause();
    setIsPlaying(false);
  } else {
    audioRef.current?.play();
    setIsPlaying(true);
  }
};

This function toggles the play/pause state of the audio player.
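
Note that play() on an HTMLMediaElement returns a Promise that can reject, for example when no source is loaded yet or the browser blocks playback. A slightly more defensive version of the same toggle (a sketch, not the exact code used in this project) could look like this:

const handlePlayPause = () => {
  const audio = audioRef.current;
  if (!audio) return;
  if (isPlaying) {
    audio.pause();
    setIsPlaying(false);
  } else {
    // play() returns a Promise; only flip the flag once playback actually starts
    audio
      .play()
      .then(() => setIsPlaying(true))
      .catch(() => setIsPlaying(false));
  }
};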

Handle Track Navigation

const handleNextTrack = () => {
  setCurrentTrackIndex((prevIndex) => (prevIndex + 1) % tracks.length);
};

const handlePrevTrack = () => {
  setCurrentTrackIndex((prevIndex) =>
    prevIndex === 0 ? tracks.length - 1 : prevIndex - 1
  );
};

These functions handle navigation to the next and previous tracks.
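
One edge case worth knowing: if no tracks have been uploaded yet, tracks.length is 0 and the modulo expression evaluates to NaN. A small guard (an optional addition, not in the code above) avoids that:

const handleNextTrack = () => {
  if (tracks.length === 0) return; // nothing to skip to yet
  setCurrentTrackIndex((prevIndex) => (prevIndex + 1) % tracks.length);
};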

Handle Time Updates and Metadata

const handleTimeUpdate = () => {
  if (audioRef.current) {
    setCurrentTime(audioRef.current.currentTime);
    setProgress(
      (audioRef.current.currentTime / audioRef.current.duration) * 100
    );
  }
};

const handleLoadedMetadata = () => {
  if (audioRef.current) {
    setDuration(audioRef.current.duration);
  }
};

These functions update the current time and progress bar as the track plays, and set the track’s duration once its metadata has loaded.
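
Since the audio element’s currentTime is writable, you could also let listeners seek by clicking the progress bar. Below is a hypothetical handleSeek helper (not part of the component above) that you would attach to a wrapper div around the Progress component:

// Hypothetical seek handler: jump to the clicked position on the progress bar
const handleSeek = (event: React.MouseEvent<HTMLDivElement>) => {
  const audio = audioRef.current;
  if (!audio || !duration) return;
  const rect = event.currentTarget.getBoundingClientRect();
  const ratio = (event.clientX - rect.left) / rect.width;
  audio.currentTime = ratio * duration;
};

// Usage idea: <div onClick={handleSeek}><Progress value={progress} /></div>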

Format Time

const formatTime = (time: number) => {
  const minutes = Math.floor(time / 60);
  const seconds = Math.floor(time % 60);
  return `${minutes}:${seconds < 10 ? "0" : ""}${seconds}`;
};

This function formats a time given in seconds as minutes:seconds, padding the seconds with a leading zero; for example, 75 seconds becomes “1:15”.

Handle Track Change

useEffect(() => {
  if (audioRef.current) {
    audioRef.current.pause();
    audioRef.current.src = tracks[currentTrackIndex]?.src || "";
    audioRef.current.load();
    audioRef.current.currentTime = 0;
    setCurrentTime(0);
    setProgress(0);
    if (isPlaying) {
      audioRef.current.play();
    }
  }
}, [currentTrackIndex, tracks, isPlaying]);

This useEffect hook runs whenever the current track changes: it pauses the audio element, loads the new track’s source, resets the progress and current time, and resumes playback if a track was already playing.
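
One thing to watch: because isPlaying and tracks are also in the dependency array, this effect re-runs when you pause and resume, which reloads the source and restarts the track from the beginning. If that bothers you, one possible refinement (a sketch, not the code used in this project) is to scope the effect to track changes only:

useEffect(() => {
  const audio = audioRef.current;
  if (!audio || tracks.length === 0) return;
  audio.pause();
  audio.src = tracks[currentTrackIndex]?.src || "";
  audio.load();
  setCurrentTime(0);
  setProgress(0);
  if (isPlaying) {
    audio.play();
  }
  // isPlaying is intentionally omitted so pausing/resuming doesn't reset the track
  // eslint-disable-next-line react-hooks/exhaustive-deps
}, [currentTrackIndex, tracks]);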

Render the Audio Player UI

return (
  <div className="flex flex-col items-center justify-center h-screen bg-background text-foreground">
    <div className="max-w-md w-full space-y-4">
      <div className="flex items-center justify-between">
        <h1 className="text-2xl font-bold">Audio Player</h1>
        <label className="flex items-center cursor-pointer">
          <UploadIcon className="w-5 h-5 mr-2" />
          <span>Upload</span>
          <input
            type="file"
            accept="audio/*"
            multiple
            className="hidden"
            onChange={handleUpload}
          />
        </label>
      </div>
      <Card>
        <CardContent className="flex flex-col items-center justify-center gap-4 p-8">
          <Image
            src="/music.svg"
            alt="Album Cover"
            width={100}
            height={100}
            className="rounded-full w-32 h-32 object-cover"
          />
          <div className="text-center">
            <h2 className="text-xl font-bold">
              {tracks[currentTrackIndex]?.title || "Audio Title"}
            </h2>
            <p className="text-muted-foreground">
              {tracks[currentTrackIndex]?.artist || "Person Name"}
            </p>
          </div>
          <div className="w-full">
            <Progress value={progress} />
            <div className="flex justify-between text-sm text-muted-foreground">
              <span>{formatTime(currentTime)}</span>
              <span>{formatTime(duration)}</span>
            </div>
          </div>
          <div className="flex items-center gap-4">
            <Button variant="ghost" size="icon" onClick={handlePrevTrack}>
              <RewindIcon className="w-6 h-6" />
            </Button>
            <Button variant="ghost" size="icon" onClick={handlePlayPause}>
              {isPlaying ? (
                <PauseIcon className="w-6 h-6" />
              ) : (
                <PlayIcon className="w-6 h-6" />
              )}
            </Button>
            <Button variant="ghost" size="icon" onClick={handleNextTrack}>
              <ForwardIcon className="w-6 h-6" />
            </Button>
          </div>
          <audio
            ref={audioRef}
            onTimeUpdate={handleTimeUpdate}
            onLoadedMetadata={handleLoadedMetadata}
          />
        </CardContent>
      </Card>
    </div>
  </div>
);
};

export default AudioPlayerComponent;

This snippet renders the Audio Player UI: a file upload control (a hidden file input wrapped in a clickable label), the album-art placeholder, track details, the progress bar with timestamps, and the playback controls.

(Bonus just for you): Full Code with Comments

"use client"; // Enables client-side rendering for this component

import React, { useState, useRef, useEffect } from "react"; // Import React hooks
import { Button } from "@/components/ui/button"; // Import custom Button component
import { Card, CardContent } from "@/components/ui/card"; // Import custom Card components
import { Progress } from "@/components/ui/progress"; // Import custom Progress component
import {
ForwardIcon,
PlayIcon,
RewindIcon,
UploadIcon,
PauseIcon,
} from "lucide-react"; // Import icons from lucide-react
import Image from "next/image"; // Import Next.js Image component

// Define types for the component props and state
interface AudioPlayerProps {}

// Define the Track interface
interface Track {
title: string;
artist: string;
src: string;
}

const AudioPlayer: React.FC<AudioPlayerProps> = () => {
const [tracks, setTracks] = useState<Track[]>([]); // State to manage the list of tracks
const [currentTrackIndex, setCurrentTrackIndex] = useState<number>(0); // State to manage the current track index
const [isPlaying, setIsPlaying] = useState<boolean>(false); // State to manage the play/pause status
const [progress, setProgress] = useState<number>(0); // State to manage the progress of the current track
const [currentTime, setCurrentTime] = useState<number>(0); // State to manage the current time of the track
const [duration, setDuration] = useState<number>(0); // State to manage the duration of the track
const audioRef = useRef<HTMLAudioElement | null>(null); // Ref to manage the audio element

// Function to handle file upload
const handleUpload = (event: React.ChangeEvent<HTMLInputElement>) => {
const files = event.target.files;
if (files) {
const newTracks: Track[] = Array.from(files).map((file) => ({
title: file.name,
artist: "Unknown Artist",
src: URL.createObjectURL(file),
}));
setTracks((prevTracks) => [...prevTracks, ...newTracks]);
}
};

// Function to handle play/pause toggle
const handlePlayPause = () => {
if (isPlaying) {
audioRef.current?.pause();
setIsPlaying(false);
} else {
audioRef.current?.play();
setIsPlaying(true);
}
};

// Function to handle next track
const handleNextTrack = () => {
setCurrentTrackIndex((prevIndex) => (prevIndex + 1) % tracks.length);
};

// Function to handle previous track
const handlePrevTrack = () => {
setCurrentTrackIndex((prevIndex) =>
prevIndex === 0 ? tracks.length - 1 : prevIndex - 1
);
};

// Function to handle time update of the track
const handleTimeUpdate = () => {
if (audioRef.current) {
setCurrentTime(audioRef.current.currentTime);
setProgress(
(audioRef.current.currentTime / audioRef.current.duration) * 100
);
}
};

// Function to handle metadata load of the track
const handleLoadedMetadata = () => {
if (audioRef.current) {
setDuration(audioRef.current.duration);
}
};

// Function to format time in minutes and seconds
const formatTime = (time: number) => {
const minutes = Math.floor(time / 60);
const seconds = Math.floor(time % 60);
return `${minutes}:${seconds < 10 ? "0" : ""}${seconds}`;
};

// useEffect to handle track change
useEffect(() => {
if (audioRef.current) {
audioRef.current.pause();
audioRef.current.src = tracks[currentTrackIndex]?.src || "";
audioRef.current.load();
audioRef.current.currentTime = 0;
setCurrentTime(0); // Reset the current time for the new track
setProgress(0); // Reset the progress for the new track
if (isPlaying) {
audioRef.current.play();
}
}
}, [currentTrackIndex, tracks, isPlaying]);

// JSX return statement rendering the Audio Player UI
return (
<div className="flex flex-col items-center justify-center h-screen bg-background text-foreground">
<div className="max-w-md w-full space-y-4">
<div className="flex items-center justify-between">
<h1 className="text-2xl font-bold">Audio Player</h1>
<label className="flex items-center cursor-pointer">
<UploadIcon className="w-5 h-5 mr-2" />
<span>Upload</span>
<input
type="file"
accept="audio/*"
multiple
className="hidden"
onChange={handleUpload}
/>
</label>
</div>
<Card>
<CardContent className="flex flex-col items-center justify-center gap-4 p-8">
<Image
src="/music.svg"
alt="Album Cover"
width={100}
height={100}
className="rounded-full w-32 h-32 object-cover"
/>
<div className="text-center">
<h2 className="text-xl font-bold">
{tracks[currentTrackIndex]?.title || "Audio Title"}
</h2>
<p className="text-muted-foreground">
{tracks[currentTrackIndex]?.artist || "Person Name"}
</p>
</div>
<div className="w-full">
<Progress value={progress} />
<div className="flex justify-between text-sm text-muted-foreground">
<span>{formatTime(currentTime)}</span>
<span>{formatTime(duration)}</span>
</div>
</div>
<div className="flex items-center gap-4">
<Button variant="ghost" size="icon" onClick={handlePrevTrack}>
<RewindIcon className="w-6 h-6" />
</Button>
<Button variant="ghost" size="icon" onClick={handlePlayPause}>
{isPlaying ? (
<PauseIcon className="w-6 h-6" />
) : (
<PlayIcon className="w-6 h-6" />
)}
</Button>
<Button variant="ghost" size="icon" onClick={handleNextTrack}>
<ForwardIcon className="w-6 h-6" />
</Button>
</div>
<audio
ref={audioRef}
onTimeUpdate={handleTimeUpdate}
onLoadedMetadata={handleLoadedMetadata}
/>
</CardContent>
</Card>
</div>
</div>
);
};

export default AudioPlayer;

Okay, you’ve completed the main component with a functional UI. Now import it in the app directory so it renders on the home route, in app/page.tsx. Your final code should look like this:

import AudioPlayer from "@/components/audio-player";

export default function Home() {
  return (
    <div>
      <AudioPlayer />
    </div>
  );
}

Running the Project

To see the audio player in action, follow these steps:

  1. Start the Development Server: Run npm run dev to start the development server.
  2. Open in Browser: Open http://localhost:3000 in your browser to view the application.

Make sure to test everything thoroughly so that no errors surface in production mode, that is, once the app is hosted on the internet.
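
A quick way to catch production-only issues locally is to run the optimized build before deploying; these scripts come with every create-next-app project:

npm run build
npm run start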

Now we want people to see our application on the internet. All you have to do is create a repository on GitHub, push your code to it, and then deploy the Audio Player application using Vercel.

Once you’re done with the deployment, please share the application link with me by commenting on this blog post, on LinkedIn, and, most importantly, on X (formerly Twitter). Tag me there, and I’ll reply and appreciate your efforts. 👀

(Optional): One thing you can do on your own is add new functionality, enhance the styling, and improve the overall application. That way, you’ll learn something new by making your own modifications.

✨ Star the GitHub repository of the project 👈

Conclusion

In this blog post, we built an Audio Player application using Next.js. We covered:

  • Setting up the project and using client-side rendering.
  • Handling file uploads and managing audio playback.
  • Displaying track details and playback controls.
  • Managing state and user interactions in a React application.

Alhamdulillah, we’ve completed the 30 Days of 30 Projects challenge.

Happy coding!

Stay updated with the latest in cutting-edge technology! Follow me for more.

Thanks for reading!

Written by Asharib Ali

✨ I build & teach about AI and Blockchain stuff ⚡
