A music composition and playback application built with Next.js, React, and the Web Audio API. This app allows users to create, edit, and play musical notes in a timeline-based interface. The project is an exploration of turning the theory of music into actionable notes, leveraging the multi-modal capabilities of LLMs (Large Language Models) to bridge the gap between abstract musical concepts and concrete, playable compositions.
The Music App aims to democratize music composition by providing an intuitive, interactive, and extensible platform for both beginners and experienced musicians. By decoupling the core logic from the UI and audio synthesis, the app is designed to be flexible and adaptable, enabling future integration with advanced tools like LLMs and support for multiple platforms (web, mobile, desktop).
The ultimate goal is to create a tool that not only simplifies music creation but also serves as an educational platform, helping users understand music theory through hands-on experimentation and real-time feedback.
| Feature | Status | Notes |
|---|---|---|
| Playback Control | ✅ Completed | Play, pause, stop, and seek functionality implemented. |
| Note Management | 🚧 In Progress | Adding notes is done; removing and editing notes are in progress. |
| - Add Notes | ✅ Completed | Users can add notes by clicking on the canvas. |
| - Remove Notes | 🚧 In Progress | Basic removal is in progress. |
| - Edit Notes | 🚧 In Progress | Editing functionality is under development. |
| Audio Synthesis | ✅ Completed | Uses the Web Audio API with ADSR envelopes. |
| - Velocity Support | 🚧 In Progress | Higher-level abstractions for velocity are planned. |
| Real-Time Visualization | ✅ Completed | Canvas-based UI with timeline, notes, and playback progress. |
| Interactive UI | ✅ Completed | Add notes via the canvas and control playback with buttons. |
| JSON Editing | 🚧 In Progress | Basic JSON editing is implemented; improvements planned. |
| LLM Integration | ⏳ Primary | Use LLMs to generate musical ideas and transform theory into notes. |
| Multi-Platform Support | ⏳ Maybe | Explore mobile (React Native) and desktop (Electron/Tauri) support. |
| Advanced Audio Features | ⏳ Primary | Add effects (reverb, delay) and support for multiple waveforms. |
| Educational Features | ⏳ Unlikely | Tutorials, scales, chords, and intervals visualization. |
| Collaboration Tools | ⏳ Unlikely | Real-time collaboration and cloud storage for compositions. |
Legend:
- ✅ Completed
- 🚧 In Progress
- ⏳ Planned (the status notes a priority: Primary, Maybe, or Unlikely)
The app is designed with Clean Architecture in mind, separating concerns into distinct layers:
- Core layer: Contains the business logic for playback, note management, and timeline control.
  - `PlaybackUseCase.ts`: Implements the core algorithms for scheduling notes, handling playback states, and managing the timeline. It adheres to the `PlaybackGateway` interface, ensuring that the core logic is decoupled from external systems.
  - `gateway.ts`:
    - `PlaybackGateway`: Defines the interface that `PlaybackUseCase` implements. It abstracts the core playback functionality, such as play, pause, stop, seek, and note management.
    - `PlaybackDependencies`: Defines the interface for injecting dependencies into `PlaybackUseCase`, such as the synthesizer (`SynthGateway`) and the timeline duration. This ensures that the core logic remains independent of specific implementations.
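To make these contracts concrete, here is a minimal sketch of what the interfaces might look like. The method names and the `Note` shape are assumptions inferred from the descriptions above, not the actual source.

```typescript
// Hypothetical sketch of the core contracts; method and field names
// are assumptions inferred from the descriptions above.
interface Note {
  pitch: number;     // e.g. a MIDI note number
  startTime: number; // seconds from the start of the timeline
  duration: number;  // seconds
}

interface SynthGateway {
  // Schedule a note to sound at an absolute time on the timeline.
  scheduleNote(note: Note, whenSeconds: number): void;
  stopAll(): void;
}

interface PlaybackDependencies {
  synth: SynthGateway;             // injected audio backend
  timelineDurationSeconds: number; // total timeline length
}

interface PlaybackGateway {
  play(): void;
  pause(): void;
  stop(): void;
  seek(timeSeconds: number): void;
  addNote(note: Note): void;
}
```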
- Adapter layer: Bridges the core logic with the UI and external systems.
  - `context.tsx`: Provides a React context (`PlaybackContext`) to share playback state and controls across components. It acts as the glue between the core logic and the React UI implementation.
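A rough sketch of what that context might expose is shown below; the exact value shape in `context.tsx` is an assumption.

```typescript
// Hypothetical shape of PlaybackContext; the real context.tsx may differ.
import { createContext, useContext } from "react";

type PlaybackStatus = "playing" | "paused" | "stopped"; // assumed variants

interface PlaybackContextValue {
  status: PlaybackStatus;
  play(): void;
  pause(): void;
  stop(): void;
  seek(timeSeconds: number): void;
}

export const PlaybackContext = createContext<PlaybackContextValue | null>(null);

// Convenience hook so components fail fast outside the provider.
export function usePlayback(): PlaybackContextValue {
  const ctx = useContext(PlaybackContext);
  if (!ctx) {
    throw new Error("usePlayback must be used within a PlaybackContext provider");
  }
  return ctx;
}
```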
- Infrastructure layer: Implements the details of external systems like the Web Audio API and React hooks.
  - `WebAudioSynth.ts`: Implements the `SynthGateway` interface using the Web Audio API. It handles audio synthesis, including note scheduling and ADSR envelopes.
  - `useAnimationFrame.ts`: A custom hook for running animations at ~60 fps.
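For context, an ADSR (attack, decay, sustain, release) envelope shapes a note's volume over time. A minimal sketch with the Web Audio API might look like the following; the envelope values and the function signature are illustrative assumptions, not the actual `WebAudioSynth.ts` code.

```typescript
// Minimal ADSR envelope sketch using the Web Audio API; the envelope
// values and function shape are assumptions, not WebAudioSynth.ts itself.
function playNote(
  ctx: AudioContext,
  frequencyHz: number,
  startTime: number,
  durationSeconds: number,
): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = frequencyHz;
  osc.connect(gain).connect(ctx.destination);

  // ADSR: ramp up to the peak, decay to the sustain level,
  // hold for the note's duration, then release to silence.
  const attack = 0.02;
  const decay = 0.1;
  const sustain = 0.6;
  const release = 0.3;
  const g = gain.gain;
  g.setValueAtTime(0, startTime);
  g.linearRampToValueAtTime(1, startTime + attack);
  g.linearRampToValueAtTime(sustain, startTime + attack + decay);
  g.setValueAtTime(sustain, startTime + durationSeconds);
  g.linearRampToValueAtTime(0, startTime + durationSeconds + release);

  osc.start(startTime);
  osc.stop(startTime + durationSeconds + release);
}
```

A fuller implementation would also scale the peak gain by note velocity, which maps to the in-progress velocity work noted in the feature table.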
- UI layer: Contains the React components that render the user interface.
  - `MusicUI.tsx`: The main UI component for interacting with the timeline.
  - `MusicCompositionUI.tsx`: An alternative UI for editing notes as JSON.
  - This layer is framework-specific and depends on React for rendering.
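As a rough illustration of the JSON-editing workflow, a composition might be represented as an array of notes like this; the field names are assumptions mirroring the `Note` sketch above, not the app's documented schema.

```typescript
// Illustrative note data; field names are assumed, not the official schema.
const composition = [
  { pitch: 60, startTime: 0.0, duration: 0.5 }, // C4
  { pitch: 64, startTime: 0.5, duration: 0.5 }, // E4
  { pitch: 67, startTime: 1.0, duration: 1.0 }, // G4
];
```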
- Shared layer: Contains shared types, constants, and utilities used across layers.
  - `types.ts`: Defines shared types like `Note`, `PlaybackStatus`, and `PlaybackTime`.
  - `constants.ts`: Contains constants like `TIMELINE_DURATION_SECONDS`.
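One plausible shape for these shared definitions is sketched below; the field names and the timeline length are assumptions, and the authoritative versions live in `types.ts` and `constants.ts`.

```typescript
// Assumed shapes; see the source for the authoritative definitions.
export type PlaybackStatus = "playing" | "paused" | "stopped";

export interface PlaybackTime {
  seconds: number; // current position on the timeline
}

export interface Note {
  pitch: number;     // e.g. a MIDI note number
  startTime: number; // seconds from the start of the timeline
  duration: number;  // seconds
  velocity?: number; // 0..1; optional while velocity support is in progress
}

// Placeholder value; the real constant lives in constants.ts.
export const TIMELINE_DURATION_SECONDS = 30;
```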
The Clean Architecture approach was chosen to ensure that the app remains flexible and maintainable. By decoupling the core logic from the UI and audio synthesis, the app can easily adapt to new environments or technologies. For example:
- Replace React: The core logic is independent of React, so the UI could be rebuilt with raw JavaScript, Vue, Svelte, or even a mobile framework like React Native.
- Replace Web Audio API: The synthesizer implementation is abstracted behind the `SynthGateway` interface, making it easy to swap out the Web Audio API for another audio synthesis library or API.
- Extend to LLM Integration: The core logic can be extended to integrate with LLMs for generating musical ideas or transforming theoretical concepts into playable notes.
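To illustrate the second point, a silent test double could stand in for the Web Audio implementation. `SilentSynth` is a hypothetical name, and the `SynthGateway` shape is the one sketched earlier:

```typescript
// Hypothetical test double: satisfies SynthGateway without producing audio.
// Useful for unit tests or server-side rendering, where no AudioContext exists.
class SilentSynth implements SynthGateway {
  scheduleNote(note: Note, whenSeconds: number): void {
    console.log(`note ${note.pitch} scheduled at ${whenSeconds}s (silent)`);
  }
  stopAll(): void {
    // Nothing to stop.
  }
}

// The core logic is untouched; only the injected dependency changes, e.g.:
// new PlaybackUseCase({ synth: new SilentSynth(), timelineDurationSeconds: 30 });
```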
Prerequisites:
- Node.js (v16 or higher)
- npm or yarn
1. Clone the repository:

   ```bash
   git clone https://github.com/Vanuan/music-app.git
   cd music-app
   ```

2. Install dependencies:

   ```bash
   npm install # or yarn install
   ```

3. Start the development server:

   ```bash
   npm run dev # or yarn dev
   ```

4. Open your browser and navigate to `http://localhost:3000`.
Contributions are welcome! Please follow these steps:
- Fork the repository.
- Create a new branch (`git checkout -b feature/YourFeature`).
- Commit your changes (`git commit -m 'Add some feature'`).
- Push to the branch (`git push origin feature/YourFeature`).
- Open a pull request.