Blind people cannot concurrently "read and play" a score, but they have remarkable tactile and auditory abilities. This project focuses on playing the time beats and the notes of a score on mobile devices through their vibration functionality.
Notes Sound Generator is a MuseScore 3.x plugin for blind people; it reads the current score (which can be edited with MuseScore itself) and extracts, at playing time, the notes and the time signature of the score.
My solution uses the MuseScore plugin development stack, the AlphaTab JS library to extract the notes and the time signature from a score, and the WebSocket protocol to send data to mobile devices.
The score is a MusicXML file generated by the developed MuseScore plugin: the plugin exports the current score (possibly edited with the program itself) in MusicXML format and hands it to a webpage where an instance of AlphaTab runs. The webpage, exploiting the library's low-level APIs, plays the score and makes mobile devices vibrate on the time beats and on the currently played notes.
In the implementation, the time beats are communicated by descriptive text, dots, and a smartwatch vibration; text and dots are colored green for the first beat of a bar and red for the other beats. The currently played notes are communicated by descriptive text indicating the MIDI note number and the duration.
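For concreteness, the two WebSocket channels carry very small JSON payloads; the snippet below sketches their shape as produced by the webpage code shown later (the note value and duration are illustrative, not taken from a real score):
// /time channel: one message per metronome beat.
timeWebSocket.send(JSON.stringify({ isFirstBeat: true }));
// /notes channel: the rendered note description, sent as HTML.
notesWebSocket.send(
  JSON.stringify({ data: '<p style="text-align: center;">Note 60 (4)</p>' })
);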
A portion of the plugin code is shown below to highlight its behavior. When the plugin runs, it saves the current score as a MusicXML file named 'new-exported.musicxml' in the folder served to the AlphaTab instance, and then opens a new web browser page at the AlphaTab instance's URL, passing the file name through the filename parameter.
function openGenerator(filePath, filename) {
    var newFilePath = filePath + "/src/webpage/" + filename;
    // Export the current score as MusicXML into the webpage folder.
    if (!writeScore(curScore, newFilePath, "musicxml")) {
        alert.text = "Cannot export the current score, try again.";
        alert.open();
        return;
    }
    // Open the AlphaTab webpage, telling it which file to load.
    Qt.openUrlExternally("http://localhost:8000?filename=" + filename);
    Qt.quit();
}

onRun: {
    // Entry point: invoked when the user launches the plugin.
    var filename = "new-exported.musicxml";
    openGenerator(filePath, filename);
}
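For context, a MuseScore 3 plugin lives inside QML scaffolding that is not shown in the excerpt above. The sketch below is a minimal, illustrative reconstruction (the menuPath and description values are assumptions, and alert stands for the MessageDialog used by openGenerator):
import QtQuick 2.0
import QtQuick.Dialogs 1.2
import MuseScore 3.0

MuseScore {
    menuPath: "Plugins.Notes Sound Generator" // illustrative menu entry
    description: "Exports the current score and opens the AlphaTab webpage"

    // Dialog referenced as `alert` in openGenerator above.
    MessageDialog {
        id: alert
        title: "Notes Sound Generator"
    }

    // openGenerator(...) and the onRun handler shown above go here.
}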
The library is initialized with the master volume set to zero so that the score itself is not audible; playback still advances the cursor and fires player events, only the audio output is muted. The file parameter is taken from the URL parameter of the page when specified; the MuseScore plugin fills this parameter with the name of the file it exported. Two WebSockets are opened to send the beat and note information to the mobile devices.
var timeWebSocket = new WebSocket("ws://localhost:8080/time");
var notesWebSocket = new WebSocket("ws://localhost:8080/notes");
const urlParams = new URLSearchParams(window.location.search);
const urlFileName = urlParams.get("filename");
const settings = {
file: urlFileName ?? "/file.musicxml",
player: {
enablePlayer: true,
enableCursor: true,
enableUserInteraction: true,
soundFont: "/dist/soundfont/sonivox.sf2",
scrollElement: wrapper.querySelector(".at-viewport"),
},
};
let api = new alphaTab.AlphaTabApi(main, settings);
api.masterVolume = 0;
The scoreLoaded event is fired every time a score is loaded. When it fires, the createMetronome function is called, which builds the timeSignaturePauses array used to play the time signature beats.
api.scoreLoaded.on((score) => {
trackList.innerHTML = "";
score.tracks.forEach((track) => {
trackList.appendChild(createTrackItem(track));
});
createMetronome(score);
});
This function creates the timeSignaturePauses array used to play the time signature beats. It iterates over the score's bars and, from the bar's tempoAutomation value, computes the wait time between two consecutive beats (the bar duration divided by the number of beats, which simplifies to 60 divided by the tempo in BPM). It also marks whether each beat is the first beat of its bar.
function createMetronome(score) {
    let tempoAutomation = 0;
    score.masterBars.forEach((bar) => {
        // Pick up tempo changes carried by the bar.
        if (
            bar.tempoAutomation != null &&
            tempoAutomation != bar.tempoAutomation.value
        ) {
            tempoAutomation = bar.tempoAutomation.value;
        }
        if (bar.timeSignatureNumerator == 0) return;
        // Bar duration in seconds: (60 / BPM) * beats per bar.
        let barDuration = (60 / tempoAutomation) * bar.timeSignatureNumerator;
        // Wait time between two consecutive beats.
        let beatsWaitTime = barDuration / bar.timeSignatureNumerator;
        for (let index = 1; index <= bar.timeSignatureNumerator; index++) {
            timeSignaturePauses.push({
                waitTime: beatsWaitTime,
                isFirstBeat: index == 1,
            });
        }
    });
}
This handler runs every time the user clicks the play or pause button. On play, a new metronomeWorker Web Worker is started; it is responsible for signalling each beat to be played and then waiting for the pause time indicated in the timeSignaturePauses array. The array is passed to the Web Worker via the metronomeWorker.postMessage function, and each beat fires through the metronomeWorker.onmessage callback; when a beat fires, a message is sent over the WebSocket to the mobile devices. On pause, the previously started metronomeWorker Web Worker is terminated.
playPause.onclick = (e) => {
    if (e.target.classList.contains("disabled")) {
        return;
    }
    if (e.target.classList.contains("fa-play")) {
        // Snap playback to the start of the current bar so the metronome
        // and the player begin from the same position.
        let currentBarIndex = getCurrentBarIndex(api.tickPosition);
        api.tickPosition = api.score.masterBars[currentBarIndex].start;
        metronomeWorker = new Worker("/js/metronomeWorker.js");
        beatLogger.innerHTML = "";
        // Hand the precomputed pauses to the worker, which paces the beats.
        metronomeWorker.postMessage({
            startIndex: currentBarIndex,
            pauses: timeSignaturePauses,
        });
        metronomeWorker.onmessage = function (message) {
            if (timeWebSocket.readyState != 1) return;
            // First beat of a bar: reset the log and show green;
            // other beats are appended in red.
            if (message.data.isFirstBeat) {
                beatLogger.innerHTML = '<p style="color: green;">BEAT</p>';
                highlightBeat("green");
            } else {
                beatLogger.innerHTML += '<p style="color: red;">BEAT</p>';
                highlightBeat("red");
            }
            // Forward the beat to the connected mobile devices.
            timeWebSocket.send(
                JSON.stringify({ isFirstBeat: message.data.isFirstBeat })
            );
            beatLogger.scrollTo(0, beatLogger.scrollHeight);
        };
        api.playPause();
    } else if (e.target.classList.contains("fa-pause")) {
        api.playPause();
        noteLogger.innerHTML = "";
        beatLogger.innerHTML = "";
        metronomeWorker.terminate();
    }
};
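The handler above relies on two helpers that are not part of this excerpt. The sketch below is a plausible reconstruction, not the project's actual code: getCurrentBarIndex maps the player's tick position to the index of the master bar containing it, and highlightBeat is assumed to flash a dedicated DOM element in the given color.
// Hypothetical reconstruction of the helpers used by the play handler.
function getCurrentBarIndex(tickPosition) {
    // Walk the bars from the end and return the last one that starts
    // at or before the current tick position.
    const bars = api.score.masterBars;
    for (let index = bars.length - 1; index >= 0; index--) {
        if (bars[index].start <= tickPosition) return index;
    }
    return 0;
}

function highlightBeat(color) {
    // Briefly color an indicator element (assumed to exist in the page)
    // so the beat is also visible on screen.
    const indicator = document.querySelector(".at-beat-indicator");
    if (indicator == null) return;
    indicator.style.backgroundColor = color;
    setTimeout(() => (indicator.style.backgroundColor = ""), 100);
}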
The metronomeWorker Web Worker: every time the worker is launched, it iterates over the timeSignaturePauses array, posting a beat message and then waiting for the waitTime specified by the current element, until the array is entirely consumed or the Web Worker is terminated. The sleep helper deliberately busy-waits; blocking is harmless here, since pacing the beat messages is the worker's only job.
// Busy-wait for `delay` milliseconds, blocking the worker thread.
function sleep(delay) {
    var start = new Date().getTime();
    while (new Date().getTime() < start + delay);
}

self.onmessage = function (message) {
    let timeSignaturePauses = message.data.pauses;
    let startIndex = message.data.startIndex;
    for (let index = startIndex; index < timeSignaturePauses.length; index++) {
        const element = timeSignaturePauses[index];
        self.postMessage(element); // fire the beat on the main thread
        sleep(element.waitTime * 1000); // wait until the next beat is due
    }
};
The activeBeatsChanged event is fired every time a note (or a group of notes) in the score is played. When it fires, a message is sent over the WebSocket to the mobile devices, and the content of the noteLogger DOM element is replaced with a description of the currently played notes; the notes are extracted from the activeBeats variable of the args parameter.
const noteLogger = document.getElementById("note-logger");
api.activeBeatsChanged.on((args) => {
    noteLogger.innerHTML = "";
    for (let index = 0; index < args.activeBeats.length; index++) {
        const duration = args.activeBeats[index].duration;
        // noteValueLookup maps the MIDI note values of the beat to its notes.
        const noteValues = Array.from(
            args.activeBeats[index].noteValueLookup.keys()
        );
        for (let i = 0; i < noteValues.length; i++) {
            noteLogger.innerHTML +=
                '<p style="text-align: center;">Note ' +
                noteValues[i] +
                " (" +
                duration +
                ")</p>";
        }
        noteLogger.scrollTo(0, noteLogger.scrollHeight);
    }
    if (notesWebSocket.readyState != 1) return;
    notesWebSocket.send(JSON.stringify({ data: noteLogger.innerHTML }));
});
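Sending the already rendered HTML keeps the protocol simple: the mobile client does not need to understand notes or durations and can display the payload directly, which is what the Flutter app does later with the Html widget.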
Every time a new client opens a connection to the WebSocket server, the server attaches the connection to the clients collection of the channel identified by the connection path.
public function onOpen(ConnectionInterface $conn)
{
$request = $conn->httpRequest;
if ($request->getUri()->getPath() === '/time') {
$this->timeClients->attach($conn);
echo "New connection to time channel: {$conn->resourceId}\n";
}
if ($request->getUri()->getPath() === '/notes') {
$this->notesClients->attach($conn);
echo "New connection to notes channel: {$conn->resourceId}\n";
}
}
Every time a client sends a message to the WebSocket server, the server broadcasts the message to the other clients connected to the same channel, again identified by the connection path.
public function onMessage(ConnectionInterface $from, $msg)
{
$request = $from->httpRequest;
if ($request->getUri()->getPath() === '/time') {
foreach ($this->timeClients as $client) {
if ($from !== $client) {
$client->send($msg);
}
}
}
if ($request->getUri()->getPath() === '/notes') {
foreach ($this->notesClients as $client) {
if ($from !== $client) {
$client->send($msg);
}
}
}
}
On app startup, the WebSocket connections are created.
final timeChannel = IOWebSocketChannel.connect('ws://YOUR_LOCAL_MACHINE_IP:8080/time');
final notesChannel = IOWebSocketChannel.connect('ws://YOUR_LOCAL_MACHINE_IP:8080/notes');
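Note that YOUR_LOCAL_MACHINE_IP is a placeholder to be replaced during deployment: a physical mobile device cannot reach localhost on the development machine, so the LAN IP address of the machine running the WebSocket server must be used (see the deployment steps below).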
In the build method of the main widget, a StreamBuilder is created for the timeChannel connection; the StreamBuilder widget executes the builder function whenever a new message is received on the WebSocket. It vibrates the mobile device and displays the beats on screen in a ListView built from the beats List, which is emptied whenever the first beat of a bar is received.
StreamBuilder(
stream: timeChannel.stream,
builder: (context, snapshot) {
if (!snapshot.hasData) {
return const Center(child: CircularProgressIndicator());
}
final message = jsonDecode(snapshot.data);
HapticFeedback.heavyImpact();
if (message["isFirstBeat"]) beats = [];
beats.add(message["isFirstBeat"]);
return Expanded(
flex: 1,
child: Align(
alignment: Alignment.center,
child: ListView.builder(
scrollDirection: Axis.horizontal,
itemBuilder: (context, index) {
return Center(
child: Icon(
Icons.circle,
size: 30,
color: beats[index] ? Colors.green : Colors.red,
),
);
},
itemCount: beats.length,
),
),
);
},
)
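Here beats is assumed to be a List field (e.g. List<bool>) declared on the enclosing State object; no setState call is needed because the StreamBuilder itself rebuilds on every message that arrives on the stream.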
In the build method of the main widget, a second StreamBuilder is created for the notesChannel connection; again, the builder function runs whenever a new message is received on the WebSocket. It displays on screen the notes contained in the WebSocket message.
StreamBuilder(
stream: notesChannel.stream,
builder: (context, snapshot) {
if (!snapshot.hasData) {
return const Center(child: CircularProgressIndicator());
}
final message = jsonDecode(snapshot.data);
return Expanded(
flex: 5,
child: Html(data: message["data"]),
);
},
)
This tutorial shows how to deploy and run the project locally. It requires:
- MuseScore 3
- Docker and Docker Compose (Application containers engine)
- Flutter Version Management
- Android Studio for building the Android App
- Xcode for building the iOS App
- The ports 8000 and 8080 free on your local machine
Clone the repository into the Plugins folder of MuseScore 3 (note the quotes, since the path contains a space):
$ cd ~/"Documents/Musescore 3/Plugins"
$ git clone https://github.com/IvanBuccella/notes-sound-generator
- Build the Docker environment:
$ cd notes-sound-generator
$ docker-compose build
- Replace the WebSocket URL with your local machine IP address in the file notes-sound-generator/src/mobile_app/lib/main.dart.
- Set the required Flutter version and install the dependencies:
$ cd notes-sound-generator/src/mobile_app
$ fvm install 3.7.2
$ fvm use 3.7.2
$ fvm flutter pub get
- Build the Android App with Flutter:
$ cd notes-sound-generator/src/mobile_app
$ fvm flutter build apk
The output is located at the path notes-sound-generator/src/mobile_app/build/app/outputs/flutter-apk/app-release.apk.
- Build the iOS App with Flutter:
$ cd notes-sound-generator/src/mobile_app
$ fvm flutter build ipa
$ chmod +x run.sh
$ ./run.sh
- And then, launch the App on your mobile device
To execute the plugin, perform the following steps in order:
- Run the docker container:
$ cd notes-sound-generator
$ docker-compose up -d
- Launch the MuseScore program
- And then, launch the App on your mobile device
This project welcomes contributions and suggestions. If you use this code, please cite this repository.
Credit to CoderLine: alphaTab is a cross-platform music notation and guitar tablature rendering library. You can use alphaTab within your own website or application to load and display music sheets from data sources like Guitar Pro or the built-in markup language named alphaTex.