-
The Miditzer, a pipe organ simulator (www.miditzer.org), has included FluidSynth as its default soft synthesizer for over 15 years. During that time FluidSynth has undergone major enhancements; the Miditzer, not so much. We are now in the middle of a major overhaul of the Miditzer, trying to bring it up to date, or at least not too far behind the times.

The Miditzer is essentially an elaborate MIDI router with controls that let the performer adjust the routing in real time. Up to this point, the Miditzer has collected real-time MIDI input, mouse input, computer keyboard input, and FluidSynth MIDI player input and merged them. It then performs the routing and emits MIDI output and/or sends FluidSynth MIDI channel messages. Fifteen years ago, when computers were almost all single core and parallelism was minimal to non-existent, this worked fine. As we update the Miditzer to a modern 64-bit program (Windows only for the time being), my ears tell me that timing is no longer being maintained with sufficient accuracy.

As I study the FluidSynth API, my understanding is that we should be using FluidSynth MIDI events, which I understand to be timestamped, to keep the relative timing of MIDI events more consistent. Am I on the right track so far?

The first question is: does this mean that we should be sending Miditzer output to the FluidSynth MIDI sequencer? Would the high-level data flow for the MIDI player be:

The next question is: can we use FluidSynth to receive real-time MIDI input and provide timestamped MIDI events, so that the MIDI player and real-time MIDI input will have consistent time stamps?

The final question for now is whether there is a way of applying a synchronous time stamp to mouse input and computer key input, to generate timestamped MIDI events to send to the FluidSynth MIDI sequencer?

Thanks in advance for any help you can provide to make the Miditzer developers better FluidSynth users.

Jim Henry
Replies: 2 comments 2 replies
-
Ok, so you would use `fluid_player_set_playback_callback()` to intercept MIDI events coming from the player. And optionally, you could use `new_fluid_midi_driver()` to intercept MIDI events collected by fluidsynth coming from real-time devices. Miditzer would then receive a bunch of `fluid_midi_event_t`. Note that those are real-time events, i.e. they don't have a timestamp, because they are valid for the time you're receiving the callback, and you should also …

Those "merged real-time events" could be of type … While you do all this event processing, you would have another thread continuously rendering audio from the synth. This could be one of fluidsynth's audio drivers, or Miditzer itself by calling …
-
If you intend to do intensive sequencing and flexible MIDI event routing without any worries about timing accuracy, with MIDI events time-stamped at 1 ms resolution, perhaps you might consider using the excellent MidiShare library (http://midishare.sourceforge.net/).