Installing the Bol Processor BP3 does not require any programming skills. Just use the installers for MacOS and Windows, or the installation scripts for Linux.
To use the Bol Processor BP3, you first need to install a local Apache HTML/PHP server on your desktop computer. This server runs a dedicated "web service" that is restricted to your computer. Only PHP (with its GD Graphics option) needs to be running, as no database is used by the Bol Processor interface.
On MacOS and Windows we recommend MAMP or XAMPP, both of which are Apache servers with pre-installed features. On Linux, XAMPP is the only choice. The procedure is given on the pages describing the installation of BP3 on the different systems (see below).
Once you've installed MAMP or XAMPP, installing Bol Processor is almost a one-click process.
MacOS users can quickly do the installation using an installer called BolProcessorInstaller.pkg. Follow instructions on this page.
Windows users can quickly do the installation using an installer called BolProcessorInstaller.exe. Follow instructions on this page.
Linux users can quickly do the installation using dedicated scripts. Follow instructions on this page.
👉 Once you've installed the Bol Processor BP3, visit this page to familiarise yourself with how to use it.
The file structure of your installation
👉 Only for geeks!
Let us assume that your installation was successful. It created a "htdocs/bolprocessor" folder.
The file structure inside this folder is shown on the left. There is nothing related to Bol Processor outside of this folder.
This image includes "bp", which is the compiled version of the BP3 console for MacOS. The same file is called "bp.exe" on Windows and "bp3" on Linux.
The "temp_bolprocessor" and "my_output" folders are automatically created when the interface is run. The "temp_bolprocessor" folder is cleared of all files/folders older than 24 hours.
Another folder called "midi_resources" is also created to store the settings for the real-time MIDI input and output ports.
Two additional folders, "csound_resources" and "tonality_resources", are created by the installation and filled with data shared by all projects.
Running the interface will also create "BP2_help.html" in the "php" folder using "BP2_help.txt" as its source.
The "ctests" folder — which we call a workspace — contains sample material used to check the operation of Bol Processor and to illustrate some musicological issues. It is updated by the installation scripts each time you upgrade to a new version.
If you create new material in the "ctests" workspace it won't be deleted by upgrades. However, if you modify files that come from the distribution, they will revert to the current distribution version on each upgrade. It is therefore a good idea to keep a copy of the "ctests" folder, as you are likely to modify some of its data files while using the program. You may want to restore the original versions later. You can also create your own workspaces (in tree structures) using either the BP3 interface or your computer's file manager.
The core of the Bol Processor, in all its versions, is an inference engine capable of generating 'items' — strings of variables and terminal symbols — treated like the score of a musical work. The inference engine does this through the use of rules from a formal grammar.
In its initial versions (BP1 and BP2), the inference engine was also able to analyse a score — for example, a sequence of drum beats — to check its validity against the current grammar. This feature is not (yet) implemented in BP3.
A brief presentation of grammars
The grammars used by the Bol Processor are similar to those described in formal language theory, with a comprehensive set of features:
Rules can be context-sensitive, including with remote contexts on the left and the right.
Rules can contain patterns of exact or pseudo repetitions of fragments. Pseudo repetitions make use of transformations (homomorphisms) on the terminal symbols.
A terminal symbol represents a time object which can be instantiated as a simple note or a sound object, i.e. a sequence of simple actions (MIDI messages or Csound score lines).
The grammars are layered — we call them 'transformational'. The inference engine first does everything it can with the first grammar, then jumps to the second, and so on.
The “produce all items” procedure
Grammars can produce infinite strings of symbols if they contain recursive rules. This is of no practical use in the Bol Processor, as it will eventually lead to a memory overflow. When recursive rules are used, control is exercised by dynamically decreasing rule weights or using 'flags' to invalidate recursivity.
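For illustration, here is a hypothetical recursive rule using the weight syntax shown later in "-gr.tryAllItems1": the weight starts at 100 and is decreased by 20 on each application, so the rule can fire at most five times before it is neutralised.
gram#1[1] <100-20> S --> S a
gram#1[2] S --> a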
This means that the machine only generates finite languages within its technical limitations. Theoretically, it should be able to enumerate all productions. This is the aim of the "produce all items" procedure. In addition, identical items are not repeated; to this effect, each new item is compared with the preceding ones.
For geeks: This is done by storing productions in a text file, which is scanned for repetitions. The efficiency of this method depends on the technology of the working disk. An SSD is highly recommended!
A simple example
Let us start with a very simple grammar "-gr.tryAllItems0" which is made up of two layers of subgrammars:
-se.tryAllItems0
-al.abc
RND
gram#1[1] S --> X X X
gram#1[2] S --> X X
-----
RND
gram#2[1] X --> a
gram#2[2] X --> b
The RND instruction indicates that the rules in the grammar will be selected randomly until no rule applies. The first subgrammar produces either "X X X" or "X X", then the machine jumps to the second subgrammar to replace each 'X' with either 'a' or 'b'.
In the " Produce all items" mode, rules are called in sequence, and their derivations are performed by picking up the leftmost occurrence of the left argument in the work string.
In the settings of " tryAllItems0 " (see picture), "Produce all items" is checked. A parameter " Max items produced" can be used to limit the number of productions.
The output is set to "BP data file" for this demo, although real-time MIDI, MIDI files and Csound score are possible because 'a' and 'b' are defined as sound-objects. However, the sound output is completely irrelevant with this simple grammar.
Any production that still contains a variable is discarded. This never happens with the "tryAllItems0" grammar.
The production of this grammar is:
a a a
a a b
a b a
a b b
b a a
b a b
b b a
b b b
a a
a b
b a
b b
All the steps are shown on the self-explanatory trace:
S X X X a X X a a X a a a a a b a X a a a a a b a a b X a b a a b b a X b a a b a b b X a X a a X a a a a a b X a a a a a b a a b a X b a a b a b X a b a a b b a b X X a a X a a a a a b a X a a a a a b a a b X a b a a b b a X b a a b a b b a b X X b a X b a a b a b b X a b a a b b a b b X b b a b b b b X b b a b b b b X b X a b X a b a a b b X b a a b a b b a b b X b b a b b b X b b a b b b b b X X b a X b a a b a b b X a b a a b b a b b X b b a b b b b X b b a b b b b b X X a X a a a b X a a a b a b X b a b b X b a b b b
A pattern grammar
Let us modify "-gr.tryAllItems0" as follows:
-se.tryAllItems0
-al.abc
RND
gram#1[1] S --> (= X) X (: X)
gram#1[2] S --> X X
-----
RND
gram#2[1] X --> a
gram#2[2] X --> b
The first rule gram#1[1] contains a pattern of exact repetition: the third 'X' should remain identical to the first one. Keeping the pattern brackets, the production would be:
(= a) a (: a)
(= a) b (: a)
(= b) a (: b)
(= b) b (: b)
a a
a b
b a
b b
This output shows that the third terminal symbol is a copy of the first. These items can be played on MIDI or Csound, as the machine will remove structural markers. However, structural markers can also be deleted from the display by placing a "_destru" instruction under the "RND" of the second subgrammar. This yields:
a a a
a b a
b a b
b b b
a a
a b
b a
b b
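For reference, the modified second subgrammar might look like this (a sketch based on the description above; the exact placement of the instruction may differ in the distributed examples):
RND
_destru
gram#2[1] X --> a
gram#2[2] X --> b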
To become more familiar with patterns (including embedded forms), try "-gr.tryDESTRU" in the "ctests" folder.
A more complex example
Consider the following grammar "-gr.tryAllItems1" in the "ctests" folder:
RND
gram#1[1] S --> X Y /Flag = 2/ /Choice = 1/
-----
RND
gram#2[1] /Choice = 1/ /Flag - 1/ X --> C1 X
gram#2[2] /Choice = 2/ X --> C2 X _repeat(1)
gram#2[3] <100-50> /Choice = 3/ X --> C3 X
gram#2[4] X --> C4 _goto(3,1)
gram#2[5] Y --> D3
gram#2[6] X --> T
-----
RND
gram#3[1] T --> C5 _failed(3,2)
gram#3[2] Y --> D6
This grammar uses a flag 'Choice' to select which of the rules 1, 2 or 3 will be used in subgrammar #2. Just change its value to try a different option, as they produce the same 'language'. Terminals are simple notes in the English convention: C1, C2, etc.
The flag 'Flag' is set to 2 by the first rule. If 'Choice' is equal to 1, rule gram#2[1] is applied, and it can only be applied twice due to the decrementation. This ensures that the language will be finite.
Rule gram#2[4] contains a "_goto(3,1)" instruction. Whenever it is fired, the inference engine will leave subgrammar #2 and jump to rule #1 of subgrammar #3. If the rule is a candidate, it will be used and the engine will continue to look for candidate rules in subgrammar #3. If the gram#3[1] rule is not applicable, the engine will jump to rule #2 of subgrammar #3, as instructed by "_failed(3,2)". In fact, these _goto() and _failed() instructions have no effect on the final production, but they do modify the trace.
If 'Choice' is equal to 2, the "_repeat(1)" instruction will force the gram#2[2] rule to be applied twice. If 'Choice' is equal to 3, the rule gram#2[3] will be applied twice because it has an initial weight of 100 which is reduced by 50 after each application. When it reaches zero, the rule is neutralised.
Capturing MIDI input opens the way to "learning" from the performance of a musician or another MIDI device. The first step is to use the captured incoming NoteOn/NoteOff events, and optionally ControlChange and PitchBend events, to build a polymetric structure that reproduces the stream.
The difficulty of this task lies in the design of the most significant polymetric structure — for which AI tools may prove helpful in the future. Proper time quantization is also needed to avoid overly complicated results.
We've made it possible to capture MIDI events while other events are playing. For example, the output stream of events can provide a framework for the timing of the composite performance. Consider, for instance, the tempo set by a bass player in a jazz improvisation.
The _capture() command
A single command is used to enable/disable a capture: _capture(x), where x (in the range 1…127) is an identifier of the 'source'. This parameter will be used later to handle different parts of the stream in different ways.
_capture(0) is the default setting: input events are not recorded.
The captured events, together with the events performed while they were being captured, are stored in a 'capture' file in the temp_bolprocessor folder. This file will later be processed by the interface.
Examples can be found in the project "-da.tryCapture".
The first step in using _capture() is to set up the MIDI input, and more specifically its filter. It should at least treat NoteOn and NoteOff events. ControlChange and PitchBend messages can also be captured.
If the pass option is set (see picture), incoming events will also be heard on the output MIDI device. This is useful if the input device is a silent device.
It is possible to create several inputs connected to several sources of MIDI events, each one with its own filter settings. Read the Real-time MIDI page for more explanations.
Another important detail is the quantization setting. If we want to construct polymetric structures, it may be important to round dates to the nearest multiple of a fixed duration, typically 100 milliseconds. This can be set in the settings file "-se.tryCapture".
Simple example
Let us take a look at a very simple example of a capture on top of a performance.
C4 _D4 _capture(104) E4 F4 G4 _capture(0) A4 B4
The machine will play the sequence of notes C4 D4 E4 F4 G4 A4 B4. It will listen to the input while playing E4 F4 G4. It will record both the sequence E4 F4 G4 and the notes received from a source tagged "104".
Suppose that the sequence G3 F3 D3 was played on top of E4 F4 G4. The capture file might look like this:
Note that all dates are approximated to multiples of 100 milliseconds. For example, the NoteOff of input note G3 falls exactly on the date 3000 ms, which is the NoteOff of the played note E4.
The recording of input and played notes starts at note E4 and ends at note G4, as specified by _capture(104) and _capture(0).
An acceptable approximation of this sequence would be the polymetric expression:
C4 D4 {E4 F4 G4, - - G3 - F3 - D3 - -} A4 B4
Approximations will be created from the capture files at a later stage.
Combining 'wait' instructions
Try:
_script(wait for C3 channel 1) C4 D4 _capture(104) E4 F4 G4 _script(wait for D3 channel 1) _capture(0) A4 B4
The recording takes place during the execution of E4 F4 G4 and during the unlimited waiting time for note D3. This allows events to be recorded even when no events are being played.
The C3 and D3 notes have been used for ease of access on a simple keyboard. The dates in the capture file are not incremented by the wait times.
The following is a setup for recording an unlimited sequence of events while no event is being played. Note C0 will not be heard as it has a velocity of zero. Recording ends when the STOP or PANIC button is clicked.
_capture(65) _vel(0) C0 _script(wait forever) C0
Interpreting the recorded input as a polymetric structure will be made more complex by the fact that no rhythmic reference has been provided.
Microtonal corrections
In the following example, both input and output receive microtonal corrections from the selected microtonal scale.
Below is a capture file obtained by entering G3 F3 D3 over the sequence E4 F4 G4 A4.
The output events (source 0) are played on MIDI channel 2, and the input events (source 104) on MIDI channel 1. More channels will be used if output notes have an overlap — see the page MIDI microtonality. In this way, pitchbend commands and the notes they address are distributed across different channels.
Added pitchbend
In the following example, a pitchbend correction of +100 cents is applied to the entire piece. It does modify output events, but it has no effect on input events.
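The item driving the output might look something like this (a sketch only: the scale name is hypothetical and the exact data of the "-da.tryCapture" example is not reproduced here):
_pitchbend(100) _scale(just intonation,A4) _capture(104) E4 F4 G4 A4 _capture(0)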
Again, after playing G3 F3 D3 over the sequence E4 F4 G4 A4:
Pitchbend corrections applied to the input (source 104) are only those induced by the microtonal scale. Pitchbend corrections applied to the output (source 0) are the combination of microtonal adjustments (see previous example) and the +100 cents of the pitchbend command.
Capturing and recording more events
The _capture() command allows you to capture most types of MIDI events: all 3-byte types, and the 2-byte type Channel pressure (also called Aftertouch).
Below is a (completely unmusical) example of capturing different messages.
The capture will take place in a project called "-da.tryReceive":
The recording will have a "111" marker to indicate which events have been received. Only two D4 notes are played during the recording; the second one is raised by 50 cents and has a channel pressure of 35 and a modulation of 42.
After playing back the two D4s, the machine will wait until the STOP button is clicked. This gives the other machine time to send its own data and have it recorded.
The second machine is another instance of BP3 (actually another tab of the interface in the browser) running the "-da.trySend" project:
This project will start by sending "{_vel(0) <<C0>>}", which is the note C0 with velocity 0 and null duration (an out-time object). This will trigger "-da.tryReceive", which was waiting for C0. The curly brackets {} restrict velocity 0 to the note C0. Outside of this expression, the velocities are set to their default value (64). In this data, channel pressure, modulation and a pitchbend correction of +100 cents are applied to the final note E3.
The resulting sound is terrible, you've been warned:
However, the 'capture' file shows that all events have been correctly recorded:
Captured events (from "-da.trySend") are coloured red. They have been automatically assigned to MIDI channel 2, so that corrections will not be mixed between performance and reception.
The pitchbend corrections are shown in the "cents correction" column, each applied to its own channel.
The channel pressure corrections (coloured blue) display the expected values. The modulation corrections (in the range 0 to 16383) are divided into two 3-byte messages, the first carrying the MSB and the second the LSB.
There is a time mismatch of approximately 150 milliseconds between the expected and actual dates, but the durations of the notes are accurate. The mismatch is caused by the delay in the transmission of events over the virtual port. The data looks better if the quantization in the "-da.tryReceive" project is set to 100 ms instead of 10 ms. However, this is of minor importance, as a "normalisation" will take place during the (forthcoming) analysis of the "capture" file.
For geeks: The last column indicates where events have been recorded in the procedure sendMIDIEvent(), file MIDIdriver.c.
Capture events without the need to perform
The setup of the "-da.tryReceive" project should be, for example:
_capture(99) _vel(0) C4 _script(wait forever)
Note C4 is not heard due to its velocity 0. It is followed by all MIDI events received on the input until the STOP button is clicked. This makes it possible to record an entire performance. The procedure can be checked with items produced and performed by the Bol Processor.
Note that the inclusion of pitchbend messages makes it possible to record music played on microtonal scales and (hopefully) identify the closest tunings suitable for reproduction of the piece of music.
For example, try to capture this phrase from Oscar Peterson's Watch What Happens:
Only significant columns are displayed. The origin of dates is set to the first NoteOn or NoteOff received.
The next task on our agenda will be to analyse the 'capture' file and reconstruct the original polymetric expression (shown above) or an equivalent version. Then we can consider moving on to grammars, similar to what we've done with imported MusicXML scores (read page).
From the outset, I have viewed the Bol Processor project as a research endeavor rather than a straightforward software design project. Its focus is on "computational musicology," which involves using computational models to explore musicology across and beyond cultural boundaries.
This project represents a long-term commitment to developing music creation software that addresses musicological challenges, rather than simply creating tools for recombining pre-composed music or sound fragments. Our current goal is to integrate this approach with high-quality music and sound editors used by composers and sound designers.
The Bol Processor model could easily be extended to other activities such as scheduling, sizing and precise timing of video clips, robot commands and so on. These extensions are highly desirable.
The research nature of this project necessitates an open-access ethic. As such, the Bol Processor, including its installation and source files, will remain freely accessible. Software developers are encouraged to reuse the source code to create variants or updated versions of BP3.
However, there is a practical reason why I have chosen not to decline personal donations: I am increasingly incurring costs for various tools required for software development. These include hosting fees, the remuneration of "virtual" assistants (AI tools), and, when necessary, human experts.
For this reason, if you are so inclined, you can make a donation using the PayPal link below. Any funds not used for project-related activities will be transferred to charities.
This page is a demo of the handling of microtonality in the real-time MIDI and MIDI file environments of the Bol Processor BP3 (version 3.0.7 and higher). Install BP3 by following the instructions for MacOS, Linux and Windows on the page Bol Processor 'BP3' and its PHP interface.
All examples here are from the "-da.tryMPE" project, which is part of the ctests folder (download here). The syntactic model for microtonality is explained here. For details on working with real-time MIDI, read the Real-time MIDI page. Some Csound scores are shown for the sake of clarity, as the handling of microtonality in the Csound environment of BP3 produces the same results as MIDI.
👉 The following is a comprehensive and detailed presentation of all aspects of the use of microtonality in BP3. It is not necessary to understand all the details when starting with microtonality! The explanation is only intended to assist musicians who wish to create new material by combining several tuning schemes in the same musical work. To try the microtonal process on real musical works, listen for instance to the comparison of temperaments, or play François Couperin's Les Ombres Errantes (in the ctests/Imported_MusicXML folder) on a MIDI instrument using its optimal tuning scheme rameau_en_sib:
For geeks: Microtonality in real-time MIDI and MIDI files mimics the MIDI Polyphonic Expression (MPE) method of modifying pitchbend values on notes distributed on separate channels (up to 15 simultaneous notes). However, it works on devices that are not MPE-compliant.
Check pitchbender sensitivity
Make sure that your output MIDI device is sensitive to pitchbend messages. Try the following:
You should hear C4 F4 D4 Bb3 C4 instead of C4 C4 C4 C4 C4. This shows that the MIDI device accepts pitchbend messages and that its range is ± 200 cents, or ± 2 semitones. This is the range we use for microtonality.
For geeks: The actual values are in the range 0 - 16383, but thanks to the "_pitchrange(200)" instruction, the actual cent values can be used.
When using microtonal scales, this pitch range of ± 200 cents is set automatically by sending an appropriate message to the 16 MIDI channels.
The "_pitchbend()" commands will be taken care of, and their values will be added to the pitchbend commands that adjust the pitches to the microtonal scale. If this combination exceeds the range of ± 200 cents, an error message will be displayed.
MIDI channels
In the previous example, MIDI events (notes and pitchbender commands) were sent on channel 2. This is to ensure that your MIDI output device is receiving and mixing all channels, technically MIDI mode 4 (omni off, mono).
It was possible to send messages on channel 2 because the Microtonality mode was not set. This mode is switched on as soon as a "_scale()" command is found. In this case, the "_chan()" commands are ignored, as all channel assignments are made by the microtonality process.
Diapason tuning
Since note frequencies are displayed when the Trace microtonality mode is activated in "-se.tryMPE", the tuning of the diapason (note A4/la 3 on a conventional keyboard) is important.
By default (in Bol Processor settings and on MIDI devices) this setting is 440 Hz. If you change the value in the settings, the note frequencies will change accordingly. The BP3 will send a message to the MIDI device to tune the diapason, but many devices do not understand this command. (This is the case with PianoTeq Stage.) In this case, tune the device independently.
Microtonal scales
On top of project "-da.tryMPE" you can see the line:
-to.tryMPE
This refers to a tonality resource stored in the "tonality_resources" folder. This resource was downloaded to your computer when you ran an installer (or a Linux script), as explained on the pages Quick install MacOS, Quick install Windows and Quick install Linux.
At the bottom of the project page there is a button called EDIT '-to.tryMPE'. This will take you to this resource:
Here are the scales stored in "-to.tryMPE":
Most of these are "exotic" in the sense that they won't produce interesting music. They have been designed to highlight technical features:
The grama scale is an interpretation of the Indian system that divides the octave into "twenty-two shrutis", see The two-vina experiment for details. We use one particular (probably incorrect) solution, which sets the pramana shruti at 21 cents. Technically speaking — the reason for this choice — this scale has 23 grades which count as 22 notes. Click the EDIT button to see its structure.
The just intonation scale is a standard scale with 12 grades and 12 notes, probably suitable for use in some harmonic contexts. Click the EDIT button to display its structure (see picture).
The zest24-supergoya17plus3_Db scale was created by importing its SCALA definition (from this archive). It covers a conventional octave (ratio 2/1) with 20 grades, but the SCALA file did not contain any note names. So, 12 notes were chosen at random, with key numbers as their names.
The first thing we notice in the piano scale is that the frequency of C4 (key #60) is 261.630 Hz, the base frequency of the block key. One octave higher, the frequency of C5 is 525.260 Hz. This gives an octave ratio of 2.0034, i.e. a stretching of about 3 cents, close to the 4 cents of the scale definition; the difference is due to rounding.
When the frequencies are displayed using the just intonation scale, the octave ratio is exactly 2/1. Listen to the scale with decreasing velocities:
On the same scale, listen to a series of fifths C4/G4, D4/A4, E4/B4, F4/C5, showing that they are perfect except the wolf fifth D4/A4:
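A possible way to write this series of fifths, assuming the chord and scale syntax used elsewhere on this page, would be:
_scale(just intonation,0) {C4,G4} {D4,A4} {E4,B4} {F4,C5}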
Because of the rounding (3 cents instead of 4) in the piano scale, we should check that rounding errors do not accumulate over octaves. The following is also an exercise for those whose ear is trained in piano tuning. We'll play identical notes in two scales: the piano scale with its extended octave, then the standard equal temperament scale with an octave of 2/1:
The instruction "_scale(0,0)" sets the scale to the standard equal temperament scale. The layout of the notes in this polymetric structure is as follows:
There is a short delay (1/16 beat) on the notes of the second line to emphasise the beats, if there are any. You can hear beats in the first part, as scales are different, but perfect unison in the second part. This is how it sounds on a PianoTeq Stage physical modelling synthesiser:
This result should not be taken as a radical statement about how to tune a piano! Pianoteq synthesizers already reproduce the octave stretching that piano tuners tend to do to compensate for the inharmonicity of the strings. An additional octave stretching of four cents is therefore not worth mentioning.
The trace only shows notes whose frequencies have been corrected:
We note that 5 octaves give a total stretch of 18 cents, or 3.6 cents per octave. The frequency ratio between C4 and C7, three octaves higher, is 2106.381/261.630 = 8.0509, whose cube root is 2.0042, again very close to the ratio of 2.0046 in the piano scale definition. Unsurprisingly, the Csound score reveals exactly the same numbers.
Effect of the block key
Let us superimpose two phrases of the same notes in the same scale, but with different block keys:
First, key #72 (C5) of scale #4 has a 0 cent correction because the block key of this scale is C4 and it has no octave stretching. The same is true for key #69 (A4) of scale #3, whose block key is A4.
Let us use the meantone_try and meantone_try2 scales to play the same phrase. We call these scales "exotic" because the names of their notes do not follow the English, Italian/Spanish/French or Indian standards. Here we use the key numbers of the MIDI output device.
The only difference is the key numbers. In meantone_try2, the base key is #64 instead of #60.
The frequency of key#69 is 440 Hz since it is the block key. The actual sequence heard on the MIDI output device is C4 D4 F4 A4 C5. Note that the octave ratio C5/C4 is 527.204/260.875 which is greater than 2 because this scale has an octave stretched by 19 cents.
Note again that the frequency of the base key #73 is 440 Hz. The frequencies are identical, the only change is the key numbers associated with the notes. The actual sequence played on the MIDI output device should again be C4 D4 F4 A4 C5, assuming that key #64 is the middle key of its keyboard.
A very exotic scale
The scale called zest24-supergoya17plus3_Db is more "exotic" than the previous one because it has 20 grades and 12 notes. The original scale downloaded from an archive did not have note names, so we decided to label twelve positions with the key numbers #60 to #71. As you can see in the picture, the intervals are very irregular. The choice of 12 tones is motivated by the desire to be able to map them onto the 12 keys of a standard piano keyboard. We'll see a different case later.
Key #60 is the base key and its frequency is 261.630 Hz as declared in the tonality resource. Note that the key #72, an octave higher, also has "key#60" as its note name. Its frequency of 523.260 Hz is twice 261.630 since octaves are not stretched.
Looking at the picture, we can calculate the frequency of key#65 which has a frequency ratio of 1.478. This gives 1.478 x 261.63 = 386.69 Hz, which is very close to that in the score. Minor errors are due to the rounding of cents to whole numbers.
If you play this score on a conventional MIDI device, you won't hear the correct frequencies unless the device is tuned to a 20-grade equal temperament scale. Conversely, the rendering in Csound is accurate.
When a scale has more than 12 grades, the reference tempered scale must have the same number of grades, regardless of the number of notes (which is indeed smaller). Apart from the musical aspect — which we won't discuss here — this has a technical advantage: the cent corrections, which are deviations from the equal temperament scale, will always be less than 100 cents. This is important because the sensitivity of pitchbenders is set to ± 200 cents.
The Bohlen-Pierce scale
The Bohlen-Pierce scale has a "tritave" interval with a ratio of 3/1 instead of the 2/1 of the octave. The tritave is divided into 13 grades and 13 notes (see the picture of "-to.tryMPE").
The frequency ratio C5/C4 is, as expected, 784.457 / 261.630 ≈ 3; the small deviation is due to the rounding of cents.
Since the scale covers a tritave that would extend from A4 (key #60) to G5 (key #79) on a conventional 12-note MIDI device, each note is mapped to the key that requires the least amount of pitchbend. As a result, pitchbend corrections are never greater than ± 100 cents.
To convert notes played on an external MIDI device to Bohlen-Pierce notes, connect it to the input of BP3, then run the following "tuning daemon" (see below):
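Such a daemon follows the same pattern as the "just intonation" daemon shown further down this page. Here is a sketch in which the scale name "bohlen_pierce" is hypothetical; use whatever name the scale has in "-to.tryMPE":
_script(wait for C0 channel 16) _scale(bohlen_pierce,0) _vel(0) C0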
In the following example, two phrases are played on top of each other, using different microtonal scales.
On the pianoroll (see picture), key#60 is shown as C4. The note key#62, shown as D4, seems to be unique, although two key#62 notes are superimposed with slightly different cent corrections. The same is true with key#72 shown as C5.
The second (key 62) and last (key 72) notes are identical, but because they belong to different scales, their frequencies are not identical. For this purpose, they are played on different MIDI channels. The superimposition creates (nasty) mismatches that reflect the differences in tuning:
Use of _scale(0,0)
So far we have used "_scale(0,0)" to specify the return to a 12-grade equal tempered scale after using a microtonal scale. It can also be used to force microtonal mode in a musical item that does not require specific microtonal scales.
This is a (rather silly) way of creating a sequence of notes using the same note with pitchbend corrections. In fact, we are looking forward to hearing:
C4 C#4 {B3, D4} C#4 C4
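A first attempt, written as a sketch with the "_pitchbend()" syntax described above and C4 as the base note, could be:
C4 _pitchbend(100) C4 {_pitchbend(-100) C4, _pitchbend(200) C4} _pitchbend(100) C4 _pitchbend(0) C4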
The first solution does not work because the chord {B3, D4} consists of two occurrences of the same note C4 with different pitchbend values. It works in Csound, but in MIDI we hear:
Proper notation, without the aid of microtonality, would be, for example:
So we have to send the two C4s of the polymetric expression on separate MIDI channels. But the microtonal calculation does this automatically. So, putting "_scale(0,0)" at the beginning won't change the tuning but it will force the microtonal mode:
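A sketch of this working version, simply prefixing the previous item with "_scale(0,0)":
_scale(0,0) C4 _pitchbend(100) C4 {_pitchbend(-100) C4, _pitchbend(200) C4} _pitchbend(100) C4 _pitchbend(0) C4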
For geeks: It wouldn't be a good idea to set microtonality mode by default for all musical works, because (1) channel assignment takes up processing time, and (2) this would render all "_chan()" commands ineffective. In some MIDI environments, MIDI channels are used to send messages to different instruments.
Combination with pitchbend commands
The following is an example of combining a microtonal phrase with a global pitchbend command of + 100 cents:
The "_chan(4)" command is used here to prove that it is ignored in microtonality mode. The MIDI trace shows that an additional correction has been applied. Therefore, the frequency values are not those played on the output MIDI device. However, the Csound score is explicit:
The numbers 99.988 and 99.988 are the pitchbend corrections (in cents) at the beginning and end of the note declared on each line.
The "grama" Indian scale
We have already discussed the ancient Indian tonal system which divides the octave into "twenty-two shrutis", see The two-vina experiment for details. We'll try this grama scale by setting the pramāṇa śruti to 21 cents.
In short, this tuning scheme is a twelve-degree chromatic scale: Sa, Re komal, Re, Ga komal, Ga, Ma, Ma tivra, Pa, Dha komal, Dha, Ni komal, Ni. These names stand for C, Db, D, Eb, E, F, F#, G, Ab, A, Bb, B in English notation.
Each note of the Indian scale, except Sa (C) and Ma tivra (F#), can occupy two enharmonic positions. This explains why the grama tuning scheme has 23 positions and 22 notes.
In accordance with the syntax of the Bol Processor, notes are referred to as sound objects in lower case. For example, the two enharmonic positions of Re komal are called r1_ and r2_, and the two positions of Re are called r3_ and r4_. A trailing '_' is necessary to indicate octave numbers unambiguously: the note d3_4 is the low position of Dha in the 4th octave, which is admittedly close to A4 in the Western scale.
The pramāṇa śruti is the tonal distance between all pairs of enharmonic positions, for instance between r1_ and r2_. For the sake of simplicity, we've set it to 21 cents (a syntonic comma), which is a common mistake made by Western and Indian musicologists. In reality it is a variable value (see Raga intonation).
Listen to the same scale played against a drone (read the Microtonality page):
We said earlier that, in the grama tuning scheme, d3_4 occupies the position of A4 in the Western scale. Since the block key is #76 (d3_4), the frequency of d3_4 is close to 440 Hz. Consequently, the position of sa_4 (263.907 Hz) is 15 cents higher than the base frequency (261.63 Hz).
This scale can be played on any MIDI device that accepts pitchbend commands. The 22 notes of the scale, covering an octave, are mapped to the 12 keys of a MIDI keyboard. Each note is mapped to the key that requires the least amount of pitchbend. As a result, pitchbend corrections are never greater than ± 100 cents. For example, d3_4 is mapped to key #69, which happens to be that of A4 on a conventional keyboard.
Read the Raga Intonation page to see how this theoretical framework can be adapted for modelling real music.
Microtonality in sound-objects
A sound-object is a sequence of MIDI events and/or Csound score lines — read Sound-object prototypes for details. Therefore the pitches of notes it contains can be modified by microtonal scales. In the "-da.tryMPE" project, try for instance:
_scale(just intonation,0) a f b b
and check frequency corrections in the trace (both MIDI and Csound):
This demonstrates the BP3's ability to act as an interface between MIDI devices, retuning the input in real time to a microtonal scale.
The filter of the input MIDI device should be set to "treat & pass" for all categories of events that will be transmitted (see picture).
We show a temporary solution that works very well, but will be simplified in the future.
If, for example, you want to retune the input to the just intonation scale, run the following "tuning daemon":
_script(wait for C0 channel 16) _scale(just intonation,A4) _vel(0) C0
The "wait for C0 channel 16" command that causes the machine to hang up while it listens for some kind of input. In fact, the note "C0 channel 16" should not be part of the stream of notes you need to retune, otherwise it will stop the process!
The note C0 with velocity 0 is inaudible and will not be played unless the note "C0 channel 16" releases the waiting state. This note is needed for attaching the script instruction.
The "wait forever" command causes the machine to hang until the STOP or PANIC button is pressed. Again, we need a dummy (inaudible) note C0, at the end of which the script instruction is appended.
Because multiple instances of BP3 can be run simultaneously (read Real-time MIDI), you can set up a bank of "tuning daemons" that interact with people and MIDI devices to create interesting variations of tonal structures.
This installation has been checked with Linux Lite 7.0 (an Ubuntu-based distribution) running on an HP Intel Core i5-6200U (64-bit, 8 GB RAM).
Install XAMPP
The installation of BP3 should take place after the installation of the local Apache server XAMPP. Follow instructions here: https://www.apachefriends.org/
XAMPP creates a /opt/lampp/htdocs/ directory that will contain "bolprocessor". The "bolprocessor" folder will contain (language C) source files for the "bp3" console, the "MakeFile" to compile them, and all the data files. It will also contain a "php" folder filled with PHP pages and some related files for running the interface.
If you wish to open XAMPP automatically at startup, read this page. (This can be done later.)
The BP3 distribution zip files are updated regularly. Major updates are announced on the BP developers list and uploaded to the project's Git repositories. Make sure the zip files can be found in your Downloads folder: /home/linuxlite/Downloads
Install the Bol Processor
Safely do the installation using the shell scripts "linux-scripts" downloaded here. Current version: 27 October 2024, size 12010 bytes. Unpack and copy these scripts to the /home folder:
cd /home/linuxlite/Downloads/
unzip linux-scripts.zip -x "__MACOSX/*"
sudo chmod -R 775 linux-scripts
cd linux-scripts
sudo cp -a . /home/
cd /home/
sudo chmod +x *.sh
Scripts "modify_xampp.sh", "prepare.sh", "unpack_bp3.sh", "install_bp3.sh" and "get_ready_bp3.sh" have been copied to the /home folder.
Four more scripts: "update_console.sh", "update_interface.sh", "update_data.sh" and "restart_xampp.sh", have been copied to the /home folder. They will be used later.
The installation procedure is similar to that for MacOS and Windows, but includes a few actions specific to Linux.
Run the scripts in the following order. 👉 You will be asked for your password the first time, as they need to be run in "admin" mode (the "sudo" command).
sudo /home/modify_xampp.sh will adjust XAMPP settings.
sudo /home/prepare.sh will install required resources on your machine, and create two virtual MIDI ports if they do not already exist.
sudo /home/unpack_bp3.sh will unpack the zip files.
sudo /home/install_bp3.sh will create the /opt/lampp/htdocs/bolprocessor/ directory and fill it with the contents of BP3 packages. Files/folders already existing will simply be updated.
sudo /home/get_ready_bp3.sh will set up permissions and owners for using the console. The owner is "daemon" (the same one used by XAMPP) and permissions are "775". 👉 The content of "bolprocessor" is strictly private. No risk setting up permissions!
Avoid running the same script more than once, although this should not create duplicates or unwanted effects.
Once the Bol Processor BP3 is installed and running, you should delete the "zip" files in the Downloads folder. This will allow downloading new versions.
Compile the 'bp3' console
Start the XAMPP Apache server. You can use the terminal command:
sudo /home/restart_xampp.sh
Point your browser at localhost/bolprocessor/php/. This will display the home page of the Bol Processor.
If you see this frame in the image at the top right of the page, your life will be easy! All you have to do is click on the link to compile the console, which will take less than a minute.
If you don't see the link to compile, but instead a message saying that 'gcc' is not responding, you are in great trouble! This suggests a bug in the interface (contact us) or in your installation.
Install Csound
Csound is not required to run the Bol Processor, as you can work with MIDI files and real-time MIDI. However, it will give you access to a different approach to sound synthesis, and it will handle microtonality in its own way.
If you wish to install Csound, simply type the following commands:
sudo apt update
sudo apt install csound
Then verify the installation and check the path to the console:
csound --version
which csound
By default, the Bol Processor sets the path to "/usr/bin", which seems to be standard on Linux. It is given by the "which csound" command.
The BP3 interface will be able to figure out the location of "csound" and fix its path accordingly. If it does not respond, you will be asked to change the path and perhaps the name of the Csound console (see image).
👉 Currently, the Csound orchestra file "0-default.orc" does not work on Linux. We're trying to fix this… In the settings of your projects, enter "BP2test.orc" as replacement, or select other files such as "new-vina.orc".
😀 Enjoy Bol Processor BP3 on Linux!
Restart XAMPP after a crash
If you blow up the memory, for example with a quantization that is too low for the size of a piece played in realtime MIDI, the XAMPP server may freeze: the browser will refuse to display pages.
To restart XAMPP, go to the terminal and run the following script:
sudo /home/restart_xampp.sh
Updating to new versions
👉 If you update the "bp3" console, you should also update the "php" interface, as the two are linked.
(1) Update the "bp3" console: Delete the graphics-for-BP3.zip file if it exists in your Downloads folder. Download https://github.com/bolprocessor/bolprocessor/archive/graphics-for-BP3.zip Run the (superfast!) script: sudo /home/update_console.sh This script deletes the 'bp3' console to force a compilation of the new version.
(2) Update the "php" interface: Delete the php-frontend-master.zip file if it exists in your Downloads folder. Download https://github.com/bolprocessor/php-frontend/archive/master.zip Run the (superfast!) script: sudo /home/update_interface.sh Note that this script will preserve the "_settings.php" file (if it exists), which contains your project settings.
(3) Update the test data: this will only update the contents of the "ctests" folder. If you have created folders and files for your personal data, these will not be affected. However, if you have modified a sample file without changing its name, it will be reverted to its distribution version. Delete the bp3-ctests-main.zip file if it exists in your Downloads folder. Download https://github.com/bolprocessor/bp3-ctests/archive/main.zip Run the (superfast!) script: sudo /home/update_data.sh
👉 Please send your suggestions or modified files to our contact.
Uninstall the Bol Processor
Uninstalling the Bol Processor, and all the data downloaded or created for its use, is very simple: delete the "htdocs/bolprocessor" folder.
A one-click notarized installer of Bol Processor BP3 is available. It is called "BolProcessorInstaller.pkg" and it can be downloaded from here (unique location).
Geeks may prefer an equivalent method using a script included in this package, see below.
This installer (or the script) is used for both initial installation and updates. Each time you run it, it will download the latest versions of the BP3 console source files, the interface PHP files and the sample set contained in the 'ctests' folder. Data, grammars and scripts that you've created will not be deleted. However, if you have modified files in the 'ctests' folder, they will be reverted to the current distribution version.
Install MAMP or XAMPP
If you try to run the installer of Bol Processor, it will first check that a local Apache server (MAMP or XAMPP) has been installed. Both are suitable since the Bol Processor interface contains exclusively HTML, PHP and JavaScript code. No database is required.
Don't try the virtual machine version of XAMPP! It won't work on Macs with M1 chips (and above). Use the native installer.
If you choose the (free) MAMP version, both MAMP and MAMP PRO will be installed, and the interface will occasionally prompt you to "upgrade" to MAMP PRO. But you don't need it for the Bol Processor!
For MAMP, the "htdocs" folder is in "Applications/MAMP". For XAMPP, it is in "Applications/XAMPP/xamppfiles".
If you want Apache to start automatically when you start your computer, this process is easy with MAMP. For XAMPP, you can create a startup script.
➡ You will not be able to run both MAMP/MAMP PRO and XAMPP Apache servers at the same time if they use the same ports. This wouldn't be a good idea anyway…
MAMP PRO
Below are instructions for rich people running MAMP PRO.
Launch MAMP PRO from the Applications folder.
In the MAMP main window, click the Apache Enable button (see image). No need for MySQL.
The image shows the default settings for PHP, which is started with Apache.
In case of trouble, check the settings for ports (see image) and for hosts (general and Apache).
XAMPP
Open the XAMPP folder in the Applications folder and launch manager-osx.app as shown below.
The XAMPP main page will appear. Click on the Manage Servers tab, then select Apache Web Server and click Configure. This is necessary to set the port to any value other than "80" (or other than "8888" if you are also running MAMP). Suggestion: set it to "81". Then click the Start button. If there is no conflict with the ports, Apache will show up as "running":
Once Apache is running, you can click on the Welcome tab and the Go to Application button. This should display a (local) page about XAMPP at http://localhost/dashboard. Both the dashboard and bolprocessor folders will be located in the Applications/XAMPP/xamppfiles/htdocs folder.
Install the Bol Processor
After installing MAMP or XAMPP, you can run the installer "BolProcessorInstaller.pkg" or the "install_bolprocessor.sh" script. Both are equivalent.
Using the installer
Download "BolProcessorInstaller.pkg" from here and double-click it.
This installer has been notarized, which means it contains information that allows Apple to certify its validity.
An equivalent method is to run the "install_bolprocessor.sh" script found in the "macos-scripts" folder downloaded here. This makes it possible to understand each step of the installation and possibly suggest improvements.
After downloading "macos-scripts.zip", open the Terminal and type:
cd Downloads
unzip -qo macos-scripts.zip
cd macos-scripts
sudo ./install_bolprocessor.sh
Installation issues
If the installer (or the script) does not find a "htdocs" folder created by MAMP or XAMPP, it will stop the installation, warning you that one of them should be installed. If both MAMP and XAMPP are installed (a bad idea!), the installer will choose MAMP.
Compile the 'bp' console
Now, assuming that the installation was successful, start MAMP or XAMPP and point your browser to localhost/bolprocessor/php/. This will display the home page of the Bol Processor.
If you see this frame in the image at the top right of the page, your life will be easy! All you have to do is click on the link to compile the console, which will take a minute or two.
If you don't see the link to compile, and instead a mention that 'gcc' is not responsive, things are bad! You are probably using an obsolete version of MacOS. You may need to install the command line developer tools in OS X (explanations) or the Xcode toolkit on your machine.
Install Csound
Csound is not required to run the Bol Processor, as you can work with MIDI files and real-time MIDI. However, it will give you access to a different approach to sound synthesis.
The BP3 interface should be able to figure out the location of "csound" and fix its path accordingly. If it does not respond, you will be asked to change the path and perhaps the name of the Csound console (see image).
To update the Bol Processor console, its PHP interface and examples (the contents of the "ctests" folder), simply rerun "BolProcessorInstaller.pkg". Using the latest version is safe!
The installer will download and install current versions of the software and data. It will delete the compiled "bp" console and prompt you to recompile it (with a single click).
Updating will not modify or delete any data you have created in the "ctests" folder or outside it. However, if you have modified a sample file without changing its name, it will be reverted to its distribution version.
The installer will also preserve the "_settings.php" file (if it exists), which contains your project settings.
Security
You are right to be concerned about security. Can you be sure that you have downloaded the correct version of "BolProcessorInstaller.pkg"? Normally yes, it is safe, because this installer has been notarized.
The size of the "BolProcessorInstaller.pkg" file is exactly 19966 bytes and its MD5 is 806a83f72daf1abd22436555abc2cc8b. You can calculate the MD5 checksum on this page. These numbers will indeed be subject to change with the release of new versions of the installer. Current version: 28 October 2024.
Geeks may want to customise it for their own use. Just download this folder which contains the script files (install_bolprocessor.sh and postinstall) along with instructions on how the installer has been built.
For readers not conversant with Unix shell scripts, the following is a description of the process in human language:
Check that an Apache server MAMP or XAMPP is installed by finding either MAMP/htdocs or xampp/htdocs on the computer (not case-sensitive). If it is not found, exit with the warning that either MAMP or XAMPP should be installed.
Download the latest distribution files from GitHub:
https://github.com/bolprocessor/bolprocessor/archive/graphics-for-BP3.zip
https://github.com/bolprocessor/php-frontend/archive/master.zip
https://github.com/bolprocessor/bp3-ctests/archive/main.zip
Unzip these three files. They create folders with the names:
bolprocessor-graphics-for-BP3
php-frontend-master
bp3-ctests-main
Create a folder named "bolprocessor" (if it does not yet exist) inside the "htdocs" folder of the Apache server
Copy bolprocessor-graphics-for-BP3/source to htdocs/bolprocessor/ (if there is already a "source" folder, delete it first)
Copy bolprocessor-graphics-for-BP3/Makefile to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/BP3_help.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/Credits.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/BP3-To-Do.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/License.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/ReadMe.txt to htdocs/bolprocessor/
Copy bolprocessor/php/_settings.php to bolprocessor/ (if it exists)
Copy php-frontend-master/php to htdocs/bolprocessor/ (if there is already a "php" folder, delete it first)
Copy bolprocessor/_settings.php to bolprocessor/php/ (if it exists)
Create a folder htdocs/bolprocessor/csound_resources if it does not yet exist
Copy the content of php-frontend-master/csound_resources to htdocs/bolprocessor/csound_resources. Files that already exist are replaced with their updated versions.
Create a folder htdocs/bolprocessor/ctests if it does not yet exist
Copy the content of bp3-ctests-main to htdocs/bolprocessor/ctests. Files that already exist are replaced with their updated versions.
Delete the temporary download directory
Set permissions of the bolprocessor folder recursively to "775"
➡ There is no security risk in setting "775" permissions, as the MAMP or XAMPP Apache server will be running on your private computer. The Bol Processor never creates/modifies files outside of its "bolprocessor" folder.
Delete htdocs/bolprocessor/bp if it exists. This ensures that the 'bp' console is recompiled after each update.
Uninstall the Bol Processor
Uninstalling the Bol Processor, and all the data downloaded or created for its use, is very simple: delete the "htdocs/bolprocessor" folder.
A one-click installer of Bol Processor BP3 is available. It is called "BolProcessorInstaller.exe" and it can be downloaded here (unique location).
This installer is used for both initial installation and updates. Each time you run it, it will download the latest versions of the BP3 console, including its source files, the interface PHP files, and the sample set contained in the 'ctests' folder. Data, grammars and scripts that you've created will not be deleted. However, if you have modified files in the 'ctests' folder, they will be reverted to the distribution version.
This installation has been checked with Windows 10 running on an HP Intel Core i5-6200U (64-bit, 8 GB RAM). It should work fine with Windows 11; please check it and report!
First install MAMP or XAMPP
If you try to run the installer, it will first check that a local Apache server (MAMP or XAMPP) has been installed. Both are suitable since the Bol Processor interface contains exclusively HTML, PHP and JavaScript code. No database is required.
If you choose the (free) MAMP version, both MAMP and MAMP PRO will be installed, and the interface will occasionally prompt you to "upgrade" to MAMP PRO. But you don't need it for the Bol Processor!
For MAMP on Windows, the "htdocs" folder is in "C:\MAMP". For XAMPP on Windows, the "htdocs" folder is in "C:\xampp". (To be verified)
If you want Apache to start automatically when you start your computer, this process is easy with MAMP, but a bit more complex with XAMPP: try this method.
Create the 'bp.exe' console
After installing MAMP or XAMPP, run the installer "BolProcessorInstaller.exe" downloaded here. If both MAMP and XAMPP are installed (a bad idea!), the installer will choose MAMP.
Now start MAMP or XAMPP and point your browser at localhost/bolprocessor/php/. This will display the home page of Bol Processor BP3.
If you see this image at the top right of the page, the console is ready. Click on the lamp if you prefer to use the light mode for the interface.
You can ignore the next section. 😀
Compile the 'bp.exe' console (if necessary)
The Windows installation of Bol Processor includes the pre-compiled console (named 'bp.exe'). If, for some reason, the console is not responding, or if you have modified its source code (in the source/BP3 directory), you may need to recompile it.
If you see this image at the top right of the page, your life will be easy!
All you have to do is click on the link to compile the console, which will take a minute or two.
If the frame says that 'gcc' is not responding (see picture) you need to install MinGW. This is the main drawback of Windows: its default installation does not handle 'gcc' (the standard C compiler). You need 'gcc' to compile the Bol Processor console, and perhaps other applications to come. So, install MinGW, carefully following instructions on this page. It is simple, but you shouldn't miss a step!
Once 'gcc' is responding, reload the Bol Processor home page and click on the link to compile the console.
Install Csound
Csound is not required to run the Bol Processor, as you can work with MIDI files and real-time MIDI. However, it will give you access to a different approach to sound synthesis.
The BP3 interface will be able to figure out the location of "csound.exe" and fix its path accordingly. If it does not respond, you will be asked to change the path and perhaps the name of the Csound console (see image). Once it works after a modification, please contact us so that we can update the default paths and names of Csound in your installation of Windows.
To update the Bol Processor console, its PHP interface and examples (the contents of the "ctests" folder), simply rerun the installation. It will download and install the latest versions of the software and data. It will delete and replace the compiled "bp.exe" console.
The installer will not modify or delete any data you have created in the "ctests" folder or outside it. However, if you have modified a sample file without changing its name, it will be reverted to its distribution version.
The installer will also preserve the "_settings.php" file (if it exists), which contains your project settings.
Security
You are right to be concerned about security. Can you be sure that you have downloaded the correct version of "BolProcessorInstaller.exe"?
The size of this file is exactly 1810925 bytes and its MD5 is 03b53cf9a18e1e382bc7bd8e8d046a7a. You can calculate the MD5 on this page. These numbers will indeed be subject to change with the release of new versions of the installer. The current version is dated 26 November 2024.
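If you prefer to check the download from Windows PowerShell, the built-in cmdlets can be used (this assumes the file is in your Downloads folder):
Get-Item $HOME\Downloads\BolProcessorInstaller.exe | Select-Object Length
Get-FileHash -Algorithm MD5 $HOME\Downloads\BolProcessorInstaller.exe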
You may also want to know all the details of how it works. Geeks may want to customise it for their own use. Just download this folder which contains the source files (installer.ps1 and setup.iss) along with a summary of how to build the installer.
For readers not conversant with Windows PowerShell, the following is a description of the process in human language (a simplified PowerShell sketch of the same pattern is given after the list):
Check that an Apache server (MAMP or XAMPP) is installed by looking for either MAMP\htdocs or xampp\htdocs on the computer (not case-sensitive). If neither is found, exit with a warning that MAMP or XAMPP should be installed.
Download the latest distribution files from GitHub:
https://github.com/bolprocessor/bolprocessor/archive/graphics-for-BP3.zip
https://github.com/bolprocessor/php-frontend/archive/master.zip
https://github.com/bolprocessor/bp3-ctests/archive/main.zip
Unzip these three files. They create folders with the names:
bolprocessor-graphics-for-BP3
php-frontend-master
bp3-ctests-main
Create a folder named "bolprocessor" (if it does not yet exist) inside the "htdocs" folder of the Apache server
Copy bolprocessor-graphics-for-BP3/source to htdocs/bolprocessor/ (if there is already a "source" folder, delete it first)
Copy bolprocessor-graphics-for-BP3/Makefile to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/BP2_help.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/Credits.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/BP3-To-Do.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/License.txt to htdocs/bolprocessor/
Copy bolprocessor-graphics-for-BP3/ReadMe.txt to htdocs/bolprocessor/
Copy bolprocessor/php/_settings.php to bolprocessor/ (if it exists)
Copy php-frontend-master/php to htdocs/bolprocessor/ (if there is already a "php" folder, delete it first)
Copy bolprocessor/_settings.php to bolprocessor/php/ (if it exists)
Create a folder htdocs/bolprocessor/csound_resources if it does not yet exist
Copy the content of php-frontend-master/csound_resources to htdocs/bolprocessor/csound_resources; files that already exist should be replaced with their updated versions
Create a folder htdocs/bolprocessor/ctests if it does not yet exist
Copy the content of bp3-ctests-main to htdocs/bolprocessor/ctests; files that already exist should be replaced with their updated versions
Delete the temporary download directory
Replace htdocs/bolprocessor/bp.exe with the updated version.
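For illustration only, here is a minimal PowerShell sketch of the download/unzip/copy pattern described above. It is not the actual installer.ps1 (download that above if you want the real thing), and the $tmp and $docs variables are hypothetical names:
# Sketch of one step: fetch the PHP front-end and replace the "php" folder
$tmp  = Join-Path $env:TEMP "bp3_install"        # temporary download directory
$docs = "C:\MAMP\htdocs\bolprocessor"            # or C:\xampp\htdocs\bolprocessor
New-Item -ItemType Directory -Force $tmp, $docs | Out-Null
Invoke-WebRequest "https://github.com/bolprocessor/php-frontend/archive/master.zip" -OutFile "$tmp\php.zip"
Expand-Archive "$tmp\php.zip" -DestinationPath $tmp -Force
Remove-Item "$docs\php" -Recurse -Force -ErrorAction Ignore    # delete the old "php" folder
Copy-Item "$tmp\php-frontend-master\php" "$docs\php" -Recurse  # install the new one
Remove-Item $tmp -Recurse -Force                               # delete the temporary directory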
Uninstall the Bol Processor
Uninstalling the Bol Processor, together with all the data downloaded or created for its use, is very simple: delete the "htdocs/bolprocessor" folder.
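For example, from PowerShell (assuming a MAMP installation; adjust the path for XAMPP):
Remove-Item -Recurse -Force "C:\MAMP\htdocs\bolprocessor"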
If you need XAMPP to start automatically after a reboot, here is a basic way to do it using systemd, which is the init system of most Linux distributions:
1) Create a systemd service file
Open a text editor to create a new file, for example:
sudo nano /etc/systemd/system/xampp.service
Add the following content to the file:
[Unit]
Description=XAMPP Control Panel
After=network.target
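The file must also tell systemd how to start and stop XAMPP. Here is a minimal completion, assuming XAMPP is installed in its default /opt/lampp location:
[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/opt/lampp/lampp start
ExecStop=/opt/lampp/lampp stop

[Install]
WantedBy=multi-user.target
Then reload systemd and enable the service so that it runs at boot:
sudo systemctl daemon-reload
sudo systemctl enable --now xampp.service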
In May 2024, the Bol Processor BP3 acquired the ability to manage the input and output of MIDI events. This allows it to "communicate" in real time with external MIDI devices (keyboards, synthesizers) and even with other instances of BP3 running on the same machine.
For geeks and programmers: this feature had already been implemented in the earlier (MacOS-only) version called 'BP2'. However, implementing it in a C-language 'console' that works in the MacOS, Linux and Windows environments was a more technical task. In addition, the concept of "real time" in the current MIDI setup is different from the previous one, which relied on the Opcode Music System.
The following examples work the same way in MacOS, Windows and Linux. They have been tested on a recent MacBook running MacOS (Sonoma) with 16 GB RAM, and on an HP Intel Core computer with 8 GB RAM running Windows 10 (64-bit) and Linux Lite 7.0 (based on Ubuntu). Memory size can become critical if many MIDI devices or virtual ports are connected.
Using microtonal scales is now possible in real-time MIDI. Read the Check MIDI microtonality page for details.
Setting up the MIDI environment
Let us assume that you have successfully downloaded, installed and compiled the Bol Processor BP3, as described on the page Bol Processor ‘BP3’ and its PHP interface.
In Bol Processor jargon, a 'project' is either a grammar (with a '-gr' prefix) or a set of data (with a '-da' prefix). So, create or load a simple project, e.g. "-da.acceleration" which can be found in the "ctests" folder (download it here).
An output
By default, a project is set up to create MIDI files, as shown on the selector (see picture). Make sure your project is working! Then select Real-time MIDI and click SAVE format.
The selector will now display a different image, as shown below:
By default, the MIDI output used for sending events is numbered '0' and the MIDI input used for receiving events is numbered '1'. This is a common situation. In MacOS and Windows, these numbers are taken as 'ports'. In Linux they are treated as 'clients', each 'client' having its own 'ports', so numbers '0' and '1' almost certainly won't work… Never mind this issue: BP3 will take care of it when scanning real or virtual devices and trying to connect. Read more below.
We cannot rely on "port numbers" alone because they change when MIDI devices connected to the computer are switched on and off. In Linux, the client number is more specific to a MIDI device. In fact, the only reliable identification is the device's name, which is empty by default: it is the field to the right of the input/output number.
Let us check the MIDI output. Windows does this automatically. The good news is that Windows 10 (and presumably later versions) comes with a built-in MIDI device called Microsoft GS Wavetable Synth. The Bol Processor will automatically detect it and connect to it if no other device is connected to the system.
Linux also connects, by default, the output to a virtual device whose client number is '0', but it won't produce any sound in the basic installation of Ubuntu. So, to try real-time MIDI on Linux, you need to connect an external MIDI device via its USB/MIDI interface, or to install a software synthesizer. Read more below.
Clicking Add an input will create fields for you to select an input device. We'll use this later.
To connect external MIDI input/output devices to Windows, you may need to install an environment similar to IAC on MacOS. Read details below. However, most tests shown on this page can be performed on Windows without any additional installation.
The following paragraphs are for MacOS users. Windows and Linux users can happily jump to the next section.
Turn on a MIDI device (synthesizer, piano, etc.) connected to the computer. On my personal Mac, I usually use the Pianoteq synthesiser, which produces a physical-model synthesis of various keyboard instruments. It communicates with BP3 via Apple's Inter-Application Communication (IAC) driver — read this if you need details.
The IAC driver is installed by default on recent MacOS systems. (It is a part of the CoreMIDI framework provided by Apple.) It allows you to create virtual MIDI ports that enable MIDI applications to communicate internally within the same machine. Equivalent devices exist in the Linux and Windows environments, see below.
The IAC also communicates with external MIDI devices via USB ports, Bluetooth and possibly other network protocols. We'll try it later.
To set up the IAC, run the Audio MIDI Setup application (in the Utilities folder). Ask it to show "MIDI Studio". On my personal computer, it looks like this: the IAC driver, the Pianoteq synthesiser, a Pocket Key 25 keyboard connected to a USB port, and a Yamaha piano connected by standard MIDI cables and a USB MIDI interface. The Yamaha piano appears grey because it is switched off.
On active MIDI devices you will see triangles indicating input/output ports. These are used to connect devices directly by drawing a 'cable' to connect them. We don't need to use these 'connectors' as BP3 communicates via the IAC MIDI ports.
Check output options
(MacOS, Linux and Windows)
The easiest way to proceed now is to run any project in the Real-time MIDI mode, and see if sounds are produced… Whatever the result, at the end of the (potentially silent) performance, you will see a Show all … messages button along with a blinking red signal "=> 1 warning". Click on the button to read detailed explanations of the failure (or success):
🎹 Setting up MIDI system
MIDI output = 0: “Bus 1” 👉 the number of your choice
MIDI settings saved to ../temp_bolprocessor/trace_974dd9ab22_-gr.tryTimePatterns_midiport
🎹 Name(s) of MIDI input or/and output changed and will be updated when saving the page of your project
(For MacOS users)
This all makes sense given the Audio MIDI Setup shown above. The Bol Processor scanned all output (and input) MIDI ports. Given port '0' as an output by default, it assigned it to "Bus 1", which is the 'port' set up in the IAC.
If your synthesiser happens to be connected to "Bus 1", you will hear sounds and the problem is solved. Let us suppose that you are running Pianoteq and hear nothing. Open the settings of Pianoteq and select "Devices". All you have to do is check "IAC Driver Bus 1". You might also check other inputs, including Pocket Key 25 if you want to connect your small keyboard directly to Pianoteq, but these are extra procedures.
Opening the Pianoteq settings informed us that it is communicating with the IAC, and it suggested using an IAC 'Bus' for this communication. The 'Bus 1' port is technically called a virtual port.
All you need to do now is ensure that the connection remains correct when more devices are switched on/off and MIDI port numbers change. The only reliable way is to write the name "Bus 1" as the MIDI output. You can also write "Pocket Key 25" (or whatever is detected as your input MIDI device) as the MIDI input, as we will use it later. Note that MIDI port numbers are now irrelevant, as names take precedence: BP3 will correct them automatically.
Click the SAVE MIDI ports button to store this setting. Clicking on the SAVE format does the same thing, so don't worry about confusing buttons!
To the right of the MIDI port name is an empty field where you can enter a comment. For example, write "Pianoteq synth" to the right of "Bus 1".
Let us now switch on a Yamaha piano which is connected via a USB MIDI interface. The interface I use has a green light that indicates it has power. If the piano is actually communicating with it, we should see a flashing red light. In MacOS, sometimes it is necessary to restart the computer after switching on the piano… But in Windows and Linux the red light flashes immediately.
As soon as the red light flashes, open the Pianoteq settings. Great! We can now see that the Yamaha piano is recognised and connected to the IAC.
The easiest way to connect the Yamaha piano to BP3 is to click PLAY. Whatever happens, we'll get a warning and see the following diagnosis:
🎹 Your real-time MIDI settings:
MIDI output = 0: “Bus 1” -
MIDI input = 1: “new input” -
🎹 Setting up MacOS MIDI system
MIDI output = 0: “Bus 1” 👉 the name of your choice
Trying to assign ports to 1 input(s) without names but possibly with numbers
MIDI input = 1: “Bus 2” 👉 the number of your choice
MIDI input 1 makes BP3 interactive
🎶 More MIDI output options are available:
MIDI output = 1: “Bus 2”
MIDI output = 2: “Pocket Key 25”
MIDI output = 3: “USB MIDI Interface”
🎶 More MIDI input options are available:
MIDI input = 0: “Bus 1”
MIDI input = 2: “Pocket Key 25”
MIDI input = 3: “USB MIDI Interface”
MIDI settings saved to ../temp_bolprocessor/trace_974dd9ab22_-gr.tryTimePatterns_midiport
🎹 Name(s) of MIDI input or/and output changed and will be updated when saving the page of your project
The MIDI input identified as "Pocket Key 25" is correctly connected to port '2'. But the Yamaha piano is identified as "USB MIDI Interface". This is the name we need to copy into the MIDI output field, then SAVE MIDI ports and PLAY. Another option is to leave the name empty and enter the MIDI output number '3'.
We hear the output on the Yamaha piano, even though the port numbers shown on the interface were incorrect. The inconsistency is resolved by the MIDI driver, which selects ports by name in order of priority. Port numbers (and names) are updated as soon as you save or reload the project (data or grammar). Then you get:
Why does the name "Yamaha piano" appear in Pianoteq settings, but not in the MIDI ports scanned by BP3? This is a mystery that expert users of a MIDI studio could probably solve… For the time being, just write "Yamaha piano" in the comment field at the right of "USB MIDI interface".
The _part() command
An important feature implemented in October 2024 is the ability to send parts of a Bol Processor score to separate outputs. Parts are identified by the "_part(x)" command, in which 'x' is an integer in the range 1..12. We will be able to handle more than 12 parts if it turns out to be necessary.
The "_part(x)" command directs MIDI messages to a specific MIDI output, which in most cases will be an instrument. The image on the side shows the mapping of port #3 (USB MIDI interface) to part #2, as set up in its filter.
By default, MIDI outputs "hear" all 12 parts, but here we've restricted this one to part #2.
For MIDI port #0 (Bus 1) we've restricted the output to part #5.
Let us play the following score:
C3 D3
This sequence of notes is heard on both instruments. As there is no "_part()" command in the score, all outputs send the MIDI messages.
Now let us try:
G2 _part(5) C3 _part(2) D3
Note G2 is heard on both instruments. But, as expected, the note C3 is heard on Bus 1 and D3 is heard on the USB MIDI interface.
The "_part()" command has exactly the same syntactic behaviour as "_chan()" and "_ins()". For example, it "follows" the score along the fields of polymetric structures:
In this example, G2 is heard on both instruments. Then C3 is sent to Bus 1, as are D3 and E3, since they form the first field of the polymetric structure. At the same time, B2 is sent to the USB MIDI interface, then A2 to Bus 1. At the end of the polymetric structure, F3 is sent to Bus 1, which was the mapping before the structure. Finally, G3 is sent to the USB MIDI interface.
The sound-object graph shows that D3 and B2 are played together, although on different instruments, and E3 and A2 are played together on the instrument connected to Bus 1.
Parts are mostly relevant when importing digitised scores. They are used to declare instruments in MusicXML files. When importing a score, the Bol Processor will optionally place "_part()" or "_chan()" commands in the imported score, so that it can be played on the same set of digital instruments.
Using "_part()" is a better option than "_chan()" to name an instrument, because MIDI channels can be modified to handle microtonal adjustments. On the Data page, there is a MANAGE _chan(), _ins(), _part() button that opens a dialog for converting parts to/from channels, parts to/from instruments, etc.
An input
Setting up an input follows exactly the same protocol as setting up the output. For example, we can set up the input on "Pocket Key 25” as shown above. “USB MIDI Interface” (the Yamaha piano) is another possible choice. Let us continue with Pocket Key 25.
Windows users can simply plug their external MIDI keyboard (e.g. "Pocket Key 25”) to a USB port of their computer, as it will be automatically recognised and set up by the system.
Connecting an input to BP3 is of little interest if BP3 does nothing with input events. The instructions it can handle are listed in the section List of scripts for dealing with real-time MIDI below. "Wait for note…" means that BP3 will stop playing until it receives a NoteOn of the note in question — even with velocity zero.
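With the syntax listed below under "Input scripts", the instruction placed at the very beginning of the data would read something like this, with the rest of the score following it unchanged:
_script(wait for C3 channel 1)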
The script command specifies that the performance should start when note C3 is received on MIDI channel 1. To avoid any confusion about octave numbers, I have written the note name on the lowest key of my Pocket Key 25 (see photo). The confusion is made worse by the fact that the Italian/Spanish/French convention uses lower octave numbers!
So, the labelled key is the one we need to press to start this show. Let's try it…
When the PLAY button is clicked on the Data page, a flashing STOP button is displayed. The machine will wait forever until the correct MIDI event is received. The STOP button — or the PANIC button at the top right — can be used to abort the process cleanly. If all goes well, pressing the C3 key should produce this sound:
(This little "acceleration" piece was composed by Harm Visser to illustrate the period notation. Read his tutorial.)
Multiple interruptions are of course possible. Try this:
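Here is a sketch of such a score; the notes are placeholders, and only the two script instructions matter:
_script(wait for C3 channel 1) C4 D4 E4 _script(wait for C4 channel 1) F4 G4 A4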
Now the machine will start its performance after receiving a NoteOn of C3. It will then stop after three beats and wait for a NoteOn of C4. A noteworthy detail is that one second after an interruption, AllNotesOff is sent to all MIDI channels and the pedals are set to off. This prevents notes waiting for their NoteOff from being heard. This "All Notes Off" feature can be turned off in the preferences file.
MIDI input filters
Let us play with the continuous improvisation "Mozart’s musical dice game" (called "-gr.Mozart" in the "ctests" folder). If this project is set for real-time MIDI, the improvisation will not stop until we click on the STOP or PANIC button. Inserting a "wait for note…" at the beginning would of course stop the performance at the beginning of every variation. Beware that you will have to write "do2" instead of "C3" due to the note convention!
But let's try something else, using the external keyboard (the Pocket Key 25 or Yamaha piano) to play notes on top of the performance. How strange! We don't hear any notes played on the external keyboard unless it's connected directly to the output device.
The reason for this becomes clear after clicking on the FILTER button for MIDI input 2:
All types of MIDI events are listed along with how they are processed by BP3. Here we are only interested in NoteOn/NoteOff events. The default setting is '1', which means that they can trigger script commands, but are not forwarded to the MIDI output. This is why 'C3/do2' was able to start the performance, although we could not hear it.
To play notes over the performance, we need to set the status of NoteOn and NoteOff to '2'. Note: If we only set the NoteOn status, BP3 will automatically set the NoteOff status to avoid confusion. Once you have changed the settings, click SAVE MIDI ports, then PRODUCE ITEM(S).
Since the Pocket Key 25 keyboard only sends NoteOn/NoteOff messages, we could just as well set the other event filters (KeyPressure, etc.) to '0'.
These filter settings are stored, together with the MIDI port names or numbers, in a temporary file whose name depends on both the session number (created by your browser) and the project name. A copy of these settings is stored in the (permanent) folder "midi_resources". This storage makes it possible to launch several instances of BP3 on the same browser or on different browsers, as we will now see.
Several BP3s performing and communicating
From the previous description of interactions via MIDI events — limited for the time being to waiting for a particular note — you may have guessed that a great feature of the Bol Processor BP3 environment is the possibility of running several BP3s, on different machines, or even on a single machine and the same browser… in cooperation with real humans playing MIDI instruments!
Each instance of BP3 can be thought of as a 'musician' with their own compositional skills embedded in a grammar or data (a set of pre-composed musical fragments). We are working on interactions that will allow each musician to modify the behaviour of another musician's grammar, for example by changing rule weights — which may result in some rules being suppressed while others are activated — or changing metronome settings if they need to perform faster/slower, etc. All these features were part of earlier versions (BP2) several decades ago!
Let us start with an extremely simple example using the "wait for note…" script.
Create two projects that contain only data, for example "-da.Beatrix" and "-da.Alan":
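The following sketch only illustrates the idea: the melodies are invented, and the leading '2' in each polymetric expression is assumed to set its duration to 2 beats.
// -da.Beatrix
{2, C4 D4 C4 E4}
// -da.Alan
_script(wait for E4 channel 1) {2, - F3 G3 A3}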
Note that these melodies do not contain the same number of notes, but they will have the same duration (2 beats) because of their polymetric structures.
We want Alan's performance to start precisely after the last note of Beatrix's performance. As we don't want E4 to overlap with F3, we have put a silence '-' before F3. Further below, we'll see a solution that overcomes this limitation.
To manage the interaction in MacOS, we need an additional IAC port, which is (automatically) named "Bus 2". To do this, open Audio MIDI Setup and click on the IAC driver, then add a port (see picture). You can create as many ports as you wish.
Set both Beatrix's MIDI output and Alan's MIDI input to "Bus 2".
Now we want to hear both performances. Alan's MIDI output is sent to "Bus 1" and will therefore be audible on the Pianoteq synthesiser.
Windows and Linux users can connect the two performers more easily: send both Beatrix's and Alan's messages to the external MIDI device, and connect Alan's input to the same MIDI device. But the input filter must receive events without forwarding them to the output (which is the same device), otherwise the loop would produce a disastrous bouncing effect!
Back to MacOS, there are two ways to send Beatrix's performance to the Pianoteq synthesiser:
Pianoteq settings make it possible to listen to both the "Bus 1" and "Bus 2" virtual ports.
You can set up the MIDI event filter on Alan's project to route input NoteOn/NoteOff events to the current MIDI output. See above for filters.
To play the performance, click PLAY on Alan's project so that it is ready to perform. Then click PLAY on Beatrix's project. This is the result:
No doubt this sounds rather unmusical! In fact, we publish tasteless technical examples to encourage musicians to compose interesting pieces! 😀
Using out-time inaudible notes as signals
The idea of beginning Alan's performance with a silence that is filled by Beatrix's final note E4 is inelegant. Below is a better solution:
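Continuing with the sketch above (the melodies are still placeholders), Beatrix now ends with an inaudible out-time signal, and Alan no longer needs the leading silence:
// -da.Beatrix
{2, C4 D4 C4 E4} {_vel(0) <<C0>>}
// -da.Alan
_script(wait for C0 channel 1) {2, F3 G3 A3}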
The secret is the expression "_vel(0) <<C0>>", which is an out-time occurrence of note C0 with velocity zero. The velocity ensures that the note won't be heard, and the out-time property gives it a null duration. Any note can be used here, provided that it is mentioned in the "_script(wait…)" instruction.
If "_vel(0) <<C0>>" is followed by more notes, it is necessary to reset the velocity to its default value. The solution is to write it between curly brackets, so that _vel(0) only applies to the content of the expression: "{_vel(0) <<C0>>}"
Checking the time accuracy
Let us check that real-time synchronisation is not affected by delays. We'll now ask Alan and Beatrix to play the same piece of music (one octave apart) at the same time.
In MacOS, Alan will send MIDI events to "Bus 2". Beatrix will listen to "Bus 2" and send MIDI events to "Bus 1" (Pianoteq). Beatrix will set her input filter to the 'pass' option, routing the incoming events to the output. The Pianoteq synthesiser will be set to listen to "Bus 1" only.
To start the performance, first click on the PLAY button of Beatrix's project, then on the PLAY button of Alan's project.
Have you noticed that Beatrix is waiting for E3, which does not appear in Alan's score? Oh yes, it does! There is a _transpose(12) command that changes E2 (the first note) to E3. So, it works. This is the performance:
Not too bad? Despite the lack of musical interest, we must admit that the superimposition is technically acceptable, even if it is not perfect: there is a delay of about 60 milliseconds on the first note, the time it takes Beatrix's machine to detect that it has received a NoteOn for C3. The subsequent notes are programmed to compensate for this delay, but there are still discrepancies (which can be quantified on the Pianoteq MIDI input). They seem to be caused by delays outside BP3.
You can adjust the delay in Beatrix's project settings "-se.Beatrix". There is a parameter called "Sync delay", which is the number of milliseconds Beatrix's output events should be postponed after the synchronisation. We currently find that 380 ms is a good value.
In fact, the superimposition would sound even better if both performances were triggered by the same event, such as the conductor pressing a key on the external keyboard. This exercise was only intended to show that synchronisation between "virtual musicians" works well.
Working with multiple MIDI inputs
In the previous example, we could decide that Alan's performance will start when he receives a particular note from the Pocket Key 25 keyboard. In this case, we need to click on both START buttons, putting both 'musicians' in wait mode, and the performance will not start until the correct key is pressed on the keyboard.
This case is manageable with a single input on each instance of BP3. More complicated cases, however, require external 'actors', such as a Pocket Key 25 keyboard sending synchronisation messages to all the 'musicians', or messages modifying parameters in grammars, changing the metronome value, etc.
To achieve this, the Bol Processor is able to manage multiple MIDI inputs.
The new game is as follows: both Beatrix and Alan will take turns playing variations of Mozart's musical dice game (see '-gr.Mozart'), one octave apart. They will use the Improvise mode to continue throwing the dice and creating unheard variations. But they will wait for a signal from the other to start playing a new variation.
In short, both musicians will use the same grammar, with only a small change for mutual synchronisation. Their settings must be carefully adjusted:
Select Italian/Spanish/French as a note convention
Check Non-stop improvise
Adjust Pclock = 3 and Qclock = 11 to get the same metronome speed of 220 bpm
Set Sync delay to 380 ms
We don't want both musicians to repeat the same variations. So, set the Seed for randomization to different values, for instance '1' and '2'. Or set it to zero to instruct the machine to seed the random sequence with an arbitrary number of its own choice.
In the current version of BP3, the easiest way to send a signal is to send a note with a velocity of zero, which will therefore go unheard. So we need to change the grammar to add these particular notes.
In fact, the note used as a signal should never occur elsewhere in the score, so that the signal is really sent at the end. This is easy with Mozart's game: for example, we can use C# (do#) for the synchronisation. Below are the tops of the grammars used by Beatrix and Alan.
Beatrix '-gr.Beatrix':
-se.Beatrix
ORD
gram#1[1] S --> _script(wait for do#3 channel 1) _vel(80) A B _vel(0) do#2
gram#1[2] A --> A1 A2 A3 A4 A5 A6 A7 A8 A1 A2 A3 A4 A5 A6 A7 A'8
gram#1[3] B --> B1 B2 B3 B4 B5 B6 B7 B8 B1 B2 B3 B4 B5 B6 B7 B8
-------------------
LIN [Select rules randomly and apply from left to right]
etc.
Alan's '-gr.Alan':
-se.Alan
ORD
gram#1[1] S --> _script(wait for do#2 channel 1) _vel(80) _transpose(-12) A B _vel(0) do#4
gram#1[2] A --> A1 A2 A3 A4 A5 A6 A7 A8 A1 A2 A3 A4 A5 A6 A7 A'8
gram#1[3] B --> B1 B2 B3 B4 B5 B6 B7 B8 B1 B2 B3 B4 B5 B6 B7 B8
-------------------
LIN [Select rules randomly and apply from left to right]
etc.
Again, we put do#4 at the end of Alan's performance because it is played as do#3 (one octave lower) due to the _transpose(-12) instruction.
Now we need to set up the MIDI inputs and outputs. Beatrix will send events to "Bus 1" which is the Pianoteq synthesizer. She will receive events from "Bus 2", use them for synchronisation, and forward them to the output.
Alan will send events to "Bus 2" and listen to "Bus 1" for the synchronisation.
This is all perfect on paper, but who is going to start? We have created a chicken and egg situation, so we need a superpower to start the process! Actually, a real human pressing the do#2 key on a Pocket Key 25 keyboard will do.
The interface has an Add an input button. We click it on Alan's project and paste the name Pocket Key 25. We also use the comment fields to record the use of each port:
To start the concert, we'll click START on both projects. The order is irrelevant. Then we'll press a key on the Pocket Key 25. Which key?
If we press the do#2 key, we will certainly trigger Alan's improvisation and the cycle will start. But if we press the do#3 key, nothing will happen because the filter of the Pocket Key 25's input, by default, does not transmit NoteOns to the output. So Beatrix won't hear it… By setting NoteOn to status '2' (treat and pass) on this filter, it will be possible to decide who will start the performance: do#2 for Alan and do#3 for Beatrix.
Here we go (starting with Alan):
👉 This simple show should convince musicians to create "virtual bands" of BP3s playing different grammars and sending specific synchronisation signals according to which variation has just been produced. Along with human performers who join in the fun!
The "virtual musicians" can be on the same computer or remotely connected via network (BlueTooth) or MIDI cables and USB interfaces. If they are on the same computer, they can be run on different browsers and/or the same browser. In the latter case, BP3 will not allow the same project (grammar or data) to be run in the same session. In general, just like human musicians in an orchestra have individual scores, it makes sense that virtual musicians don't share the same project…
The number of MIDI inputs and outputs in a project is currently limited to 32. It is very unlikely that a (human) musician will need more!
Synchronise to a sequence (or list) of notes
The following expression
gram#1[1] S --> _script(wait for C3 channel 1) - _script(wait for E3 channel 1) etc.
synchronises the production to the sequence of notes C3 E3 (whatever the duration and velocity). This creates interesting situations where a "virtual musician" is expected to start playing after receiving a signal (C3) from one partner, then a signal (E3) from another partner.
Note that there is a silence '-' between the two script instructions. If there is no silence, then BP3 will resume playing if either C3 or E3 has been received.
Remember that because of the MIDI channel specification (range 1 to 16), the detection of signals can be very selective. They are also inaudible when transmitted by NoteOns with velocity zero.
Crashing the band!
In the example of Alan & Beatrix playing Mozart, the connection seems to create a loop: Beatrix sends events to Pianoteq and Alan (bus 1), who in turn sends events to Beatrix (bus 2). Isn't that dangerous?
The reason it doesn't crash is that Alan's input "fromBeatrix" (Bus 1) is filtered: NoteOns are received and processed (for synchronisation), but not passed to the output (Bus 2), i.e. to Beatrix. You can try changing the filter of input "Bus 1" on Alan's project, setting NoteOns to status '2' (treat + pass): you would once get a superb crash after a flood of notes! ➡ This shouldn't happen any more, because BP3's MIDI drivers have been equipped with an anti-bouncing mechanism.
Working with multiple MIDI outputs
The Bol Processor currently accepts up to 32 MIDI inputs and outputs.
Example of a project using two inputs and two outputs:
The procedure for adding outputs is the same as for adding inputs: click on the Add an output button, then enter the name of the MIDI device if you know it exactly; otherwise leave it blank and let the machine connect it by default to the next available output, while suggesting other options:
🎹 Your real-time MIDI settings:
MIDI output = 0: “Bus 1” - Pianoteq
MIDI output = 3: “USB MIDI Interface” - Yamaha piano
MIDI input = 1: “Bus 2” - from Alex
MIDI input = 2: “Pocket Key 25” - a small keyboard
🎹 Setting up MacOS MIDI system
MIDI output = 0: “Bus 1” 👉 the name of your choice
MIDI output = 3: “USB MIDI Interface” 👉 the name of your choice
MIDI input = 1: “Bus 2” 👉 the name of your choice
MIDI input 1 makes BP3 interactive
MIDI input = 2: “Pocket Key 25” 👉 the name of your choice
MIDI input 2 makes BP3 interactive
🎶 More MIDI output options were available:
MIDI output = 1: “Bus 2”
MIDI output = 2: “Pocket Key 25”
🎶 More MIDI input options were available:
MIDI input = 0: “Bus 1”
MIDI input = 3: “USB MIDI Interface”
The fact that a MIDI input or output is "available" does not guarantee that it will do what we want it to do. For example, sending MIDI messages to the Pocket Key 25 keyboard will actually do nothing.
Filtering MIDI outputs
In the example above, MIDI output 3 (the Yamaha piano connected to the USB MIDI Interface) has the following filter:
The channel filter specifies that the Yamaha piano will receive events on all MIDI channels except MIDI channel 2. Filtering MIDI channels makes it possible to send events exclusively to particular instruments.
MIDI events can also be filtered by type. The idea is the same as for MIDI input filters, see above.
👉 If you do not hear any sound in real-time MIDI, you may consider checking the output MIDI filters before you kick the piano or screw up its cables!
Using standard MIDI control
MIDI has standard control messages, namely Start, Continue and Stop, which can be used to coordinate multiple "virtual musicians" (instances of BP3). The advantage is the clarity of the data and the grammars programmed for interactions. The disadvantage is that these messages are not assigned to specific MIDI channels. This can be a problem with a large number of "musicians". They also introduce a delay of about 250 milliseconds due to the time it takes for the MIDI device to process them.
Let us look at a trivial example (of no musical interest), again with Beatrix and Alan playing together. This time, they take turns playing their items (simple sequences of notes).
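This is Alan's data, reconstructed here from the revised score given at the end of this section, minus the final Stop script:
E3 D3 C3 _script(MIDI send Start) _script(wait for Continue) A2 B2 C3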
In short, Alan will play three notes (E3 D3 C3), then send a START to Beatrix and wait for a CONTINUE from her; on receipt, he will play the final three notes A2 B2 C3.
This is Beatrix's data:
_script(wait for Start) E4 D4 C4 _script(MIDI send Continue)
Beatrix's project sends its output to "Bus 1", the Pianoteq synthesizer. Its input is connected to the virtual port "Bus 2".
Alan's project sends its output to the virtual port "Bus 2", and its input is connected to "Bus 1".
We'll start the performance with Beatrix. Her machine will wait for a START. Then we'll start Alan's part, which will play three notes, then send a START message to Beatrix, who will play her part and hand back to Alan, via a CONTINUE message, for the final part…
This all sounds logical, but it doesn't work! We do hear Alan's E3 D3 C3, but then nothing… The first reason is that Beatrix must be able to hear Alan's START command, which is no longer a NoteOn as in the previous examples. This means that the filter of her input "Bus 2", from which she receives Alan's MIDI messages, must be set correctly: the Start event should be received (see picture on the side). Also remember that NoteOn and NoteOff should be received and forwarded to the output (the Pianoteq synth).
Well, now we hear Alan's E3 D3 C3 followed with Beatrix's E4 D4 C4, but then… nothing! 😢
Careful analysis is needed to solve the problem. However, this is simple logic. Remember that Alan is playing on "Bus 2", which is not connected to any MIDI device. If we hear Alan's production, it is because it is received by Beatrix on "Bus 2" and then forwarded to "Bus 1" (the Pianoteq synth). The problem is that the final part A2 B2 C3 is sent to Beatrix, but she has already stopped listening because her own data is finished!
You can imagine a band in which one musician plays an improvisation and then gives a signal to another musician to start their own improvisation, but the careless musician has already vacated the place believing that the programme was finished. The solution is to tell the musicians not to go away until they receive a STOP signal. Maybe a signal from a conductor (here using the Pocket Key 25 keyboard), maybe a signal from the musician who is in charge of ending the performance. So we'll tell Alan to send a STOP signal at the end of his performance, and Beatrix to wait for Alan's STOP signal. Below are the revised scores.
// Alan
E3 D3 C3 _script(MIDI send Start) _script(wait for Continue) A2 B2 C3 _script(MIDI send Stop)
// Beatrix
_script(wait for Start) E4 D4 C4 _script(MIDI send Continue) _script(wait for Stop)
The MIDI messages Start, Continue, Stop have been used here to facilitate the reading of scores (or grammars), but these can be replaced by NoteOns with durations and velocities zero on different MIDI channels when working with a larger number of actors.
By the way, using the MIDI messages Start, Continue and Stop can be problematic with physical or virtual MIDI devices. The Yamaha piano, for example, does not transmit these messages: when connected as an input, it only sends 3-byte messages such as NoteOn/NoteOff. In the Windows environment, the Microsoft GS Wavetable Synth does not transmit any messages at all. The best way to exchange messages is via virtual MIDI ports created by "loopMIDI" (see below). In Linux, virtual ports such as 'VirMIDI 0-0' (see below) do not seem to transmit these Start, Continue and Stop messages either.
For geeks: in the Bol Processor, scripts are attached to the sound object that follows them. For example, _script(wait for Start) is attached to note E4 in Beatrix's score. But what about scripts at the end of a score? The secret is that BP3 creates an invisible MIDI event (ActiveSensing) at the end of each item, to which it can attach the final scripts.
List of scripts for dealing with real-time MIDI
The list below will be kept up to date as there are many scripts on the agenda. These instructions are not case-sensitive.
Input scripts
When a note is specified, be sure to use the same note convention as in the project, e.g. C3 or do2 or sa3, etc.
Wait for note channel c
Wait for a NoteOn of the specified note on channel c (1…16)
Wait for Start
Wait for a Start MIDI message (250)
Wait for Continue
Wait for a Continue MIDI message (251)
Wait for Stop
Wait for a Stop MIDI message (252)
Wait forever
Wait until STOP or PANIC button is clicked
Velocity param Kx = note channel c
Set parameter Kx (0 < x < 128) to the velocity (range 0…127) of the specified note on channel c (1…16)
Control param Kx = #y channel c
Set parameter Kx (0 < x < 128) to the value (range 0…127) of MIDI controller #y (0 < y < 128) on channel c (1…16)
Output scripts
Hold for x milliseconds
Delay all subsequent events by the specified duration x (integer).
Send Start
Send Start MIDI message (250)
Send Continue
Send Continue MIDI message (251)
Send Stop
Send Stop MIDI message (252)
Scripts on top of a grammar
(To be continued)
Capture incoming events
The _capture() command allows incoming MIDI events to be recorded to a 'capture' text file. See the Capture MIDI input page for explanations.
Alternatives to IAC
Here are the equivalents of Apple's IAC (Inter-Application Communication) for each system:
Windows environment
On Windows, you can use software like loopMIDI or virtualMIDISynth to create virtual MIDI ports. These tools work similarly to the IAC Driver on macOS:
loopMIDI: Created by Tobias Erichsen, loopMIDI is a popular choice for creating virtual MIDI ports on Windows. It allows you to create and manage several virtual ports which can be used by applications to communicate with each other.
These tools integrate with software applications that support MIDI, providing a seamless way to connect various MIDI applications without needing external MIDI hardware.
Linux environment
On Linux, ALSA (Advanced Linux Sound Architecture) provides capabilities to create virtual MIDI devices through its sequencing API.
snd-virmidi: This ALSA MIDI driver provides virtual MIDI ports for sending and receiving MIDI between applications running on the same system. It's part of the standard ALSA module set and can be configured to provide multiple ports.
To set up virtual MIDI ports on Linux using ALSA, you typically need to load the snd-virmidi module. You can do this by running:
sudo modprobe snd-virmidi midi_devs=2
This command loads the snd-virmidi module and sets it up to provide two virtual MIDI devices (you can increase the number of devices by changing the midi_devs parameter). The virtual ports, namely 'VirMIDI 0-0' and 'VirMIDI 0-1', can then be accessed by MIDI applications on the Linux system. Please note that they do not appear to transmit the Start, Stop and Continue messages.
👉 This is done automatically by the "install_bp3.sh" shell script installing BP3 on Linux/Ubuntu (download here).