I'm very excited to have found this community! Thanks in advance for your help and direction.
Here's my MIDI challenge:
Essentially, I would like a more detailed version of Apple's ChordTrigger or Xfer's Cthulhu.
I would like to be able to select a note and transpose it differently across many different MIDI channels. For example, when I play C3, I would like to send C3 to channel 1, E2 to channel 2, G1 to channel 3, C1 to channel 4, etc.
Preferably, I would like to be able to quickly make these mappings. A learn button for each channel perhaps.
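To pin down the logic being asked for, the one-note-in, many-channels-out mapping could be sketched in ordinary Python (StreamByter itself uses its own scripting language, so this is only an illustration). The FANOUT table, the function name, and the convention that C3 = MIDI note 60 are all assumptions:

```python
# Sketch of the requested fan-out: one incoming note produces a
# transposed copy on several MIDI channels. Table values are placeholders.
# With C3 = 60: C3 -> ch1, E2 -> ch2, G1 -> ch3, C1 -> ch4
# corresponds to offsets 0, -8, -17, -24 semitones.
FANOUT = {1: 0, 2: -8, 3: -17, 4: -24}   # channel -> semitone offset

def fan_out(note, velocity):
    """Return a list of (channel, note, velocity) tuples to send."""
    out = []
    for channel, offset in FANOUT.items():
        transposed = note + offset
        if 0 <= transposed <= 127:        # drop notes outside MIDI range
            out.append((channel, transposed, velocity))
    return out
```

Playing C3 (`fan_out(60, 100)`) then yields the four-channel chord described above, and notes whose transposition would fall below 0 are simply skipped.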
Then I would like to be able to apply a pitch adjustment, pitch randomization, velocity adjust, velocity randomization, delay, and delay randomization to each output.
Also a keyboard GUI that shows the mappings would be INCREDIBLE.
Currently I use many layers of chord trigger to accomplish this (for now, without the parameter adjustments). I play monophonic melodies and my transposition army follows me with chorale-voiced harmonies. Throw in a breath controller and keyboard horns sound REAL! A lot of very cool possibility here. My current setup works, but it's inefficient with memory and difficult to organize and program. I would love to be able to do this from inside a single plugin. If you can help me make this happen you would change my life! (And rid the world of the scourge of cheesy keyboard horns.)
Sounds a bit ambitious, but don't let that put you off. Might be better to start with the simple stuff and build up from there, though.
Definitely not within StreamByter's abilities to show a keyboard GUI I'm afraid - I can tell you that one straight up.
Internally you probably need 12 sets of channel+transposition lookups (one per pitch class, i.e. one for each note of the octave) if I have understood your challenge.
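A rough Python sketch of that internal structure (StreamByter's own syntax would of course differ, and the table contents here are placeholders). Keying the 12 tables by pitch class means the same mapping applies in every octave:

```python
# Twelve lookup tables, one per pitch class (0 = C, 4 = E, ...).
# Each maps MIDI channel -> transposition in semitones. Placeholder data.
LOOKUPS = {pc: {} for pc in range(12)}
LOOKUPS[0] = {2: -8, 3: -17}   # applies to every C on the keyboard
LOOKUPS[4] = {2: -9, 3: -16}   # applies to every E

def outputs_for(note):
    """Look up the table for this note's pitch class and apply it."""
    table = LOOKUPS[note % 12]
    return [(ch, note + offset) for ch, offset in table.items()]
```

So `outputs_for(60)` and `outputs_for(72)` use the same C table, an octave apart, while pitch classes with an empty table pass nothing through.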
Rather than a learn button for programming the channels/transpositions, maybe a note (or number of semitones) dropdown for each channel would work, and you use the last note played to determine which input note you are working with.
Alternatively, you could program the note transpositions from the external MIDI controller itself in real-time using some sort of SHIFT function (a CC or a note): while it is held down, the script switches into 'programming' mode, interprets the first received note as the input trigger note, and then assigns the next notes played to channels 1, 2, 3... until SHIFT is released. Something like that, although I guess that is pretty much your learn functionality.
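The shift-mode idea is essentially a small state machine. A Python sketch of the logic (the CC number, class name, and message shapes are all illustrative assumptions, not StreamByter code):

```python
# Sketch: while a designated SHIFT CC is held, the first note played
# becomes the trigger note, and each subsequent note is assigned to
# channels 1, 2, 3, ... Releasing SHIFT ends programming mode.
SHIFT_CC = 64                     # assumed CC used as the shift control

class Programmer:
    def __init__(self):
        self.mappings = {}        # trigger note -> list of (channel, note)
        self.shift_held = False
        self.trigger = None
        self.next_channel = 1

    def on_cc(self, cc, value):
        if cc == SHIFT_CC:
            self.shift_held = value >= 64
            if not self.shift_held:       # shift released: finish programming
                self.trigger = None
                self.next_channel = 1

    def on_note(self, note):
        """Return the notes to send, or [] while programming."""
        if not self.shift_held:
            return self.mappings.get(note, [])
        if self.trigger is None:          # first note sets the trigger
            self.trigger = note
            self.mappings[note] = []
        else:                             # later notes fill channels 1, 2, ...
            self.mappings[self.trigger].append((self.next_channel, note))
            self.next_channel += 1
        return []
```

Holding SHIFT, playing C3 then E2 then G1, and releasing would leave C3 mapped to E2 on channel 1 and G1 on channel 2, exactly the learn flow described above.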
You might find it easier to break out the other features (pitch adjustment etc.) into separate modules that get fed the output of the transposition army, rather than trying to come up with one all-singing, all-dancing script.
You could use the SB sliders to set the transpositions in semitones for the different accompanying sounds. Note that they're all negative in the proposed example. Then use note on to generate the other notes on different channels. You should probably save the notes that were sent, so they can all be turned off on note off. Assuming you always play monophonically (this can be enforced), it would be pretty simple. However, you'd need to adjust the sliders to produce a different pattern.
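The note-on/note-off bookkeeping suggested above might look like this in Python (a sketch only; the OFFSETS values are placeholders matching the all-negative example). Saving the generated notes means the matching note-offs stay correct even if the sliders move while a note is held:

```python
# Remember which notes were generated on note-on so the matching
# note-offs can be sent later, regardless of slider changes in between.
OFFSETS = {2: -8, 3: -17, 4: -24}   # channel -> semitone offset (all negative)

active = {}                          # input note -> generated (channel, note)

def note_on(note, velocity):
    """Generate the accompanying notes and record them for later note-off."""
    sent = [(ch, note + off, velocity) for ch, off in OFFSETS.items()]
    active[note] = [(ch, n) for ch, n, _ in sent]
    return sent

def note_off(note):
    """Turn off exactly the notes that were generated for this input note."""
    return [(ch, n, 0) for ch, n in active.pop(note, [])]
```

With monophonic playing enforced, `active` only ever holds one entry, but the dictionary also behaves sensibly if two notes briefly overlap.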
Alternatively, you could pre-define patterns for major, minor, dominant 7th, 6th, etc. If 5 or 6 patterns are enough, you could use a controller knob or slider to send a CC to make the selection. With only a few choices, you can scale the knob so it's not hard to hit the spots blind (a nice scale: 0..127 to 0..5, with half steps at the ends). Then, since you're playing mono, you could use your free hand to change the pattern for the next note (that's why I suggested actually saving the notes for note off, so they don't need to be recomputed).
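That knob scaling can be as simple as linear rounding, which naturally gives the half-width zones at the two ends so the first and last patterns sit exactly at the knob's extremes. A Python sketch (the pattern count of 6 is just the example above):

```python
def cc_to_pattern(value, n_patterns=6):
    """Map a CC value 0..127 onto a pattern index 0..n_patterns-1.

    Rounding makes the end zones half-width, so index 0 and the last
    index are centered on the knob's physical extremes.
    """
    return round(value * (n_patterns - 1) / 127)
```

For 6 patterns, values 0..12 select pattern 0, 13..38 select pattern 1, and so on up to pattern 5 at the top of the travel.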
You could extend this further by creating several pattern sets, selected by an SB slider or another controller knob. The styling would all be "canned" but you could have a lot of flexibility in a fairly simple program. And you can edit the data in the script to adapt for a particular song. And, when you save SB in an Audiobus preset, or whatever, it saves the current version of the program. So you can save a preset for a song, with the harmonies tuned to suit.
I wouldn't favor the Learn route, because it's so difficult interacting with a keyboard and GUI at the same time, with minimal feedback. And you quickly need an editing capability, so it becomes very messy. But I guess you could use the SB GUI to select a pattern, and display the current offsets on the sliders. That might work.
Thanks for your replies! I appreciate you putting your energy into this. If I understand you both, I don’t think either setup you’ve suggested will afford me the flexibility I need. I need any of the 88 keys to be able to send any combination of notes to any channel that will change on a patch per patch basis. Controlling switches or sliders isn’t a possibility. I will be using both hands on the keyboard. The setup is aimed at making “unplayable” things playable and also the ability to easily scatter song specific switches and samples within easy playing distance. Let me run some scenarios in which I’ll be using the software:
Programming pad chords (different MIDI channel) on unused notes in a scale, to be triggered with my thumb while the lower fingers in my left hand run a bass line
Remapping the erratic trigger notes of a sampler to a comfortable position on the keyboard
Programming horn sections (each instrument on a different channel) to follow simple melodies, sometimes with added chromatic passing tones and contrary motion, depending on the song (often I’ll reprogram different chord changes to different octaves if it isn’t the same throughout the song)
Using MIDI notes to trigger DMX/video samplers
Currently I’m running 8 instances of MainStage’s chord trigger or Xfer’s Cthulhu, with each one sending to a different sound module. This setup works, and I like how I can choose my mappings. I’m able to turn on the app, touch a note, select a note that I want it mapped to, select the next note I want remapped, select its destination, etc. However, I would also love to be able to sculpt pitch, velocity and timing for a more organic ensemble sound.
When it works, it blows any other keyboard setup out of the water. When it doesn’t work I’m left with high blood pressure and my band is left without a keyboardist at all.
I’ve come to realize that this is not something I can figure out on my own. I’d be happy to commission someone to solve my problem, if there are any developers who are up to tackling it.