Saturday 19 May 2018

Video Tutorial - TuxGuitar and Reaper

The follow-up to my Bass MIDI Sampler video. In this one, I show how to get TuxGuitar and Reaper to communicate via MIDI, so that you can compose in TuxGuitar, but play all your sounds using your own sampled instruments.


So what next?

So the main driver behind doing this was to compose and document my riffs in TuxGuitar (originally Guitar Pro 5, but it doesn't support 8 strings). I found TuxGuitar to be a fantastic free replacement. The learning curve is not too steep. Some stuff is not intuitive (I could do another video on that), and the one feature I'd really like to see added is support for MIDI Time Code (MTC) or Song Position Pointer (SPP).

loopMIDI

loopMIDI is a free tool that creates virtual (loopback) MIDI ports to interconnect MIDI applications on Windows. This allows two or more applications to send and receive MIDI signals from each other - and that's how I get TuxGuitar and Reaper to talk to each other.
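As a rough illustration of what actually travels over that loopback port, here's a minimal sketch of the raw note-on/note-off bytes one application writes and the other reads (plain Python with hypothetical helper names, not anything from loopMIDI itself):

```python
# Raw MIDI voice messages: status byte = message type | channel (0-15).
# A note-on is 0x90 | channel; a note-off is 0x80 | channel.

def note_on(channel, note, velocity):
    """Bytes for a note-on, e.g. what TuxGuitar pushes into the loopback port."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    """Matching note-off, which Reaper receives on the armed track."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Channel 1 on the wire is value 0 ("CH #0" in TuxGuitar).
print(note_on(0, 23, 100).hex())  # 901764 -> note-on, channel 1, B0, velocity 100
```

The loopback port just shuttles these bytes between applications; neither end knows or cares that no physical cable is involved.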


Within Reaper I set the appropriate MIDI channel and arm recording on the track, so that it is now listening for MIDI events on that loopback MIDI channel.

TuxGuitar

I only really looked at TuxGuitar (TG) a few weeks ago. I had low expectations - as much as I love Open Source Software, quite often the quality is less than desired - but not with TuxGuitar. It's a really nice application. Yes, it does some weird stuff. Yes, they don't seem very responsive on their support tickets or forums - but hey, it's all volunteer effort (I should know, I wrote Open Source software for years), and that's just part of the package. Fingers crossed, they think what I'm asking for is important enough to code.


On a brand new TG project, I set up the track to be for a bass guitar. Some of the settings are applicable to the internal MIDI synthesizer (like my selection of the Picked Bass instrument) and don't affect what I want to do in Reaper. As long as I select Channel 1 (CH #0 in TG), then we should be good.
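A trivial way to keep the two numbering schemes straight (hypothetical helper names, just for illustration):

```python
# TuxGuitar numbers MIDI channels from 0 ("CH #0"); Reaper and most DAWs
# display them as 1-16. Same wire value, different label.

def tg_to_reaper(tg_channel):
    return tg_channel + 1

def reaper_to_tg(reaper_channel):
    return reaper_channel - 1

print(tg_to_reaper(0))  # 1 -> TG's CH #0 is Reaper's Channel 1
```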


At this point, you can try it out - you should technically hear it come out of your default speakers using the internal TG FluidSynth [MagicSFver2] synthesizer. This is probably a good test. But if you want it to redirect into Reaper, you need to change this to loopMIDI Port. Not sure why two of them show up, but both seem to work.

Reaper



Once you've done this, you should be able to hear the sampler you set up in Reaper.

Last Words

So the last thing I wanted to show you is how I use this.

I do most of my drum grooves in Reaper, customising stock grooves from MT Power Drum Kit. Because I do this in Reaper, I export to TuxGuitar as I go, and the "master drum beat" is maintained in Reaper. The only tip about doing it this way: you have to create a MIDI clip, select all the notes and change the MIDI channel to Channel 10, otherwise TG doesn't recognise it as a drum beat.
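In Reaper this is a few clicks in the MIDI editor, but at the byte level the remap is simple: note events carry their channel in the low nibble of the status byte, and "Channel 10" is wire value 9. A sketch (my own helper, not Reaper's API):

```python
# Percussion lives on MIDI channel 10 (wire value 9); rewriting the status
# byte's low nibble moves a note event onto it.
DRUM_CHANNEL = 9  # "Channel 10" in 1-based terms

def to_drum_channel(msg):
    status = msg[0]
    if 0x80 <= status < 0xA0:                          # note-off or note-on
        return bytes([(status & 0xF0) | DRUM_CHANNEL]) + msg[1:]
    return msg                                          # leave other messages alone

print(to_drum_channel(bytes([0x90, 36, 100])).hex())  # 992464 -> now on channel 10
```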

The rest I do in TuxGuitar, primarily to maintain the guitar tablature. If I were to re-export out of Reaper, I would lose all that information.

When it's ready to go into Reaper, I do the export and add the MIDI items to my Reaper project. Here are a couple of screenshots not in the video.



Thanks for listening. I hope you've found it useful. Let me know if you have any questions or comments.

Friday 18 May 2018

Video Tutorial - Bass MIDI

Hi everyone - my first video tutorial. I'm using Reaper to DIY sample my own bass, so that I can compose in MIDI and get pretty close to my sound. There is a second video for using TuxGuitar to compose and play in Reaper using this sampler.

If you want to just watch the video, here it is. I'll write up the info below for those who prefer to read about it.



But why?

So why would I bother? Well, when I started using Guitar Pro 5 I really didn't like the sounds. Even the RSE sounds just didn't do it for me. I also found that while composing, simple changes to riffs and bass lines would mean I'd have to re-record stuff and it was time consuming and inefficient, especially in the early stages of composition.

So I started writing in MIDI. But again, even in Reaper or Guitar Pro, when it's not your sound - you can't really get into it.

Where to start?

So I had used Reaper's ReaSamplOmatic 5000 (RS5K) when migrating my Virtual Drum Kit from Ableton to Reaper last year. I knew I could get away with one WAV file for my samples, so I went about creating the sample WAV file.

The WAV file above is just me playing the open strings on my 4 string bass. I tuned to BEAD.
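For reference, those open strings map to MIDI notes and frequencies like this (a quick sketch using the standard A4 = 440 Hz, C-1 = 0 conventions):

```python
# BEAD tuning: open-string MIDI notes (C-1 = 0) and their frequencies,
# using f = 440 * 2**((note - 69) / 12) with A4 (note 69) = 440 Hz.
OPEN_STRINGS = {"B0": 23, "E1": 28, "A1": 33, "D2": 38}

for name, note in OPEN_STRINGS.items():
    freq = 440 * 2 ** ((note - 69) / 12)
    print(f"{name}: MIDI {note}, {freq:.2f} Hz")  # e.g. B0 comes out around 30.87 Hz
```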

The Sampler

The next step is making an RS5K instance for each string.


You can see in the FX chain above, I have an RS5K instance for each string on my bass. I basically set the Attack and Release markers around the open string section that I am mapping. The instance above shows the lowest B string.

To ensure I consistently set up the samples correctly, I usually zoom in to 100 spl/pix and set the Attack marker on the first hit of the waveform. At the same zoom level, I ensure the Release ends before the next sample.

I tried both modes, Freely configurable shifted and Note (Semitone shifted). The second one would have been useful if there was no starting point for each sample (it just tunes the sample however you configure it). But my four samples each start at the open-string note, so raising 25 semitones (seems to be 24+1) on that string makes sense when trying to capture the tone of my bass guitar.

The Note start on my open B string was a B0 (23), and as it's a 24-fret bass, technically the highest note I'd want this instance to play would be B2 (47) - the Note end. There are many ways I could have done this, but this way worked at the time, so I went with it. The Pitch@end setting is the maximum semitone shift this sample should accommodate.
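To sanity-check those numbers, here's the arithmetic as a quick sketch (the names here are mine, not RS5K settings):

```python
# With C-1 = 0, the open B string is note 23 (B0); a 24-fret neck adds 24
# semitones, so this instance covers notes 23..47 (B0..B2).
NOTE_START = 23
FRETS = 24
NOTE_END = NOTE_START + FRETS

# A sample pitched up by n semitones plays back at 2**(n/12) times the speed.
def playback_rate(semitones):
    return 2 ** (semitones / 12)

print(NOTE_END, playback_rate(12))  # 47, and one octave up doubles the rate
```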

Pitch offset got used on the other instances where the tuning of my sample needed to be corrected. Max voices was one I had to play around with. I tried a Max voices of 1, but what happened was, if, say, you played a B0 followed by a B1, this instance of RS5K would evaluate "well, I could play B1, but I'm only allowed a single voice, so I'll accept the MIDI note (and remove it) but won't actually play it" - rather than saying, "well, I can't play it, let me pass it to another string". I reduced the number of voices as the strings got higher to encourage the lower strings to be used, but I don't think that setup is ideal.

I could try and play around with this again, see if I can get it to work - but it's all about order of RS5K instances and order of the MIDI notes you play. If you're slightly out, playing higher notes first, the lower strings will accept them, leaving no room to play the lower notes coming afterwards.
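To see why ordering matters, here's a toy model (my own simplification, not RS5K internals) of a chain of per-string instances where each one swallows any in-range note, even when its voices are used up:

```python
# Toy model of the voice-stealing problem: each "string" instance accepts any
# note in its range; with no free voice it swallows the note silently instead
# of passing it down the FX chain. Ranges are BEAD open strings plus 24 frets.
CHAIN = [              # (string, note_start, note_end, max_voices)
    ("B", 23, 47, 1),
    ("E", 28, 52, 1),
    ("A", 33, 57, 1),
    ("D", 38, 62, 1),
]

def play(notes):
    voices = {name: 0 for name, _, _, _ in CHAIN}
    result = []
    for note in notes:
        for name, lo, hi, maxv in CHAIN:
            if lo <= note <= hi:
                if voices[name] < maxv:
                    voices[name] += 1
                    result.append((note, name))  # played on this string
                else:
                    result.append((note, None))  # accepted but silent
                break                            # note removed from the stream
    return result

print(play([23, 35]))  # B0 plays on the B string, but B1 is swallowed
```

With this accept-and-remove behaviour, the B1 in the B0-then-B1 scenario never reaches the E-string instance that could have played it, which is exactly the limitation described above.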

Remove played notes from FX chain MIDI stream is enabled, because once we've played a note, we don't want other instances playing it again. And lastly, Obey note-offs is enabled, to ensure I can mute strings if I have to.

After the Sampler

If I didn't have the sampler, I would have my bass track with a prototype FX chain to get a sound similar (but cut down) to my production bass tone. In my blog post How I Record my Bass in the Box, I cover off my full-blown bass tone setup. But for the purposes of composing, I can't afford to be that complicated. But I do want something that sounds like my bass.

Bass Setup

So in the track, where I would normally plug in my bass, I redirect the output of my sampler so that I can continue to shape my tone.


After some basics like Compression and Gating, I used a TSE R47 fuzz box, BOD SansAmp clone and some free Ampeg impulse responses. I usually automate the fuzz pedal as I need it, and have the BOD try to be as bassy as possible (not much grit).

There is a part 2 of this post, TuxGuitar and Reaper, which shows how to get MIDI communication between those programs.