Composers' Forum

How to Write Your Music in a Notation Program and Have It Played by a DAW - AT THE SAME TIME

I started writing this up for a Facebook group, but thought it would be worth sharing here...  Happy to answer any questions if folks are interested in setting up their own workflow like this...

John

The Best of Both Worlds
How to Write Your Music in a Notation Program and Have It Played by a DAW - AT THE SAME TIME

A couple of people have asked me about my setup, so I decided to write it up in detail here. I may make a video at some point, though I thought it would be most helpful to write it up first.

WHY NOT JUST A DAW?
There is a gulf between those who write using notation and those who write in a DAW. I am one of those old school guys who just *thinks* in notation, not MIDI data. If I am working on a large score, I want to *see* everything that's going on, both while I'm writing and during playback.

WHY NOT JUST FINALE?
Finale has "human playback," which automates a lot of MIDI functions during playback. I dare say it is pretty good and rather customizable.

However, Finale is very poor at hosting VSTs (virtual instruments), especially any of the current generation (e.g., Spitfire). First, if you are running any version of Finale before the current one, you are using a 32-bit program that is just not able to handle the amount of processing that is needed.

Finale is also very limited when it comes to any sort of secondary audio processing (reverb/EQ/etc.). Unless you do it within the VST itself, you are limited to the three available slots, and any tweaking beyond volume can only be applied to ALL of the sound coming out of Finale.

Finally, Finale is bad at video. There is a sort of syncing/video-hosting capability, but it is extremely buggy and will likely cause crashes. Similarly, I believe you can add a single line of audio to an existing file, but it is wonky and, of course, you can't make any edits to it within Finale.

THE PRINCIPLE
So the idea is to combine the benefits of Finale (notation and decent automated human playback) with the benefits of a DAW (powerful VST hosting and the ability to adjust the audio to your heart's content).

In order to do this:

1) you need to have both programs
2) you need to run both programs at the same time
3) you need to have them talk to each other using virtual MIDI cables

I use LoopBe30, which provides up to 30 virtual MIDI ports and costs less than $20. After installation and a reboot, it runs in the background, and both Finale and Cubase will recognize its MIDI ports. You can tell Finale to send MIDI data to a specific LoopBe MIDI port, and Cubase will be able to receive the data from that same port.
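(If you want to sanity-check the virtual cable before involving Finale or Cubase at all, a few lines of Python with the mido library and its python-rtmidi backend will do it. That library isn't part of this workflow per se; this is just a sketch under those assumptions:)

    # List every MIDI output the system exposes. After the reboot,
    # the LoopBe30 ports should show up in this list.
    import mido

    for name in mido.get_output_names():
        print(name)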

Now, an important caveat: when Finale is running in MIDI mode (as opposed to VST mode), you are limited to only eight MIDI Out ports. As you probably know, each MIDI port has 16 available channels, so this makes for a grand total of 128 MIDI channels available to be sent from Finale to Cubase. This may seem like a lot, but if you try to create an orchestral template where every single instrument has 20-30 articulations on different channels, you will very quickly run out. So to set up an orchestral template under this system, you will have to make heavy use of libraries that allow you to change articulations on the same channel, using keyswitches or something like Spitfire's UACC system (more on why I love their system later).

Right now, my standard orchestral template has each of the eight MIDI ports designated for the following orchestral instrument families:

1 - ETC - ancillary instruments not commonly used
2 - Winds
3 - Brass
4 - Standard Percussion
5 - Keyboards / Harp
6 - Solo Voices & Chorus
7 - Strings (I use Spitfire Symphonic and Chamber for divisi)
8 - Solo Strings
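For what it's worth, here is that budget written out as a small Python table -- the family names are just my own labels, nothing Finale or Cubase cares about:

    # My 8-port template as a lookup table: 8 ports x 16 channels each.
    FAMILIES = {
        1: 'ETC - ancillary instruments',
        2: 'Winds',
        3: 'Brass',
        4: 'Standard Percussion',
        5: 'Keyboards / Harp',
        6: 'Solo Voices & Chorus',
        7: 'Strings (Symphonic + Chamber for divisi)',
        8: 'Solo Strings',
    }

    print(len(FAMILIES) * 16, 'MIDI channels total')   # 8 * 16 = 128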

So, say I want to write a piece for solo alto flute and I decide that the instrument is going to sit on MIDI port 1, channel 1.

I start Cubase and load up an alto flute VST instrument, telling the VST to receive MIDI data from MIDI port 1, channel 1. I then open up Finale, create a new staff called "Alto Flute," and tell it to send data on MIDI port 1, channel 1. I type some notes into Finale, and when I hit play, Finale sends the MIDI data in real time via the LoopBe MIDI cables to Cubase, which renders it as sound in real time.
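If you want to see what that looks like on the wire, here is a little mido sketch that plays the same role as Finale hitting play. The port name below is made up (use whatever LoopBe30 calls port 1 on your machine), and note that mido counts channels 0-15, so MIDI channel 1 is channel=0:

    import time
    import mido

    # Send one note to LoopBe port 1, channel 1 -- if the Cubase alto
    # flute VST is listening there, you should hear it speak.
    out = mido.open_output('01. Internal MIDI')   # hypothetical LoopBe30 port name
    out.send(mido.Message('note_on', channel=0, note=60, velocity=80))   # middle C
    time.sleep(1.0)                               # hold the note for a second
    out.send(mido.Message('note_off', channel=0, note=60))
    out.close()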

Do this a hundred times with all the different orchestral instruments and MIDI channels and you have an orchestral template where you can enter any note on a staff in Finale and it will play back through the DAW as the specific named instrument.

WHAT ELSE DO I NEED TO DO?
Since Finale is essentially translating written notation into MIDI data for Cubase to execute, you will need to create special instructions, in the form of "expressions," for any MIDI activities that go beyond pitch and duration.

One of the most important sets of expressions is dynamics. If you are familiar with most modern VST libraries, you know that they often have more than one way to use MIDI data to control dynamics; most commonly these are CC#1 and CC#11. Because CC#1 is the most common, I set Finale's Human Playback to interpret all dynamics as CC#1 by default. However, since I also want to control CC#11, I have made a separate set of dynamics that only change CC#11. I use the same "pp" and "ff" graphics, but I make them somewhat smaller and "hidden" (so they will not print on the score).
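On the wire, each of those expressions boils down to a single controller message. Here is a sketch of the two layers side by side; the CC values are illustrative, since Human Playback decides the actual numbers it sends for each dynamic:

    import mido

    out = mido.open_output('01. Internal MIDI')   # hypothetical LoopBe30 port name
    # The printed "ff" -> CC#1 (modwheel), the default Human Playback layer.
    out.send(mido.Message('control_change', channel=0, control=1, value=110))
    # The hidden, smaller "ff" -> CC#11 (expression), my second layer.
    out.send(mido.Message('control_change', channel=0, control=11, value=90))
    out.close()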

The other major category of expressions you will need to create are for articulation changes. Most libraries use keyswitches, but an extremely frustrating part of many keyswitch instruments is that they are all slightly different. This makes it very hard to create a simple set of text expressions to control all orchestral instruments.

THIS is why I love Spitfire's UACC (their effort to standardize the data for articulation changes so that it is the same, at least across instrument families). Using UACC means that I can create a single text expression ("legato"), and whether I put it on the flute staff or the bassoon staff (or even the cello staff!), Cubase will switch the VST to the legato articulation. As with the CC#11 dynamics, I make these articulation-switching expressions smaller and "hidden."
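In MIDI terms, UACC rides on a single controller (CC#32): one value per articulation, the same value across instruments. Roughly like this -- legato = 1 is the UACC value I rely on; the other values here are placeholders, so check Spitfire's UACC chart for the real table:

    import mido

    UACC_CC = 32                                  # the controller UACC listens on
    ART = {'legato': 1, 'long': 20, 'short': 70}  # only 'legato' verified; others assumed

    # The hidden "legato" expression becomes this one message, valid on
    # any Spitfire staff -- flute, bassoon, or cello alike.
    out = mido.open_output('01. Internal MIDI')   # hypothetical LoopBe30 port name
    out.send(mido.Message('control_change', channel=0,
                          control=UACC_CC, value=ART['legato']))
    out.close()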

Replies to This Discussion

Hi John,

I don't want to derail your thread by talking about reverbs and mixing. Suffice it to say that I use reverbs of no longer than 2-2.5 sec. Pre-delay also plays a big part in creating depth (the lower the delay, the further away the sound, and vice versa), and I'd recommend you try experimenting with it along with a reduced decay time. It's always good to experiment and listen/compare to good recordings... darn, I talked about reverbs... ;-)

mikehewer.com

Thanks, Mike.  Good info.  I will continue to fiddle with that and the close mics.  Don't worry about derailing this thread--I consider it more like life support...  :D

Don't you think reverb is not only a question of personal taste but also the application?  i.e., more reverb in film scores than recordings of concert works?

Presonus has focused on integrating their notation software (Notion) with their DAW (Studio One), and you can go back and forth from one to the other. Also, once you have a Notion license (which costs a fraction of Finale), you can install it on your laptop as well as your iPad and iPhone. It has handwriting recognition, so if you have a musical idea you can quickly write it on your phone, then edit it later on your laptop. I haven't used Studio One yet, so I haven't experimented with the integration, though I am planning to do so soon. I've seen very good reviews about how they work together. I'm short of time right now, or I would dig up those reviews to post here (sorry).

On the other hand, I don't know whether Notion can do all the things you can do with Finale. Maybe there are Notion experts here?

I dunno John, I have classical recordings that have over-egged, processed reverb to make the sound bigger (and yet it just diffuses the sound too much), and there have been plenty of examples of comparatively dry scores over the years. It has to be appropriate for the music first and foremost, imo. Film, especially, is a medium where the reality of acoustic space can be warped for psychological reasons, so reverb can also be considered a creative tool.

It is all too tempting to bathe samples in reverb, sometimes to compensate for them not being real.
