BandLab – A DAW Overview
1)
a. Tracks – Your song comes together track by track. In this case, six different tracks are shown,
labelled E-Bass, Lofi Drums 10, Lofi FX 06, Lofi Guitar 04, Lofi Keys 09 and Rainbell Tambourine.
b. Instrument track – In any project, you can use either instrument or audio tracks. In this
particular case, the only instrument track contains a virtual instrument, which I played and
recorded manually using a MIDI keyboard.
c. Audio track – The other type of track is an audio track. Its information is displayed in the
corresponding audio region as a graphical waveform [h]. Note that the audio tracks in this project
were created automatically by dropping loops onto the main arrangement window and are
therefore named after them.
d. Main arrangement window – This is where the main elements of the song are shown, where
you can see the information (notes, sound waves, etc.) as data that corresponds to the tracks
on the left.
e. Timeline – As the song plays, playback moves from left to right (as indicated by the playhead /
edit cursor [f]) through the bars of the song, which are shown as numbers. Here, a cycle (loop) of
4 bars is set (see cycle function [k]).
g. Instrument region – The visual representation of the track’s content. Its MIDI note data is
shown as small grey bars. Note that this region is selected (white outline).
h. Audio region – The visual representation of the track’s content. Its data is displayed as a
graphical waveform.
i. Drop area – Here, you can drag and drop either a loop or an audio/MIDI file. This creates a
new track of the matching type, containing the dropped content.
j. Undo/redo buttons
k. Cycle function – If a cycle is set and activated in the timeline [e], the playhead never leaves the
marked area, which repeats over and over. This is very convenient when, for instance, you
are building a hip-hop beat or want to work on a specific section of a song.
l. Slice tool – Allows you to split a selected region at the playhead, thus creating two new,
smaller slices.
m. Transport controls – You can move the playhead [f] by pressing various transport controls to
play, record, rewind or fast-forward. The small window to the right [n] displays your position
in the song either in minutes and seconds or in bars and beats.
n. Song position – This small window displays your position in the song either in bars and beats
or, as here, in minutes and seconds (a worked example of the conversion follows this list).
o. Song settings – In this box, you can set the key, tempo and time signature of your piece.
p. Metronome – Provides a short and steady clicking or tapping sound to define and keep the
tempo. The metronome can be set to only be active during the count-off/count-in.
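The song-position display [n], the song settings [o] and the metronome [p] are tied together by simple arithmetic: at a given tempo, one beat lasts 60/BPM seconds, which is also the interval between metronome clicks. Here is a minimal sketch of the conversion in plain Python (an illustration only, not BandLab's own code; the function names are made up for this example):

def beat_duration(bpm: float) -> float:
    """Length of one beat in seconds; also the interval between metronome clicks."""
    return 60.0 / bpm

def bars_beats_to_seconds(bar: int, beat: int, bpm: float, beats_per_bar: int = 4) -> float:
    """Convert a 1-based bar/beat position into a time in seconds (assumes 4/4 by default)."""
    beats_elapsed = (bar - 1) * beats_per_bar + (beat - 1)
    return beats_elapsed * beat_duration(bpm)

# Example: bar 5, beat 1 at 90 BPM in 4/4 is 16 beats in, i.e. about 10.67 seconds.
print(bars_beats_to_seconds(5, 1, 90))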
2)
a. Track automation – Automation can be used to modulate parameters automatically, such as a
pan sweep or volume change, over time throughout a track of your song. Automation is a useful
tool for adding variation and extra detail to a project. Automation settings can either be
configured manually or recorded into the arrangement of a track, after which they adjust the
parameter automatically (see the sketch after this list).
b. Selected instrument track – Often, tracks and regions need to be selected before their content
or settings can be edited.
d. Mute switch – Mutes the sound of a track, keeping the audio from passing through to the
output. In this example, it is switched on.
e. Solo switch – Temporarily silences all channels besides the selected one(s). Only soloed sound
is heard. On our specific track, it has been activated as well.
f. Pan pot – Pan pots (short for Panoramic Potentiometers) are knobs used to send a signal to
the left or right channel (or speakers), both channels equally, or a portion of each. Pan pots
are used to create a stereo image when using two or more speakers.
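To make the automation [a] and panning [f] ideas concrete, here is a minimal sketch (assuming numpy; an illustration only, not BandLab's own code) that applies a volume ramp and a constant-power pan sweep to a mono signal over time, producing a stereo result:

import numpy as np

def automate_volume_and_pan(mono: np.ndarray) -> np.ndarray:
    """Apply a linear volume ramp and a left-to-right constant-power pan sweep.

    `mono` is a 1-D array of samples; the result is an (N, 2) stereo array."""
    n = len(mono)
    volume = np.linspace(0.5, 1.0, n)      # automation curve: volume rises from 50% to 100%
    pan = np.linspace(0.0, 1.0, n)         # 0.0 = hard left, 1.0 = hard right
    angle = pan * np.pi / 2                # constant-power (equal-power) pan law
    left = mono * volume * np.cos(angle)
    right = mono * volume * np.sin(angle)
    return np.stack([left, right], axis=1)

# Example: a one-second 440 Hz tone at 44.1 kHz that sweeps from left to right.
sr = 44100
t = np.arange(sr) / sr
stereo = automate_volume_and_pan(0.3 * np.sin(2 * np.pi * 440 * t))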
3)
Instrument settings – In this panel, you can configure the virtual instrument of a selected
instrument track as well as your MIDI input device. You can choose from a large bank of
acoustic and electronic sounds, ranging from violin to percussion to synthesizer lead. The
software music keyboard can be used to play and record notes or as visual feedback for an
external input device (such as your computer keyboard or a MIDI keyboard).
When the selected track or region contains audio data, the settings panel also changes:
Source settings – Here, you can specify the input device (mostly a microphone or an audio
interface) and channel for your audio recording. The central window gives you visual feedback
for your setup.
4)
Effects panel – This is where you can add effects to your track in order to shape its sound.
a. Effect presets – The preset drop-down menu allows you to choose from pre-configured sets
of effects, which you then can change and adapt to your needs.
b. Loaded effects – This window displays which effects are currently loaded onto the track and
allows you to edit them. You may of course add (additional) effects manually.
In our example, the effects applied to the track include distortion, a compressor, an equalizer,
as well as an audio filter. Check the Music Production Glossary for a more extensive list of
effects and their descriptions.
5)
MIDI Editor – The MIDI editor shows the notes in a MIDI region as coloured bars in a time grid.
A bar's horizontal position shows its position in time, while its vertical position indicates its
pitch. A keyboard along the left edge of the editor provides a reference for the note pitches.
Recorded MIDI notes can be changed manually, e.g. by dragging them to a different position or
adjusting their duration. The MIDI inspector, to the left of the notes area, includes controls for
editing MIDI regions and notes.
a. Quantisation – A means of taking MIDI or audio data and shifting it so that it lines up with
user-defined subdivisions of a musical bar. Your recorded performance will thus be ‘on the
grid’ and in time. Quantisation is useful for correcting MIDI or audio that was recorded with
imprecise timing, but over-quantisation can remove the human feel from a performance.
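As a minimal sketch of the idea (plain Python, an illustration only, not BandLab's own code), quantisation simply snaps each note onset to the nearest grid step:

def quantise(onsets_in_beats, grid=0.25):
    """Snap note onsets (measured in beats) to the nearest grid step.

    grid=0.25 corresponds to sixteenth notes when the beat is a quarter note."""
    return [round(onset / grid) * grid for onset in onsets_in_beats]

# Slightly early or late sixteenth-note hits get pulled onto the grid:
print(quantise([0.02, 0.27, 0.49, 0.76]))   # [0.0, 0.25, 0.5, 0.75]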
Again, as for the instrument settings panel, the editor panel is completely different when you
select an audio track or region:
(Audio) Editor – The audio editor shows a close-up view of part of an audio track, displaying
the audio waveform of the track’s regions in a time grid.
In the audio editor, you can move and trim audio regions, split (slice) and join (merge) them,
as well as edit them in other ways. Just like the MIDI editor, you can scroll and zoom in for a
more detailed view. Edits you make in the audio editor are non-destructive, so you can always
return to your original recordings.
a. Pitch shift – The pitch shift option allows you to transpose audio regions, i.e. to shift their
pitch and thus their key. Audio regions can be transposed up or down by up to 12 semitones
(one octave, the interval between two notes with the same name); see the sketch after this list.
b. Fade slider – In order to add a fade in or fade out to an audio region or to adjust its length,
drag the nodes in the region’s upper corners to the desired position.
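Two minimal sketches of the arithmetic behind these edits (assuming numpy; an illustration only, not BandLab's own code): transposing by n semitones corresponds to a frequency ratio of 2 to the power n/12, and a fade is just a gain ramp multiplied into the samples.

import numpy as np

def semitone_ratio(semitones: int) -> float:
    """Frequency ratio for a transposition; +12 semitones doubles the frequency (one octave)."""
    return 2.0 ** (semitones / 12.0)

def apply_fades(samples: np.ndarray, sr: int, fade_in_s: float, fade_out_s: float) -> np.ndarray:
    """Apply linear fade-in and fade-out gain ramps to an audio region."""
    out = samples.astype(float).copy()
    n_in, n_out = int(fade_in_s * sr), int(fade_out_s * sr)
    if n_in:
        out[:n_in] *= np.linspace(0.0, 1.0, n_in)     # fade in: gain rises from 0 to 1
    if n_out:
        out[-n_out:] *= np.linspace(1.0, 0.0, n_out)  # fade out: gain falls from 1 to 0
    return out

print(semitone_ratio(12))   # 2.0, i.e. one octave up
print(semitone_ratio(-5))   # about 0.749, i.e. a perfect fourth down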
Music Production Glossary
Digital audio workstation (DAW) – Electronic device or application software used for
recording, editing and producing music, songs, speech, radio, television, soundtracks,
podcasts, sound effects and nearly any other situation where complex recorded audio is
needed. DAWs come in a wide variety of configurations from a single software program on a
laptop all the way to a highly complex configuration of numerous components controlled by
a central computer. Regardless of configuration, modern DAWs have a central interface that
allows the user to alter and mix multiple recordings and tracks into a final produced piece.
Popular software DAWs include GarageBand, Logic Pro, Ableton Live, Pro Tools, Cubase and
PreSonus Studio One.
MIDI (Musical Instrument Digital Interface) is a protocol developed in the 1980s allowing
electronic instruments and other digital musical tools to communicate with each other. MIDI
can be used within a single machine like a computer, or to transmit data between several
devices, such as a MIDI controller or a drum machine and a computer. MIDI data itself doesn’t
make any sound; it consists of a series of messages (such as "note on", "note off", "note/pitch",
"pitch bend") that are interpreted by a MIDI instrument to produce sound.
While the term MIDI generally refers to the notes and other data recorded when using
software instruments, it’s also used to designate the virtual instruments themselves. For
example, a software piano is also known as a MIDI piano, and the notes it records in your DAW
are known as MIDI notes.
MIDI notes are a prime example of MIDI data on BandLab.
Channel – A pathway through an audio device. For example, sound mixers have multiple input
channels and output channels. In the context of MIDI, Channel refers to one of 16 possible
data channels over which MIDI data may be sent. The organisation of data by channels means
that up to 16 different MIDI instruments or parts may be addressed using a single cable.
Track – Either means an individual audio channel in the production process (see page 1) or a
full piece of digital music.
Mono – The opposite of stereo. A sound that has one source, rather than two.
Stereo – The opposite of mono. A sound that has two sources, rather than one. Creates the
illusion of horizontal space in recordings.
Fade (in, out) – The increase and decrease of volume at the beginning and end of a sound or
a song.
Audio effects – A treatment applied to an audio signal in order to change or enhance the way
it sounds. Audio effects can come in the form of electronic devices such as guitar pedals, as
well as digital processing units found within a DAW. Unprocessed audio is metaphorically
referred to as dry, while processed audio is referred to as wet.
Equalization (EQ) – A sound processor that can boost or reduce certain frequencies in a sound.
Reverb – Short for reverberation. Reverb can be described as a time-based effect featuring a
series of echoes rapidly occurring one after the other. Simply put, it is the sound reflected off
a room or environment after an initial sound has been produced inside it. If more reverb is
desired, it can be added to a recording digitally via a reverb plugin.
Delay (or echo) – A processor that creates copies of a sound source that repeat over and over,
fading slowly. Commonly used with vocals and electric guitar.
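A minimal feedback-delay sketch (assuming numpy; an illustration only, not BandLab's own implementation): each repeat is a delayed copy fed back at a reduced level, so the echoes fade away gradually.

import numpy as np

def feedback_delay(dry: np.ndarray, sr: int, delay_s: float = 0.3,
                   feedback: float = 0.5, tail_repeats: int = 6) -> np.ndarray:
    """Mix delayed, progressively quieter copies of the signal back into itself."""
    d = int(delay_s * sr)
    out = np.concatenate([dry.astype(float), np.zeros(d * tail_repeats)])
    # Block-wise form of out[i] += feedback * out[i - d]: every block of `d` samples
    # receives a quieter copy of the block one delay time earlier.
    for start in range(d, len(out), d):
        end = min(start + d, len(out))
        out[start:end] += feedback * out[start - d:end - d]
    return out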
Compression – Reducing a signal’s output volume in relation to its input volume to reduce its
dynamic range. Basically, when a sound gets louder than a certain level, a compressor turns
the sound down somewhat. This controls the dynamics of that sound to make it more
consistent.
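A minimal sketch of the gain computation inside a compressor (assuming numpy; an illustration only, without attack or release smoothing, and not BandLab's own code): any part of the signal that exceeds the threshold is turned down according to the ratio.

import numpy as np

def compress(samples: np.ndarray, threshold_db: float = -18.0, ratio: float = 4.0) -> np.ndarray:
    """Levels above the threshold grow `ratio` times more slowly, reducing dynamic range."""
    eps = 1e-12                                               # avoid log10(0)
    level_db = 20 * np.log10(np.abs(samples) + eps)           # instantaneous level in dB
    over_db = np.maximum(level_db - threshold_db, 0.0)        # how far above the threshold
    gain_db = -over_db * (1.0 - 1.0 / ratio)                  # turn only the excess down
    return samples * 10 ** (gain_db / 20)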
Overdrive (or drive) – Usually refers to the type of distortion that occurs when an amplifier is
overloaded. Commonly used to describe guitar amp distortion. Overdrive is considered to be
“creamier” than the harshness of digital distortion.
Chorus – A sound processor that makes a sound seem doubled by creating several delayed
copies of the original sound and slightly varying the pitch of each copy. Used to “thicken” a
sound.
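A minimal chorus sketch (assuming numpy; an illustration only, not BandLab's own code): the dry signal is mixed with a copy read through a slowly modulated delay, which also wobbles the copy's pitch slightly.

import numpy as np

def chorus(dry: np.ndarray, sr: int, base_delay_ms: float = 20.0,
           depth_ms: float = 5.0, rate_hz: float = 0.8) -> np.ndarray:
    """Mix the dry signal with a delayed copy whose delay time is swept by a slow LFO."""
    n = len(dry)
    t = np.arange(n)
    delay_samples = (base_delay_ms + depth_ms * np.sin(2 * np.pi * rate_hz * t / sr)) * sr / 1000.0
    read_pos = np.clip(t - delay_samples, 0, n - 1)   # fractional positions behind the playhead
    wet = np.interp(read_pos, t, dry)                 # linear interpolation between samples
    return 0.7 * dry + 0.7 * wet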
Tremolo – A sound processor that either quickly turns the volume of a sound up and down,
or quickly pans it left to right.
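A minimal tremolo sketch (assuming numpy; an illustration only, not BandLab's own code): a low-frequency oscillator turns the volume up and down a few times per second.

import numpy as np

def tremolo(samples: np.ndarray, sr: int, rate_hz: float = 5.0, depth: float = 0.5) -> np.ndarray:
    """Modulate the volume with a low-frequency sine oscillator (LFO)."""
    t = np.arange(len(samples)) / sr
    lfo = 1.0 - depth * (0.5 + 0.5 * np.sin(2 * np.pi * rate_hz * t))  # swings between 1-depth and 1
    return samples * lfo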
Flanger – Uses the same process as chorus, but with dramatically short delays. Rather than
“thickening” a sound, a flanger is usually less subtle. It’s been described as sounding “like an
airplane flying right over your head.”
Phaser – A sound processor that cancels out certain frequencies by creating a copy of the sound
wave and sweeping its phase back and forth against the original, causing a “phasing” sound.
Filter – A feature of an EQ that cuts the sound of the low end or the high end of the frequency
spectrum, or both. These are known as high-pass filter (HPF), low-pass filter (LPF), and band-
pass filter (BPF) respectively.
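A minimal high-pass filter sketch (assuming numpy and scipy; an illustration only, not BandLab's own code): content below the cutoff frequency is attenuated while the high end passes through.

import numpy as np
from scipy.signal import butter, sosfilt

def high_pass(samples: np.ndarray, sr: int, cutoff_hz: float = 120.0) -> np.ndarray:
    """Attenuate content below `cutoff_hz`, e.g. to clean low rumble out of a vocal."""
    sos = butter(4, cutoff_hz, btype='highpass', fs=sr, output='sos')
    return sosfilt(sos, samples)

# btype='lowpass' gives the LPF instead; a band-pass needs a pair of cutoff
# frequencies, e.g. butter(4, [200.0, 2000.0], btype='bandpass', fs=sr, output='sos').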