Complete Course Notes of 205 Lighting in Video
The Viewfinder
As technology progresses, tech gets smaller and cheaper – and
camcorders are no exception. When battling on price point – trying to make
the cheapest possible product – manufacturers start looking at where they
can cut costs. With camcorders, this led to viewfinders no longer being included, with manufacturers relying on LCD screens instead.
Don’t make this mistake. LCDs are great, but they have serious limitations.
If you’re filming outdoors in sunny conditions, forget about trying to use
your LCD, and regardless of conditions, it’s difficult to keep your camera
steady when using the screen instead of a viewfinder. Holding a camera
steady against your eye is just much easier, to say nothing of the way that
LCD screens gobble up battery power.
That’s not to say that LCD screens are bad; they offer some serious advantages. Reviewing footage, navigating menus, and shooting at different angles are all easier with a screen.
So get the best of both worlds, and pick up a model with both.
Image Stabilization
Even when shooting through a viewfinder, you’re not going to be a stable
filming platform; shaky hands are an inevitability. Image stabilizers are
meant to counteract this issue. There are two primary types of stabilizers: optical and digital.
An optical image stabilizer is built right into the camera’s lens; its moving elements counteract the myriad small movements that handheld filming produces. Digital stabilizers, on the other hand, work by cropping and re-centering your image during recording, which tends to reduce your resolution.
This is a tricky situation, as image stabilizers can vary greatly in efficacy,
even in top end models. This is where it pays to do your homework; read
user reviews, and figure out which image stabilizers are getting people real
results.
Optical Zoom
“Mr. DeMille, I’m ready for my close-up.” Not if you’re relying on digital zoom, you’re not.
Camcorders and cameras in general have two kinds of zoom: digital and optical. Digital zoom is like enlarging a web page; it makes the image bigger, sure, but also blocky, fuzzy, and distorted. Optical zoom, on the other hand, works the way an SLR lens does: the lens elements physically move to magnify and focus the image.
There’s no debate; if you’re going to zoom, you want to use an optical
zoom if you have any desire to see what you’re filming. Which is kind of the
point of shooting in the first place, right?
Manual Controls
Your camcorder has default settings for how you capture video, and those
are great. But you’re going to want to change them now and then, and for
that, you’ll need manual controls. Focus, shutter speed, exposure, and
especially white balance – these are integral to getting a clear, crisp image.
White balance is necessary for capturing colour accurately, and adjusting exposure and shutter speed lets you control how much light reaches the sensor – something that is incredibly useful when the camera can’t figure it out on its own.
Without manual controls, you’re stuck with whatever your camera can
come up with on its own; with them, you can always ensure that you’re able
to film.
Aperture
Lens
Shutter Release Button
Memory Card
Viewfinder
User Controls
To take a picture, you press the Shutter Release Button. When the button is pressed, the Exposure Meter determines the appropriate lens setting, and the shutter opens and closes for just long enough to let the right amount of light enter.
Memory Card in Camera
Memory Cards come in different physical sizes and storage capacities. Some are tiny (the Micro SD Card) yet hold huge amounts of data; capacities now range from gigabytes up to terabytes. Memory Cards should be kept in a memory card case to protect the data, since dust and weather conditions can damage a card.
Viewfinder in Camera
The Viewfinder is the component that displays the image to be shot. The photographer holds the camera to their eye and frames and focuses the shot through the Viewfinder. Some cameras have an LCD screen, and more often than not the LCD acts as the Viewfinder. Getting a good, sharp photograph depends on factors like the camera’s megapixels, the light falling on the subject, and the quality and type of lens.
Parts of a Camera
1. Aperture
Aperture is the adjustable opening at the front of the camera, located in the lens. On an interchangeable-lens camera you have the option to change the lens, so you have more options with the aperture. On a standard point-and-shoot or bridge camera the lens is fixed, so the options are limited. You vary the lens aperture from the camera body.
2. Shutter
The shutter is another vital part of a camera. It controls the time duration for
which the image sensor is exposed to the light.
Most digital cameras come with a combination of electronic and mechanical shutter, or a mechanical shutter alone.
Every digital camera is designed for a specific shutter life, also known as the camera shutter count. Reliable shutter operation is guaranteed only up to this value; top-end models come with a higher rated shutter count.
3. Image Sensor- The Most Important Part of a Camera
It is the image sensor that decides the image resolution, so it is like the heart of the camera.
In the early days, film did this job; it has since been replaced by CCD and CMOS sensors.
They are responsible for acquiring each of the pixels in an image. An image
sensor is quantified based on its size and number of megapixels.
4. Viewfinder
The viewfinder is the small rectangular opening, seen on top of the camera.
You can see through this window to compose and frame the shot.
Digital cameras either have an optical viewfinder or an electronic
viewfinder. The viewfinder also shows parameters like exposure, shutter
speed, Aperture, ISO, and a few other basic settings for image capture.
5. Digital LCD Display
All digital cameras will have an LCD to view images and to set the different
parameters and modes.
It is the visual interface that helps the photographer adjust the camera settings to their liking. It is on the back side of the camera.
Some high-end models come with dual displays. The secondary display will
be on the top side.
6. Button Interface
You can find many buttons that are configured to do certain operations on
the backside and top of the body. Some cameras allow you to configure
some buttons according to your choice.
7. Inbuilt Flash
More than 90% of digital cameras come with an inbuilt flash, on the top side. It pops up only when you enable flash in the settings or raise it manually.
You will not get the same performance as an external flash. It will also
consume a good amount of battery power, especially for point and shoot
ones.
8. Shutter Trigger
The shutter trigger is a tactile push-button switch with a two-stage press. The first stage, referred to as the half press, acquires focus on the subject.
The second stage, the full press, activates the shutter mechanism.
Some cameras allow you to separate the half-press autofocus function from this button by assigning focus to a button on the back; the shutter button then activates the shutter alone.
The shutter trigger button is located on the top right-hand side of the
camera for usability.
9. Mode Dial
The Mode dial is another part of a camera used to change different modes.
Some of the standard Modes include Aperture mode, Shutter mode,
Manual mode, and Auto mode. It is located on the top side.
10. Hotshoe
The hotshoe is another integral part of a digital camera, found on the top side of most bodies.
It is mainly for mounting an external flash, but you can also use it to mount wireless triggers, an external microphone, or a spirit level.
This Hotshoe mount varies for different camera manufacturers. So, you
cannot use one model of external flash on all bodies.
11. Communication Ports
Communication ports are usually on either side of the camera. USB is the most common type of communication port, present in all models. It is used for image transfer from the camera to the computer.
Other communication interfaces include HDMI port, Audio port, Ethernet,
Wired remote trigger port, and Display port. These ports may not be
present in all models.
Bluetooth, Wifi, and NFC are some of the wireless communication
interfaces supported by a camera. You need to refer to the camera manual
to check the different types of communication interfaces.
12. Recording Medium
In digital cameras, the memory card is the photo storage medium. The type
of memory card varies with different types of cameras. There will be a card
slot located on the side or bottom to insert the memory card. Some
cameras come with dual memory card slots.
The SD card is the most commonly supported memory card in digital cameras. Compact Flash, Micro SD, XQD, and CFast cards are some of the other memory cards used in DSLR and mirrorless digital cameras.
13. Battery
All digital cameras need a battery for their operation. The type of battery varies with the camera type. Most cameras use lightweight rechargeable lithium-polymer batteries, usually a custom pack supplied along with the digital camera.
Some point and shoot models use alkaline batteries. The battery
compartment is usually at the bottom or side of the camera.
14. Tripod Mount
All Digital cameras will come with a tripod mount, located at the bottom
side. It allows you to mount the camera on a tripod.
Most cameras have a 1/4"-20 UNC thread; some come with a 3/8"-16 UNC thread. Check the manual to confirm the right tripod thread size.
ISO
ISO is your camera's sensitivity to light as it pertains to either film
or a digital sensor. A lower ISO value means less sensitivity to
light, while a higher ISO means more sensitivity.
Broadcast Standard:
The major analog TV standards are NTSC, PAL and SECAM. The video
signals consist of one luma signal and two chroma signals. Luma contains
information about black and white video. Chroma contains additional
information for black and white video to be converted to color video.
In India, the PAL video format is used. NTSC is the video standard
commonly used in North America and most of South America. PAL is the
video standard which is popular in most of the European and Asian
countries.
Television Standards/Broadcast Standards
There are a number of TV Standards worldwide. Not all television sets in
the world are alike. Countries use one of the three main video standards –
PAL, NTSC or SECAM. What this means is that a video from a PAL
country will not play in a country that uses the NTSC standard.
Frames
Before we dive deep into the various TV Standards we shall take a look at
a few basics of TV transmission. A television transmission consists of a set
of rapidly changing pictures to provide an illusion of continuous moving
picture to the viewer. The pictures need to come at a rate of 20 pictures per
second to create this illusion. Each of these "rapidly changing" pictures is a
frame. A typical TV transmission is at 25-30 frames per second (fps).
Lines
Each frame consists of several closely spaced lines, scanned from left to right and from top to bottom. A typical TV picture consists of 525 to 625 lines. With this many lines, if they were all written one after another the picture would begin to fade at the top by the time the last line was written. To avoid this, each frame is split into two fields: the first field carries the odd-numbered lines and the next carries the even-numbered lines. This keeps the picture uniform, and the technique is called interlacing.
Timing
TV receivers require a source to time the rapid succession of frames on the
screen. Designers decided to use the Mains power supply frequency as
this source for two good reasons. The first was that with the older type of
power supply, you would get rolling hum bars on the TV picture if the frame rate and the mains frequency were not exactly the same. The
second was that the TV studio lights or for that matter all fluorescent, non
incandescent lights flicker at the mains frequency. Since this flicker is much
higher than 16 times per second the eye does not detect it. However this
flicker could evolve into an extremely pronounced low frequency flicker on
TV screens due to a "beat" frequency generated between the light flicker
and the mains frequency. This would have made programmes un-viewable
particularly in the early days of development of TV receivers.
The two mains power frequencies worldwide are 50Hz and 60Hz. This
meant that there was an immediate division in the TV standards - the one
with 25 frames per second (50 Hz) and 30 frames per second (60 Hz).
Most of the compatibility problems between TV standards across the world
stem from this basic difference in frequencies.
NTSC (National Television System Committee)
The majority of 60Hz-based countries use a technique known as NTSC, originally developed in the United States by a standards body called the National Television System Committee. NTSC (often jokingly referred to
as Never Twice the Same Colour) works perfectly in a video or closed
circuit environment but can exhibit problems of varying colour when used in
a broadcast environment.
PAL (Phase Alternating Line)
This hue change problem is caused by shifts in the colour sub-carrier
phase of the signal. A modified version of NTSC soon appeared which
differed mainly in that the sub-carrier phase was reversed on each second
line; this is known as PAL, standing for Phase Alternating Line (it has a wide range of joking expansions, including Pictures At Last, Pay for Added Luxury, etc.). PAL has been adopted by a few 60Hz countries, most notably Brazil.
SECAM
Amongst the countries based on 50Hz systems, PAL has been the most
widely adopted. PAL is not the only colour system in widespread use with
50Hz; the French designed a system of their own -primarily for political
reasons to protect their domestic manufacturing companies - which is
known as SECAM, standing for Séquentiel Couleur à Mémoire. The most common facetious expansion is System Essentially Contrary to American Method.
SECAM ON PAL
Some satellite TV transmissions available over India (usually Russian) are in SECAM. Since the frame rate (25 frames/sec) and scan rates are identical, a SECAM signal will replay in B&W on a PAL TV and vice versa.
However, transmission frequencies and encoding differences make
equipment incompatible from a broadcast viewpoint. For the same reason,
system converters between PAL and SECAM, while often difficult to find,
are reasonably cheap. In Europe, a few Direct Satellite Broadcasting
services use a system called D-MAC. Its use is not wide-spread at present
and it is trans-coded to PAL or SECAM to permit video recording of its
signals. It includes features for 16:9 (widescreen) aspect ratio
transmissions and an eventual migration path to Europe's proposed HDTV
standard. There are other MAC-based standards in use around the world
including B-MAC in Australia and B-MAC60 on some private networks in
the USA. There is also a second European variant called D2-MAC which
supports additional audio channels making transmitted signals
incompatible, but not baseband signals.
Quick Facts:
NTSC and PAL are video standards recorded on the cassette. These videos send an electronic signal to the television; only then can they be viewed.
In India, the PAL video format is used.
NTSC is the video standard commonly used in North America and most
of South America.
PAL is the video standard which is popular in most of the European and
Asian countries.
The difference between NTSC and PAL is the number of frames transmitted per second. In NTSC, 30 frames are transmitted per second, and each frame is made up of 525 scan lines.
In PAL, 25 frames are transmitted per second. Each frame consists of
625 scan lines.
Second, the mains power frequency used with NTSC is 60 Hz, while with PAL it is 50 Hz.
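The quick facts above can be collected into a small lookup table. The sketch below (the table name and helper function are illustrative, with the values taken from these notes) also shows why a SECAM signal replays in black and white on a PAL set: the frame and line rates match, even though the colour encoding differs.

```python
# Key parameters of the three analog standards described above;
# values are the ones quoted in these notes.
STANDARDS = {
    "NTSC":  {"fps": 30, "lines": 525, "mains_hz": 60},
    "PAL":   {"fps": 25, "lines": 625, "mains_hz": 50},
    "SECAM": {"fps": 25, "lines": 625, "mains_hz": 50},
}

def compatible_bw(a: str, b: str) -> bool:
    """True when a signal from standard `a` will replay in black &
    white on a set built for standard `b`: the frame rate and line
    count must match (colour encoding still differs)."""
    sa, sb = STANDARDS[a], STANDARDS[b]
    return sa["fps"] == sb["fps"] and sa["lines"] == sb["lines"]

print(compatible_bw("SECAM", "PAL"))  # True
print(compatible_bw("NTSC", "PAL"))   # False
```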
2. Depth of Field
The depth of field (DoF) is the distance between the nearest and the furthest objects that appear acceptably sharp in an image captured with a camera. The distance between the camera and the first element that is considered acceptably sharp is called the DoF near limit.
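Under the standard thin-lens approximation, the near and far limits can be computed from the focal length, f-number, focus distance, and circle of confusion. A minimal sketch (the function name and the 0.03 mm circle-of-confusion default are illustrative assumptions, not from these notes):

```python
def dof_limits(f_mm, n, s_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness, using the
    standard thin-lens approximation. f_mm: focal length, n: f-number,
    s_mm: focus distance, coc_mm: circle of confusion (0.03 mm is a
    commonly quoted full-frame value)."""
    h = f_mm ** 2 / (n * coc_mm) + f_mm  # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return near, far

# A 50 mm lens at f/8 focused at 5 m: everything from roughly
# 3.4 m to 9.5 m falls within the depth of field.
near, far = dof_limits(50, 8, 5000)
```

Stopping down (a larger f-number) raises the hyperfocal distance's denominator less, widening the zone, which is why small apertures give a deep depth of field.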
4. Focal Length
It is not a measurement of the actual length of a lens, but a calculation
of an optical distance from the point where light rays converge to
form a sharp image of an object to the digital sensor or 35mm film
at the focal plane in the camera. The focal length of a lens is
determined when the lens is focused at infinity.
5. Aspect Ratio
Aspect ratio is the proportional relationship of the width of a video
image compared to its height. It is usually expressed as width:height
(separated by a colon), such as 16:9 or 4:3. The aspect ratio sets how wide
a video is formatted and affects how it will fit on your viewing screen.
The 16:9 aspect ratio can fit more information horizontally, while the 4:3 aspect ratio has more space vertically. Because of these characteristics, they're each used for different purposes: most videos use a 16:9 ratio, while 4:3 remains a common choice for photos.
What is film aspect ratio?
Aspect ratio is a numerical formula that describes the relationship of an
image’s width to its height. Written as two numbers separated by a colon, the first number denotes the image’s width and the second its height.
For example, an aspect ratio of 1.33:1 means the image's width is 1.33
times the size of its height. If you wanted to eliminate the decimals in this
ratio, it can be (and often is) written as 4:3 instead.
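The decimal-to-ratio conversion described above is easy to automate with Python's `fractions` module; `ratio_name` is a hypothetical helper name, not standard terminology.

```python
from fractions import Fraction

def ratio_name(value: float, max_den: int = 16) -> str:
    """Express a decimal aspect ratio as width:height integers by
    finding the closest fraction with a small denominator."""
    r = Fraction(value).limit_denominator(max_den)
    return f"{r.numerator}:{r.denominator}"

print(ratio_name(1.33))    # 4:3
print(ratio_name(1.7778))  # 16:9
```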
Discover four modern film and TV aspect ratios.
Although there have been many different aspect ratios throughout film and
television history, the four following ratios are the most common today.
Film cinematography aspect ratios:
1.85:1.
Similar to the 16:9 size but slightly wider, whatever you shoot in 1.85:1 will
show on widescreen TVs and computer monitors with thin black bars on
the top and bottom of the screen. Most feature films use this aspect ratio, but some high-end TV shows also shoot in 1.85:1.
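The thin black bars mentioned above can be computed directly: scale the content to the screen's width, then split the leftover height between top and bottom. A small sketch (the function name is illustrative):

```python
def letterbox_bars(screen_w, screen_h, content_ratio):
    """Height in pixels of the black bar at top and at bottom when
    content wider than the screen is letterboxed onto it."""
    content_h = round(screen_w / content_ratio)  # scaled content height
    return max(0, (screen_h - content_h) // 2)

# A 1.85:1 feature on a 1920x1080 (16:9) display leaves thin bars;
# the much wider 2.39:1 anamorphic format leaves thick ones.
print(letterbox_bars(1920, 1080, 1.85))  # 21
print(letterbox_bars(1920, 1080, 2.39))  # 138
```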
2.39:1.
This is known as the anamorphic widescreen format and is the widest
aspect ratio used in modern cinema. Premium dramatic features best
showcase its wide field of view and ability to capture broad, scenic
landscapes.
Television cinematography aspect ratios:
4:3 or 1.33:1.
Until widescreen HDTVs came on the scene, 4:3 was the normal
ratio for standard-definition television sets. Today, the 4:3 aspect
ratio primarily serves stylistic purposes — for example, giving off
the vintage vibe that was popular before widescreen aspect ratios
became the norm.
16:9.
This is the most used aspect ratio for display on standard size HDTV
widescreens and computer monitors. In addition, 16:9 is also used for most
video filmed for TV and the internet since other film aspect ratios tend to
give off a more cinematic look. In fact, outside of actual in-person movie
theaters, most viewers watch content on 16:9 screens. It’s also the
standard aspect ratio for YouTube content.
Why does aspect ratio matter in film?
The right aspect ratio can make a huge difference in not just how films and
TV are displayed — but also in how they attract viewers and create viral
buzz for the project. The concept is important to both independent
filmmakers and big studio directors.
Due to limited technology, early films could only be produced in a boxy,
almost square format but now advances in screen and camera equipment
offer so many options that filmmakers and video creators alike can ask
themselves which would work best for their films — the old school square
style? Something long and wide or taller and narrower? Learning about
older aspect ratios can help modern creators finesse their current projects.
4. Close-up
The close-up is often used to show a character from the top of the shoulders to the top
of the head. It’s used for capturing a character’s facial expression, heightening
emotions and building tension. It’s another great shot type for dialogue.
5. Extreme close-up
And lastly, we have an extreme close-up, when an object, item or body part fills the
frame, which is used for emphasis, showing detail and, once again, heightening
emotion. In this instance, the focus on the second hand of the clock suggests that time
will be an important factor in the sequence to follow.
Let’s now move on to camera angles.
6. High-angle
The shot below is at a high angle. Angles can use any of the framing types we’ve
discussed above, but the camera must be positioned at an angle looking down at the
subject. Generally, a high angle is used to make the subject within the frame seem
small, isolated, vulnerable or less powerful. The extremity of the angle can be altered,
often causing the desired effect to be more or less impactful. In this case, the high
angle is used to make the characters seem even more vulnerable.
7. Low angle
The low angle can also be used in combination with any camera shot type, but the
camera must be positioned down low at an angle looking up at the subject. Generally,
a low angle is used to make the subject within the frame seem large, imposing,
daunting or more powerful. The extremity of the angle can be altered, often causing
the desired effect to be more or less impactful. In this case, the low angle wide shot of
these trees makes them look dominant, reinforcing the power of nature.
As you experiment with shot types, framing and angles, you’ll be able to create some
really interesting combinations and think about adding in camera movement in too,
such as pans, tilts and even tracking shots.
Shot
In filmmaking and video production, a shot is a series of frames that runs
for an uninterrupted period of time. Film shots are an essential aspect of a
movie where angles, transitions and cuts are used to further express
emotion, ideas and movement.
Framing
Framing is about composing an image rather than just pointing the camera at the subject. Some considerations when you’re framing the shot are the relationships between characters in the shot – if there are more than one – the size of the subject, and the elements on the left and right sides of the frame. Your subject will appear largest in a close-up or choker shot.
Focus
The point of focus is the object in the frame that the filmmaker most wants to call attention to. The imaginary two-dimensional plane that extends from that point is referred to as the plane of focus. When you’re filming, any part of the image that falls on the plane of focus is relatively sharp.
Camera Shots
Camera Angles
The camera angle marks the specific location at which the movie
camera or video camera is placed to take a shot. A scene may be shot
from several camera angles simultaneously, and each angle gives the viewer a different experience and, sometimes, a different emotion.
A long shot captures the subject within a wide view of their surroundings.
This type of camera shot is commonly used to set the scene. It gives
viewers a sense of perspective as they can see how the subject relates to
their environment.
A closer version of the long shot is known as a full shot. In a full shot, the
subject fills the frame. This captures the subject’s general appearance,
while still showing the scenery surrounding them.
The medium shot is used to reveal more details on the subject, capturing
them from the waist up. As it includes the subject’s hands and part of their
surroundings, it’s the best way to capture actions in detail, while
maintaining a general view. This is why the medium shot is one of the most
popular types of shots.
There are two main variants of this shot: medium long shot and cowboy
shot. The medium long shot sits halfway between long and medium shots.
It frames the subject from the knees up. The cowboy shot, which cuts the
frame at mid-thigh, was widely used in western movies in order to show
gun holsters on cowboys’ hips.
The medium close-up shot frames the subject from the chest up. It is
generally used to capture enough detail on the subject’s face, while still
keeping them within their surroundings. During conversations, medium
close-up shots are used to keep some distance between the characters.
05. Close-up shot
A close-up shot tightly frames the subject’s face in order to focus on their
emotions. These types of shots are great to connect with the audience, as
there are no elements distracting them from the subject’s gestures and
reactions.
In an extreme close-up shot, a detail of the subject fills the whole frame. It
is used to emphasize certain features or actions. The most common use of
this shot will capture a character’s eyes, mouth, or fingers performing a
critical action.
A two shot includes two subjects in the frame. They don’t necessarily have
to be next to each other, nor given equal treatment. In many examples of a
two shot, one subject is placed in the foreground and the other, in the
background.
Bird’s-eye view is the name given to the type of shot taken from an
elevated point. As its own name indicates, it offers a perspective similar to
that which birds see while flying. This camera angle is used to magnify the
scale and movement.
High angle shot is taken pointing the camera down on the subject. As a
result, the subject is perceived as vulnerable and powerless. In this type of
shot, the camera angle can be anywhere from directly above the subject to
just above the subject’s line of sight.
A low angle shot is taken from below the subject’s eye line, pointing
upwards. This camera angle makes a subject look powerful and imposing.
This angle can create a visual distortion in types of shots closer to the
subject, as it’s not a common point-of-view. Because of this, a low angle is
commonly used with wider frames such as medium or medium close-up
shots.
An over the shoulder framing captures the subject from behind another
character. Typically, the shot will include the second character’s shoulder
and part of their head. This camera angle is primarily used during
conversations, as it maintains both characters in scene while focusing on
one at a time.
17. Dolly
In a dolly shot, the camera itself moves toward or away from the subject, usually on a wheeled platform, so the perspective changes naturally as it would for a moving viewer.
18. Truck
Truck shots are those in which the camera is attached to a device that
moves smoothly along a horizontal track. These shots are most commonly
used to follow an action or walk the audience around a scene. Because the
camera itself is moving, the result allows viewers to feel as if they are also
moving across the scene.
19. Pedestal
In a pedestal move, the whole camera travels vertically on its support, moving up or down while staying level, rather than tilting.
20. Roll
In a roll, the camera rotates on its lens axis, tipping the horizon for a disorienting or dramatic effect.
Pan
To pan, imagine your camera is your head turning from side to side. Step your speed up a notch, and you get the whip pan, which is handy for transitions showing the passing of time or travelling a distance dramatically or comically.
Tilt
To tilt, imagine your camera is your head nodding up and down.
Zoom
‘Zooming’ is probably the most commonly used camera movement; it lets you quickly move closer to the subject without physically moving. But be careful with these: digital zoom in particular lessens your image quality.
Tracking shot
A ‘tracking shot’ is one in which the camera moves alongside
what it’s recording. Tracking shots are sometimes called dolly
shots, but they can be differentiated by the direction they take.
Rules of Composition
As in visual arts, composition in photography is simply the arrangement of
visual elements within a frame. The term composition literally means
'putting together'. So, to get the perfect shot, the photographer has to
organize all objects in a scene.
COMPOSITION DEFINITION
What is composition?
Composition refers to the way elements of a scene are arranged in a
camera frame. Shot composition refers to the arrangement of visual
elements to convey an intended message.
Rule of thirds
Imagine a tic-tac-toe board over your frame – two lines running vertically, and two more running horizontally. As the camera frames your shot, keep key elements on or near the intersecting lines; it’s more pleasing to the eye. Different camera framing will also tell a different story, and it is an easy way to establish the character's place in the world.
The frame composition in Nightcrawler is, well, crawling with this rule. Lou
appears on the side of the frame, away from the world he exists in.
Gilroy's use of the rule of thirds isolates Lou, highlighting this as the main
theme. The rule of thirds can also be used with two characters.
The director's decisions to position Lou this way, showing only his profile,
creates an untrustworthy, distant, character.
Mastering frame composition and framing in film also allows you to break
some of the rules of composition.
RULES OF COMPOSITION
Artists arrange a frame to direct the viewer’s eye to a specific place, and sometimes leading the eye to the center of the screen ends up serving your story better and garnering more emotion. Past films have done this well: balance and symmetry in a shot can be very effective.
BLOCKING DEFINITION
What is blocking?
Blocking is the way the director moves actors in a scene. The director's
approach to blocking is dependent on the desired outcome (e.g., for
dramatic effect, to convey an intended message, or to visualize a power
dynamic).
Blocking the actor in a symmetrical shot can be a very effective way to lead the viewer to a certain feeling or emotion.
Leading lines
Leading lines are actual lines (or sometimes imaginary ones) in a shot, that
lead the eye to key elements in the scene.
Artists use this technique to direct the viewer’s eye but they also use it to
connect the character to essential objects, situations, or secondary
subjects. Whatever your eye is being drawn to in a scene, leading lines
probably have something to do with it.
It is a very useful element of shot composition, as it conveys essential context to the audience. Let's see how it's used in Nightcrawler. These stringer scenes use leading lines to take us to the accident.
The diagonal line from Lou's feet to the back wheels of the police car helps frame the shot. It is a leading line that, interestingly enough, also represents what his camera is able to capture.
Both the diagonal and straight line frame the crash as the focus. What's
interesting is that his camera is also doing that inside of the scene.
While this rule of composition helps lead us to our focus, other techniques
help us connect to our focus.
Eye-level framing
Just by showing the viewer the eyes of the character, the audience sees
into their soul. It might not be a steadfast rule of shot composition, but it is
an effective technique.
A close-up on his eyes signals that Lou's state of mind and inner feelings
are important right at that moment. It allows us to feel what he’s feeling. It is
the easiest way to garner empathy.
COMPOSITION EXAMPLES
Depth of field
Depth of field is essentially your zone of sharpness. If you make that zone
longer, bringing more objects into focus, you will have a deep depth of field.
Similarly, if you make that zone shorter or smaller, with less in focus, you
will have a shallow depth of field. One way to achieve this adjustment is by
using the lens aperture.
Now that we know a little more about this, we can manipulate our depths of
field to convey different feelings, tones, and relationships between objects.
A rack focus in filmmaking is changing the focus during a shot. The term
can refer to small or large changes of focus. If the focus is shallow, then the
technique becomes more noticeable.
If the filmmaker starts off with a large depth of field, and in the same shot
moves to a shallow one, the new focused element becomes the
centerpiece for the scene.
BOKEH DEFINITION
What is bokeh?
Bokeh refers to the aesthetic quality of the out-of-focus blur in an image,
produced when a shallow depth of field throws the background (or
foreground) out of focus. This kind of shot helps distance Lou from the
world around him.
The director also keeps people out of focus. This technique is used to
reveal more about Lou’s alienation from society.
RULES OF COMPOSITION
We will define deep space shots, as well as something called deep focus,
and determine how they all relate to each other.
We then will examine how they often work together to capture intentional
(and incredible) moments in film.
Deep space uses significant distance between elements in the frame,
unlike deep focus, which is defined by having elements both near and far
from the camera in focus.
Citizen Kane's famous deep focus scenes are still some of the best
examples of how knowing the rules of shot composition can help you tell a
deeply personal story.
The different depths are indicative of what is going on with each character.
The little boy appears far away, but in frame, to remind us that he is going
to be out of the picture soon, once they send him away.
With Lou positioned far from the camera and Nina stationed a bit closer, we
see deep space at work. This separation from each other highlights their
different personalities.
FILM COMPOSITION
Like we said before, the rules of composition are more like suggestions.
They are meant to guide and assist, not to limit or prohibit. There are many
instances when you really should obey the rules of composition, and other
times when breaking them is ideal.
White Balance
What is white balance?
White balance refers to the colour temperature at which white objects
on film actually look white. But it's not just about the appearance of
white; all the colours in your shot are determined by how you set your white
balance.
The function that corrects these color issues is the digital camera's "white
balance." Essentially, white balance adjusts images to make white
subjects look white in the final product. By making good use of white
balance, you'll be able to manipulate the tone of your pictures at will.
The correct white balance for a scene depends on the colour temperature
of the light. “In the most basic terms, your white balance tells you that if
your whites are off, your colour temperature’s off. And if your colour
temperature is off, you've got to figure out if your camera is set at the
wrong colour temperature or if the issue is in your lighting,” says
videographer Hiroshi Hara.
Daylight
The standard colour temperature for outdoor natural light is 5,600 kelvin
(K). This means that if you want a white piece of paper to appear white in
your shot, you would need to set your white balance to 5,600 K. This is the
industry standard setting, but it’s just a starting point. A sunny day with a
blue sky might be slightly warmer than an overcast, cloudy day. Sunset and
sunrise will almost always have a much lower colour temperature than high
noon.
Tungsten
For indoor lighting, also referred to as tungsten light, the standard setting is
3,200 K. Light bulbs and other artificial lighting usually have warmer
temperatures than outdoor lighting, so if you move your piece of paper from
outside to inside, you need to dial down your white balance to compensate
for the warmer colour temperature. Like daylight, tungsten settings vary
across the spectrum, from warm incandescent lights to LEDs that sit closer
to daylight temperatures.
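As a rough sketch, the daylight (5,600 K) and tungsten (3,200 K) standards above can be treated as presets that a camera (or you) picks from by proximity to the measured colour temperature. The extra preset values here are common ballpark figures, not exact standards:

```python
# Sketch: typical colour-temperature presets. 5,600 K (daylight) and
# 3,200 K (tungsten) come from the notes; the other values are common
# approximations added for illustration.
WHITE_BALANCE_PRESETS = {
    "candlelight": 1900,
    "tungsten": 3200,       # indoor/artificial light standard
    "sunrise_sunset": 3500,
    "daylight": 5600,       # outdoor natural light standard
    "overcast": 6500,
}

def suggest_preset(scene_kelvin):
    """Pick the preset closest to the measured colour temperature."""
    return min(WHITE_BALANCE_PRESETS,
               key=lambda name: abs(WHITE_BALANCE_PRESETS[name] - scene_kelvin))

print(suggest_preset(5400))  # -> daylight
print(suggest_preset(3000))  # -> tungsten
```

This mirrors what auto white balance does internally: estimate the scene's colour temperature, then correct toward neutral.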
A grey card is similar to a white card in that they are both reference points
to gauge white balance and exposure. But a grey card is a specific shade
of grey made to be completely neutral. This makes it easier for your
camera to read the light and choose the best white balance. To use a grey
card, simply place it in front of the camera while on customised white
balance mode and take a few shots. This is the manual version of using
AWB, in which your camera searches for neutral areas in the frame for
itself.
Lighting
“Any time you mix lighting sources, it’s going to make it hard to find your
white balance,” says videographer Margaret Kurniawan. Use a single light
source or match every light to the same temperature to avoid different
colour temperatures across your scene. You can also use a light metre to
get a reading on the temperature. Aim for consistency across your set,
lighting and camera so you can minimise time spent correcting colours in
post-processing.
Fine-tune your white balance in editing.
Just because you nail your white balance in-camera doesn’t mean you
should leave it untouched in post-production. “There are two sides to
manipulating white balance,” says cinematographer Mike Leonard.
“There’s colour correction, which is the science side of it and then there’s
colour grading, which is the art of it.” Colour correction is about bringing
colours back to their accurate tones for a true-to-life look. You can set the
correct white balance in-camera and continue this process in post.
Colour grading, on the other hand, is a subjective art. “A great film example
of this is The Matrix,” says Leonard. “When they’re in the Matrix everything
has a harsh green hue, but when they’re in the real world there’s a very
distinct blue gradient. That was a creative decision to make the two worlds
feel very different.”
Whether you want to evoke happiness and nostalgia with warm colours or
cooler, bluer tones for a moodier aesthetic, you can create the look you
want with video editing apps like Adobe Premiere Pro.
Reflectors:-
What is a Reflector?
A reflector is an object with a highly reflective surface. As a
result, when the light hits the reflector, it bounces back at an
equal angle. This allows you to change the direction of the light
onto your subject.
For example, if the light source is on the subject’s left side, you
can place a reflector on its right side and achieve illumination from
both sides. By adjusting the reflector’s position, you change the
reflection angle and control the lighting effect.
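The "bounces back at an equal angle" behaviour is just the law of reflection, which can be sketched as simple vector arithmetic (the 2-D vectors and surface normal below are illustrative, not measurements from any setup):

```python
# Sketch: law of reflection as vectors, r = d - 2*(d . n)*n,
# where d is the incoming light direction and n the unit surface normal.
def reflect(d, n):
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

# Light travelling down-and-right hits a horizontal reflector (normal points up):
print(reflect((1, -1), (0, 1)))  # -> (1, 1): vertical component flips, angle preserved
```

Moving the reflector changes the normal `n`, which is exactly how adjusting its position redirects the light onto the subject.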
You may also want to use a reflector when the light comes from
behind the subject, and you need it to come from the front too.
Some photographers use a reflector to reflect the light coming
from a flash to make it less harsh and obvious.
Cutters:-
A cutter, or flag, is a device used in lighting for motion
picture and still photography to block light. It can be used
to cast a shadow, provide negative fill, or protect the lens from
flare.
Cutters are large, irregular-shaped light flags. They can keep
light from spilling over into unwanted areas and will also protect
your lens from lens flare.
Lighting Gels:-
What Are Lighting Gels?
Lighting gels are colored, translucent sheets of thin plastic.
Photographers, filmmakers, and stage lighting technicians use
them as filters to correct color and lighting issues. Simply defined,
a gel is a transparent colored material placed over light sources to
modify them for photography and cinematography. The two basic
types are color-correction gels and non-corrective, color-effect
gels.
Use colored gels and color-correction filters to create various
tones and moods for movies, to replicate either daytime or
nighttime lighting, or to balance intensity, light temperature, and
other parameters.
1. Natural Lighting
2. Key Lighting
3. High Key Lighting
4. Low Key Lighting
5. Fill Lighting
6. The Three-Point Lighting Setup
7. Backlighting
8. Practical Light
9. Hard Lighting
10. Soft Light
11. Bounce Lighting
12. Side Lighting (Chiaroscuro Lighting)
13. Motivated Lighting
14. Ambient Light
Lighting Techniques:
Cinematography and film lighting is closely related to photography
lighting. You’ve probably heard of many of these techniques,
especially if you’ve done some studio photography in the past, but
it helps to learn how they can uniquely benefit filmmakers in
creating different moods and atmospheres in every scene.
It’s also important to note that these techniques are not clear-cut,
so many of them can actually take the form of several other film
lighting techniques. What matters is that you learn what each is
good for and are able to make the best use of them for achieving
your cinematic goals. The following are all the different types and
techniques of lighting in film:
Key Lighting
Fill Lighting
Back Lighting
Side Lighting
Practical Light
Hard Lighting
Soft Lighting
Bounce Lighting
High Key
Low Key
Motivated Lighting
Ambient Light
1. Key Lighting
The key light is also known as the main film light of a scene or
subject. This means it’s normally the strongest type of light in
each scene or photo. Even if your lighting crew is going for a
complicated multi-light setup, the key light is usually the first to be
set up.
2. Fill Lighting
As the name suggests, this technique is used to “fill in” and
remove the dark, shadowy areas that your key light creates. It is
noticeably less intense and placed in the opposite direction of the
key light, so you can add more dimension to your scene.
Because the aim of fill lighting is to eliminate shadows, it’s
advisable to place it a little further and/or diffuse it with
a reflector (placed around 3/4 opposite to the key light) to create
softer light that spreads out evenly. Many scenes do well with just
the key and fill studio lighting as they are enough to add
noticeable depth and dimension to any object.
When to Use Fill Lighting:
Use fill lighting to counteract shadows, or to bring up exposure
and decrease the contrast in a scene. With fill light, your viewer
can see more of the scene clearly.
3. Backlighting
Backlighting is used to create a three-dimensional scene, which is
why it is also the last light added in a three-point lighting setup.
The backlight faces your subject from behind, placed a little
higher, so as to separate your subject from the background.
4. Side Lighting
Side lighting illuminates your subject from the side, emphasizing
texture and creating a dramatic, chiaroscuro look. When used with
a fill light, it’s advisable to lessen the fill light’s intensity down to
1/8 of that of the side light to keep the dramatic look and feel of a
scene.
5. Practical Lighting
Practical lighting is the use of regular, working light sources like
lamps, candles, or even the TV. These are usually intentionally
added in by the set designer or lighting crew to create a cinematic
nighttime scene. They may sometimes be used to also give off
subtle lighting for your subject.
6. Bounce Lighting
Bounce lighting is about literally bouncing the light from a strong
light source towards your subject or scene using a reflector or any
light-colored surface, such as walls and ceilings. Doing so creates
a bigger area of light that is more evenly spread out.
If executed properly, bounce lights can be used to create a much
softer key, fill, top, side, or backlighting, especially if you don’t
have a diffuser or soft box.
7. Soft Lighting
Soft light doesn’t refer to any lighting direction, but it’s a technique
nonetheless. Cinematographers make use of soft lighting (even
when creating directional lighting with the techniques above) for
both aesthetic and situational reasons: to reduce or eliminate
harsh shadows, create drama, replicate subtle lighting coming
from outside, or all of the above.
When to Use Soft Lighting:
Soft lighting is more flattering on human subjects. The soft
quality of the light minimizes the appearance of shadows,
wrinkles, and blemishes. Use soft lighting for beautification.
8. Hard Lighting
Hard light can be sunlight or a strong light source. It’s usually
unwanted, but it certainly has cinematic benefits. You can create
hard lighting with direct sunlight or a small, powerful light source.
9. High Key
High key refers to a style of lighting used to create a very bright
scene that’s visually shadowless, often close to overexposure.
Lighting ratios are ignored, so all light sources have pretty
much the same intensity. This technique is used in many movies,
TV sitcoms, commercials, and music videos today, but it first
became popular during the classic Hollywood period in the 1930s
and 40s.
Direction
Direction refers to where the light or lights are coming from in
relation to the camera. Some common terms that refer to the
direction of light are backlight, top light, frontal, and profile. There
are often several different directions of light working together to
make up the totality of the lighting. If the light is hard enough, you
can often tell from which direction it is coming.
Intensity
The intensity of the light is how much light is hitting any part of
your scene. The intensity can and often does vary from one part
of the frame to another. It also varies from one subject to another.
When working on set you will often hear that there should be a 4-
to-1 ratio from one side of the face to the other. Alternatively, you
can have a 3-to-1 ratio from the subject to the background. This
means the intensity of the light should be four times greater on
one side of the face in reference to the other and three times
greater on the subject than the background.
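Because each photographic stop represents a doubling of light, a lighting ratio can be converted into stops with a base-2 logarithm. A quick sketch of the 4:1 and 3:1 ratios mentioned above:

```python
import math

# Sketch: converting a lighting ratio (e.g. key side vs fill side of the
# face) into photographic stops. One stop = a doubling of intensity,
# so stops = log2(ratio).
def ratio_to_stops(ratio):
    return math.log2(ratio)

print(ratio_to_stops(4))  # 4:1 across the face -> 2.0 stops
print(ratio_to_stops(3))  # 3:1 subject-to-background -> about 1.58 stops
```

This is why a 4:1 ratio is often described on set as a "two-stop difference" between the bright and shadow sides.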
Softness or hardness
Unlike direction or intensity, the softness or hardness of the light
is a more subjective quality. Hard light is often used to create
more mystery and drama. Soft light is often used when the drama
is not quite so intense or for more of a naturalistic look.
Lighting can be very impactful in setting the mood of a scene in a motion
picture, and it thus plays an important role in making films. Beyond simply
showing us what is happening on the screen, the way lighting is used
shapes how good a scene looks. If you are a movie buff, you have
certainly noticed that the same frame can look different even when the
camera angle, actors, and setting are unchanged; that difference often
comes down to lighting. It is an important tool that not only sets and
creates the mood of the film but also makes the scene much more
impactful.
There are different types of film lighting techniques which will set the mood
for the movie:
Three-point Lighting:
It is the most significant form of lighting used today. It is used to make the
scene less harsh by balancing the shadows and the highlights. As the
name of the lighting suggests, the lights are placed at three different
positions: the key light, backlight, and fill light.
Harsh Light:
Harsh lighting is strong, directed light that is often present in the middle of
the day. It creates deep shadows with sharp lines on your subjects. It is a
technique that includes more contrast in the shot than a three-point
lighting setup.
Soft Light:
Soft light will create shadows which slowly transition from light to
dark. This gentle gradation encapsulates the notion of soft light. When soft
light falls upon the subject, there won’t be any particularly harsh visible
lines. Before reaching the subject, soft light passes through a diffusing
medium. Some examples of soft light are indirect light, an overcast sky, or
sunset light.
Ambient Lighting:
Ambient lighting uses the light that already exists in a location, such as
daylight through a window or a room's existing fixtures, rather than light
added by the crew.
Warm Lighting:
This form of lighting is characterized by red, yellow, and orange tones. It
is usually used in youth dramas and romantic stories.
Lip synchronization:
1. Recording of sound simultaneously with the photographing of
action so as to secure perfect synchrony of both (as when a
motion picture is projected).
2. Recording of sound and photographing of action at separate
times, but utilizing techniques designed to secure synchrony of
sound and action when the two are combined.
Voice Over
Voice-over (also known as off-camera or off-stage
commentary) is a production technique where a voice—that is
not part of the narrative (non-diegetic)—is used in
a radio, television production, filmmaking, theatre, or
other presentations. The voice-over is read from a script and may
be spoken by someone who appears elsewhere in the production
or by a specialist voice actor. Synchronous dialogue, where the
voice-over is narrating the action that is taking place at the same
time, remains the most common technique in voice-overs.
Asynchronous, however, is also used in cinema. It is usually
prerecorded and placed over the top of a film or video and
commonly used in documentaries or news reports to explain
information.
Voice-overs are used in video games and on-hold messages, as
well as for announcements and information at events and tourist
destinations. It may also be read live for events such as award
presentations. Voice-over is added in addition to any existing
dialogue and is not to be confused with voice acting or the
process of replacing dialogue with a translated version, the latter
of which is called dubbing or re-voicing.
Music
Music: Music is the art of producing pleasing or expressive
combinations of tones, especially with melody, rhythm, and
usually harmony; sounds that have rhythm, harmony, and melody.
It can also refer to a musical composition set down on paper.
More formally, it is the science or art of ordering tones or sounds
in succession, in combination, and in temporal relationships to
produce a composition having unity and continuity.
Music, art concerned with combining vocal or instrumental sounds
for beauty of form or emotional expression, usually according to
cultural standards of rhythm, melody, and, in most Western
music, harmony. Both the simple folk song and the complex
electronic composition belong to the same activity, music.
The Importance of Music
Music touches us – it is a universal language of human-kind.
Whether it is being listened to or performed, it is incredibly
unifying. Music grants us feelings we can not convey by simply
talking. The influence music has on human emotion makes it an
extremely powerful tool for us in the world of marketing, or for
anyone who produces video.
Music in video can serve several functions. It can help craft a
wide range of emotional responses from an audience, create
rhythm for clips and scenes, and emphasize the overall story,
even in marketing. Music adds to the experience of a video,
regardless of if it is a blockbuster film, television sitcom,
advertisement, or brand video.
If you are having trouble understanding the importance of music in
video, imagine you’re watching a movie, television show, or
commercial that is silent except for dialogue. Although some films
use this technique to convey a sense of gravity or silence, most of
the time it is highly uncomfortable to watch.
When you’re producing a commercial or another video to market
your business, most of the time you want to make your audience
as comfortable as possible. A video created specifically for a
business might include music so the audience feels at ease and
develops a connection to the company.
The Role of Music in Video
Music plays a vital role in video – it is one of the most crucial
steps in post-production and is an excellent way to capture
attention and communicate your brand.
Although music is a key component in video, a beginner may
overlook its importance or choose a song just because they like it,
which is detrimental to the success and quality of a video.
Experienced editors effectively choose music that enhances the
video and communicates the main message. Given the thousands
of songs that exist, choosing music for your video may seem
intimidating, but it is going to help your video and your brand more
than you can imagine.
As you start to develop your video, the information outlined below
can help you better understand why music plays such an
important role in your video, and how to choose the right music for
your video and brand.
Music Captures Attention
Music holds the attention of an audience – it shapes emotion and
motivates viewers. Whether it is inspirational or sad music, it can
be used as a signal to help viewers know what to pay close
attention to and how to feel.
Music builds value, making a product more memorable as the
music lingers in a viewer’s mind. Whether it’s a high-energy brand
video or a down-to-earth explainer video, you want your music to
be personalized instead of using generic stock music. Stock
music is a less expensive alternative to the use of well-known
music in a video. However, a lot of stock music sounds almost the
same – using it might not make your video or company
memorable.
For example, commercials and radio stations use original, catchy
jingles to establish their brand and make them recognizable to
audiences. Using music to establish an emotional connection with
a brand increases brand recognition and drives customers to
discover and share more of your brand’s content.
Music can give your business the boost it needs in order to win
the war for attention and develop a genuine relationship with
viewers.
Music Communicates your Brand
Music is a key component in conveying your business and brand
message. Music speaks volumes about your brand, which is why
the role of music is so important in video. Both puns intended,
music accompanying your video must be in tune with your
message. Every choice you make surrounding your business
conveys something about your brand, including music! If you are
making multiple videos within your company, music can vary quite
a lot from video to video. It’s one of the things that help distinguish
your videos from each other. Music reinforces the specific
message your brand is trying to convey to an audience and
reveals your brand’s personality.
Music also establishes mood and elicits certain emotions. With
the right music, you can make someone associate your brand
with a certain feeling, enabling them to take interest in your
company, brand, or product. Ideally, the music in your video
creates a positive feeling and therefore, gives the viewer a
positive feeling; or it might convey the gravity of a situation which
your business is helping to resolve.
Think of music as an opportunity to create meaning for your brand
by employing interesting musical pieces. Through the right music
choice, the customer imagines your identity, and buys into the
message of the video. With music, your video, and therefore your
brand and product, adopts meanings which are inherent in the
music.
Ambience
In filmmaking, ambience (also known as atmosphere, atmos, or
background) consists of the sounds of a given location or
space. It is the opposite of "silence".
An ambient film is largely plotless, focusing on character
through a more objective yet also more intimate viewpoint. In
ambient films we see characters live their lives in long takes that
are typically soundtracked with diegetic sound.
Ambience is another word for atmosphere in the sense of the
mood a place or setting has. If an expensive restaurant has
soft lighting and peaceful music, it has a pleasant, soothing
ambience.
Sound Effects
A sound effect is an artificially created or enhanced sound, or
sound process used to emphasize artistic or other content of
films, television shows, live performance, animation, video games,
music, or other media. Traditionally, in the twentieth century, they
were created with Foley.
Sound is important because it engages audiences: it helps deliver
information, it increases the production value, it evokes emotional
responses, it emphasises what's on the screen and is used to
indicate mood.
In motion
picture and television production, a sound effect is a sound
recorded and presented to make a specific storytelling or creative
point without the use of dialogue or music. The term often refers
to a process applied to a recording, without necessarily referring
to the recording itself. In professional motion picture and
television production, dialogue, music, and sound effects
recordings are treated as separate elements. Dialogue and music
recordings are never referred to as sound effects, even though
the processes applied to them, such as reverberation or flanging,
are often called "sound effects".
Use of Microphones
A microphone is a device that translates sound vibrations in the
air into electronic signals, which can then be recorded or played
over a loudspeaker. Microphones enable many types of audio
recording devices, for purposes including communications of
many kinds, as well as music, vocals, speech, and sound
recording.
Microphones are used wherever sound needs to be picked up
and converted into an electrical format. Microphones are an
essential part of any audio recording system. The microphone
picks up the sound and converts it into electrical energy that can
then be processed by electronic amplifiers and audio processing
systems.
Microphones are used in many applications such as telephones,
hearing aids, public address systems for concert halls and public
events, motion picture production, live and recorded audio
engineering, sound recording, two-way radios, megaphones, and
radio and television broadcasting.
Types of Microphones
There are four types of microphone:
Dynamic Microphones
Large Diaphragm Condenser Microphones
Small Diaphragm Condenser Microphones
Ribbon Microphones
Microphones capture sound waves in the air and turn them into
identical electrical signals. To replicate the original audio, you can
send the signals from the mic’s output to a mixer or audio
interface for recording, or to studio monitors (or mixing
headphones), which turn them back into sound waves. But to get
something good out of your speaker, you need to make sure
you’re getting something good to begin with. So here’s our primer
on what to expect from different mics.
Each of the three primary types of microphones—dynamic
microphones, condenser microphones, and ribbon microphones—
has a different method for converting sound into electrical
signals.
All three have the same core construction, though. The capsule,
sometimes called a baffle, picks up the sound and converts it to
electrical energy. Inside the capsule is a diaphragm, a thin
membrane that vibrates when it comes in contact with sound
waves, initiating the conversion process.
The ideal types of microphones for a given situation directly
capture your intended audio source, such as your voice or a
musical instrument, without picking up any other nearby sound.
For example, a singer on stage needs a microphone that will
capture her voice while minimizing the pickup of the instruments
in her band. (The audio from other non-intended sources picked
up by a mic is referred to as “bleed” or “leakage.”)
Condenser Microphones
A condenser mic usually requires an external power source to
charge it, typically pulling its charge, called “phantom power,”
from a mixer or audio interface.
Of the three types of microphone designs, condenser mics
generally offer the best high-frequency audio reproduction, which
makes them the most common choice for capturing the nuances
of voices. Their high-end response also allows them to reproduce
better transients, the peaks at the beginning of a sound wave.
Hand percussion, such as shaker and tambourine, and acoustic
guitar, also benefit from accurate transient reproduction.
Condenser mics come in two basic categories: large-diaphragm
and small-diaphragm. Large-diaphragm mics are usually defined
as having diaphragms that are 1 inch or larger. Typically, large-
diaphragm condensers have a more well-rounded frequency
response and work best for recording voices. Small-diaphragm
condensers have the best high-end response and are preferred
for recording instruments.
Many large-diaphragm condensers offer multiple polar patterns so
that you can switch between cardioid, omni, and bi-directional.
Some even let you customize the polar pattern to fine-tune its
directional focus.
Ribbon microphones are technically a form of dynamic
microphone but are generally treated as a separate design
because they work and sound very different than their traditional
counterparts. The ribbon design includes an extended rectangular
diaphragm made of thin aluminum with magnets at either end.
When sound waves hit it, it vibrates to create an electrical charge.
Most ribbon mics feature a bi-directional (figure-8) polar pattern.
A great ribbon mic offers the most natural sound reproduction. Its
frequency range most closely mimics human hearing, so audio
doesn’t come in as bright as on condensers or dynamics, but
vocals and instruments sound very clear and natural. Ribbon mics
are primarily used in recording studios, where you can get perfect
positioning and protect them, as they tend to be more delicate
than the other types.
Dynamic Microphones
What They Do Best
Dynamic microphones are useful thanks to the following qualities:
Durable
High-Mass
Directional
No Inherent Noise
Durable
Dynamic microphones are relatively simple in their
construction, which makes them more rugged than the more
delicate types. Because they are so compactly constructed, the
noise from handling a dynamic microphone is greatly reduced.
The Shure SM58, a classic handheld dynamic microphone, is
known for its virtual indestructibility. These mics are notorious for
their ability to be dropped, tossed, and accidentally hit with
drumsticks, while maintaining a consistent sound quality
throughout their lifespan.
The durability and rugged construction of moving coil
microphones makes them great for live sound applications.
High-Mass
Dynamic microphones are relatively heavy, or massive. This
makes them less sensitive than other microphone types.
When a microphone has low sensitivity, it means that it can
handle a louder input.
More sensitive microphones will sound great for quiet sources,
but will start to distort the signal at higher levels. Dynamic
microphone diaphragms are generally heavier than the
diaphragms in other types of microphones. Although this means
they require more gain, it also means they can accurately capture
very loud signals without distortion.
If you put a sensitive microphone on a very loud guitar amplifier,
snare drum, or horn, for example, the sound will completely
overwhelm the diaphragm and cause saturation. These high-
output sources require a microphone that can accommodate the
sounds they create.
Directional
Dynamic microphones are capable of omnidirectional and cardioid
polar patterns.
Most have a cardioid polar pattern, meaning that they pick up
sounds best from in front of the diaphragm and reject
sounds from behind it. Although other types of microphones
are also available with a cardioid polar pattern, dynamic
microphones are superior in their ability to reject sounds from
the sides and rear.
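An idealized cardioid pattern can be modelled as sensitivity = (1 + cos θ) / 2. Real microphones only approximate this curve, but the sketch below shows why pickup is full at the front and falls to nothing at the rear:

```python
import math

# Sketch: idealized cardioid pickup, sensitivity = (1 + cos theta) / 2.
# theta is the angle of the sound source relative to the mic's front axis.
def cardioid_sensitivity(angle_deg):
    return (1 + math.cos(math.radians(angle_deg))) / 2

print(cardioid_sensitivity(0))    # front: full pickup (1.0)
print(cardioid_sensitivity(90))   # side: half pickup
print(cardioid_sensitivity(180))  # rear: rejected (0.0)
```

Plotting this function in polar coordinates produces the familiar heart shape that gives the cardioid pattern its name.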
This has many practical applications.
Firstly, the high directionality of dynamic microphones can
help capture only a particular instrument or source, even
when other sources of sound are nearby.
Whether you are trying to capture a vocalist standing next to a
drum kit, one horn in a brass section, or a podcast guest in a
noisy room, the superior directionality of dynamic microphones
can be useful.
The directionality of moving coil microphones also helps in
sound reinforcement situations.
Any time you send the signal from a microphone to speakers in
the same room, there is the danger that the sound from the
speakers will enter the microphone and create a feedback loop.
Dynamic microphones are able to provide more gain before
feeding back through the speakers. This is another reason
they are ideal for live sound.
No Inherent Noise
As you will see in the following sections, some other types of
microphones contain more complicated circuitry than the dynamic
type. The circuitry of those microphones may add some benefits,
but they come at a cost.
Dynamic microphones use simple, passive circuits to
convert sound to electricity. This simplicity offers the benefit
of no inherent noise, meaning you can use more gain without
starting to hear hiss or hum.
Condenser Microphones
How They Work
Condenser microphones are also called capacitor
microphones. They function on the following principle: If two
metal plates are in close proximity, the closer they are, the
higher the capacitance.
Capacitance is the ability of a system to hold an electrical charge.
In a condenser microphone, an electrically conductive diaphragm,
usually made of gold-sputtered mylar, is suspended in close
proximity to a solid metal plate. When sound waves interact with
the diaphragm, it moves back and forth relative to the solid metal
plate. This change in distance from the backplate to the
diaphragm creates a change in capacitance, and an electrical
signal is created.
An impedance conversion circuit must be placed after the
output of the capacitor to make the signal usable in an audio
system. This circuitry requires +48 V DC, known as phantom
power.
In some applications, such as cell phones and computers, electret
condenser microphones are used which utilize permanently
charged material and do not require phantom power. You can see
a simplified diagram of a condenser microphone below.
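To make the principle concrete, here is a minimal Python sketch of the parallel-plate relationship. The diaphragm area and gap figures are illustrative assumptions, not specs from any real microphone.

```python
# Sketch of the condenser principle: capacitance of a parallel-plate
# capacitor rises as the plates move closer together (C = e0 * A / d).
# All dimensions below are made up for illustration.

EPSILON_0 = 8.854e-12  # permittivity of free space, farads per metre

def capacitance(area_m2, distance_m):
    """Parallel-plate capacitance in farads."""
    return EPSILON_0 * area_m2 / distance_m

area = 2e-4   # assumed diaphragm area in square metres
rest = 25e-6  # assumed 25-micron resting gap

# A sound wave swings the diaphragm a few microns either way;
# the capacitance change is what becomes the electrical signal:
for gap in (rest + 5e-6, rest, rest - 5e-6):
    print(f"gap {gap * 1e6:4.0f} um -> {capacitance(area, gap) * 1e12:.1f} pF")
```

Notice that a smaller gap gives a larger capacitance, which is exactly the "closer they are, the higher the capacitance" rule stated above.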
What They Do Best
Condenser microphones are also very common in professional
audio. They are useful, thanks to the following qualities:
Low-Mass
Variable Polar Pattern
Low-Mass
There are a few practical advantages of the low-mass diaphragm
in a condenser microphone. Firstly, the low mass of these
diaphragms makes them more capable of capturing transient
sound waves. A transient is the short, high-amplitude burst
at the beginning of a sound wave. If a snare drum is struck, the
microphone must move very quickly to capture the sound
accurately. The heavier the diaphragm, the longer it takes to
respond to the sound wave. Condenser microphones are
excellent in capturing these quick changes in sound pressure.
Another advantage to the low-mass diaphragms found in
condenser microphones is their frequency
response. Compared to other microphone types, condensers
have the widest frequency response. This means that condenser
microphones are capable of capturing variations in the air that
cycle very quickly. This property allows condenser microphones
to capture more detail from the sound source.
In contrast to the high-mass diaphragms of dynamic microphones,
condenser diaphragms have high sensitivity. This has the
positive effect of increased clarity and ability to capture low
level sounds. However, this low mass also makes them
susceptible to saturation in high sound pressure level
applications.
Variable Polar Pattern
Most condenser microphones have a fixed polar pattern of
cardioid or omnidirectional.
However, because the depth of a condenser microphone
circuit is very small in comparison to moving coil circuits,
some condenser microphones offer the ability to vary the
polar pattern with a switch.
For example, the AKG C414 has the ability to operate in
omnidirectional, bidirectional, cardioid, wide cardioid, and
hypercardioid.
This is accomplished through the use of two diaphragms in
close proximity. The signal from each is mixed together, and
the phase interactions between the two signals creates
cancellation of sounds which enter from certain angles.
This property makes condenser microphones capable of being
very versatile. Not all condenser microphones are capable of this.
The microphones that do offer multipattern functionality, however,
can be an engineer's best friend. These microphones generally
come at a financial cost, but their versatility makes them very
valuable in a variety of situations.
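The dual-diaphragm trick described above can be sketched numerically. This toy model assumes ideal cardioid capsules; real capsules only approximate these curves, but the phase interactions work the same way.

```python
import math

def cardioid(theta):
    """Pickup of an ideal front-facing cardioid capsule at angle theta
    (radians): full pickup at 0, a null directly behind at pi."""
    return 0.5 + 0.5 * math.cos(theta)

def mixed_pattern(theta, rear_weight):
    """Mix a front cardioid with a rear-facing one (aimed at theta + pi).
    rear_weight = +1 -> omni, 0 -> cardioid, -1 -> figure-8."""
    return cardioid(theta) + rear_weight * cardioid(theta + math.pi)

# Compare pickup from the side (90 degrees) and the rear (180 degrees):
for name, w in [("omni", 1.0), ("cardioid", 0.0), ("figure-8", -1.0)]:
    side = mixed_pattern(math.pi / 2, w)
    rear = mixed_pattern(math.pi, w)
    print(f"{name:9s} side={side:+.2f} rear={rear:+.2f}")
```

Summing the two capsules equally yields an omnidirectional pattern, cancelling the rear capsule yields a cardioid, and subtracting it yields a bidirectional (figure-8) pattern with inverted polarity at the rear, which is how a pattern switch like the one on the AKG C414 works in principle.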
Ribbon Microphones
How They Work
Ribbon microphones are also called velocity microphones,
and are technically a variation of the dynamic microphone. They
function on the following principle: as an electrically
conductive ribbon moves within the field of a magnet, a
voltage is created on the ribbon.
In a ribbon microphone, a corrugated ribbon is suspended in the
field of a permanent magnet. The ribbon is typically made of a
low-mass aluminum alloy. When sound interacts with the suspended
ribbon, the ribbon moves in relation to the magnet. This creates a
voltage on the ribbon, which is connected to the output of the
microphone.
What They Do Best
Low-Mass
Low-Sensitivity
Directional
Non-Linear Response
Low Inherent Noise
Low-Mass
The low mass of a ribbon diaphragm makes for an excellent
frequency response. As small changes in the velocity of air
particles occur, the ribbon can follow. This creates an electrical
signal that very closely represents the original sound. The Royer
R-121 is known for its transparent, realistic sound and remarkably
flat frequency response.
Although the low mass of the diaphragm benefits the sound
of a microphone, the durability of a ribbon microphone
suffers. The ribbon can be anywhere between 0.6 and 4 microns
thick. Compare this to the 100-micron diameter of a human hair,
and it becomes clear just how small a ribbon diaphragm is.
Sudden gusts of air, and even the sustained pull of gravity in a
certain direction can damage these microphones. This is
especially true of vintage models.
Low-Sensitivity
The low mass of condenser diaphragms makes them very
sensitive. Ribbon microphones, however, combine a low-mass
diaphragm with the ability to handle high SPL (Sound
Pressure Level). This makes them a great choice for capturing
detailed, transient sounds without fear of diaphragm saturation.
Directional
Ribbon microphones are inherently bi-directional. Sounds
from the front and back of the ribbon (perpendicular to the
ribbon) are picked up evenly. Sounds from the side
(parallel to the ribbon), on the other hand, press on both
faces of the ribbon equally and produce no movement of the
ribbon at all.
This feature can be useful in many situations. In broadcast,
especially in the early days, ribbon microphones could capture
two people speaking while rejecting sound from an audience or
equipment rack nearby. The directionality can also be useful in
capturing the sound of a choir in a reverberant space, allowing
both the signal and the reverb to be recorded.
Non-Linear Response
The ribbon diaphragm does not respond linearly to sound
pressure levels. This means the output voltage is not directly
proportional to the sound pressure. This quality mimics the
human perception of loudness. The result is a more natural
sound that does less to embellish certain characteristics of a
sound source. This characteristic makes ribbon microphones
excellent for stereo recordings of an instrument or ensemble.
Low Inherent Noise
Ribbon microphones often utilize passive circuitry and are
thus less susceptible to electronic noise than condenser
microphones. The voltage on the ribbon simply represents the
movement of the ribbon itself, isolated from any electronic noise.
This allows ribbon microphones to be very quiet while maintaining
their ability to react quickly to small movements in the air.
Carbon Microphones
How They Work
Carbon microphones operate on the following concept:
When carbon granules are compressed, their resistance
decreases.
Carbon microphones are not commonly used in the modern
world, but were used in telephony and broadcast in the early days
of the technology up until the late 70s. A battery is required to
create an electric current to flow through the carbon granules. As
sound interacts with the carbon, the granules are compressed.
This changes the resistance of the carbon, in turn increasing and
decreasing the current with the movements of the air. Carbon
microphones suffer from significant noise and limited frequency
response.
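The carbon principle is just Ohm's law with a varying resistance, and can be sketched in a few lines. The battery voltage and resistance values below are made up for illustration, not taken from any real telephone circuit.

```python
# Toy model of a carbon microphone: a battery drives current through
# carbon granules whose resistance drops when sound compresses them.
# Current follows Ohm's law, I = V / R.

V_BATTERY = 3.0  # assumed battery voltage, volts

def current_ma(resistance_ohms):
    """Current through the granules in milliamps."""
    return V_BATTERY / resistance_ohms * 1000

# Resting resistance, then a compressed (louder) and a rarefied state:
for state, r in [("rest", 60.0), ("compressed", 50.0), ("rarefied", 75.0)]:
    print(f"{state:10s} R={r:5.1f} ohm -> I={current_ma(r):.1f} mA")
```

The varying current tracks the movement of the air, which is the whole trick: the sound wave modulates a DC current rather than generating a voltage directly.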
Crystal Microphones
How They Work
Crystal microphones are also called piezoelectric
microphones. They function on the following principle: When
certain crystals are placed under mechanical force, a voltage
is produced.
This is called the piezoelectric effect. As sound applies pressure
to a diaphragm or to the crystal directly, the crystal is flexed. This flexion
creates an electrical charge on the crystal, which represents the
vibrations in the air. Crystal microphones are also not widely used
today. They do not offer wide frequency response and operate at
a high impedance not suitable for professional applications.
Audio Mixers for recording
A mixing console or mixing desk is an electronic device
for mixing audio signals, used in sound recording and
reproduction and sound reinforcement systems. Inputs to the
console include microphones, signals from electric or electronic
instruments, or recorded sounds. Mixers may control analog or
digital signals.
An audio mixer is a device whose primary function is to accept,
combine, process and monitor audio. Mixers are primarily used
in four types of environments: live (at a concert), in a recording
studio, for broadcast audio, and for film/television. An audio mixer
can come in either analog or digital form.
There are two builds of mixers:
In-Line
An in-line mixer means there are two paths per channel. For
example, in a recording environment, an in-line mixer can use
the same channel to send and receive sound to and from a
digital audio workstation (DAW).
Split Monitor
A split monitor console has one path per channel. Each
channel can be used to either send or receive sound to or from a
DAW. Both of these builds have two areas for us to explore: the
channels and the master section.
Before we begin, let’s discuss some terminology and vocabulary.
An audio mixer may also be referred to as a “console,” “desk,” or
“board.” All three of these terms are synonymous with “mixer.”
Additionally, we must define “signal.” Signal is the generalized
term for any audio passing through a mixer. This could be a
vocal, drum, bass, synth, guitar or another instrument. All of it is
referred to as signal.
Parts of a Channel
Input Section
The input area of the channel strip may accept any or all of the
following levels of signal: mic, instrument, -10 line level or +4 line
level. Depending on which level of signal is desired to be
accepted, specific knobs and/or switches must be engaged and
used. For example, if a microphone is wired into a channel, the
channel must be configured to accept mic level signal. Then, the
mic pre knob must be raised in order to add gain to the signal.
The mixer will also be able to accept +4 line level, -10 line level
or both. This normally takes the form of deselecting all switches
and buttons but may vary between consoles. Some mixers have
a knob referred to as the “line trim.” This serves the same
purpose for line level signal as the mic pre does for mic level
signal: it adds or subtracts gain at line level.
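The job of the mic pre or line trim, adding or subtracting gain, boils down to a decibel-to-linear conversion. Here is a minimal sketch; the sample values and the +40 dB figure are illustrative, though +40 dB is in the typical range for bringing a mic-level signal up toward line level.

```python
def apply_gain_db(samples, gain_db):
    """Scale a signal by a gain expressed in decibels.
    Every +20 dB multiplies the amplitude by 10."""
    factor = 10 ** (gain_db / 20)
    return [s * factor for s in samples]

mic_level = [0.001, -0.002, 0.0015]        # a weak mic-level signal
line_level = apply_gain_db(mic_level, 40)  # +40 dB of mic-pre gain
print(line_level)  # roughly [0.1, -0.2, 0.15]
```

A line trim works the same way, just over a smaller gain range, since line-level signals are already close to the level the console expects.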
EQ
Dynamics
Fader
Group Faders
Auxes
Bus Assignment
The most common bus assignment is the stereo bus. The stereo
bus is a two-channel mix (left and right) of all the faders on the
console. It can also be thought of as the sum of all the faders.
Most consoles require a button or switch to be engaged in order
to send the channel to the stereo bus (and therefore be audible
in the mix).
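The idea of the stereo bus as "the sum of all the faders" can be sketched in a few lines. The constant-power pan law used here is one common choice, not the only one, and the channel values are made up.

```python
import math

def pan_gains(pan):
    """Constant-power pan law: pan runs from -1 (hard left)
    to +1 (hard right); returns (left_gain, right_gain)."""
    angle = (pan + 1) * math.pi / 4
    return math.cos(angle), math.sin(angle)

def stereo_bus(channels):
    """Sum (sample, fader_gain, pan) tuples into one L/R pair,
    like channels assigned to a console's stereo bus."""
    left = right = 0.0
    for sample, fader, pan in channels:
        gl, gr = pan_gains(pan)
        left += sample * fader * gl
        right += sample * fader * gr
    return left, right

# One vocal panned centre, one guitar panned hard right:
print(stereo_bus([(0.5, 1.0, 0.0), (0.25, 0.8, 1.0)]))
```

Each channel's fader scales its contribution before the sum, which is why pulling a fader down removes that channel from the mix without affecting the others.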
Pan
Patch Bay
Phantom Power
Polarity
PAD
Filter
Meters
Various styles of meters exist on mixers and for audio
production in general. On a mixer, the collective area of the
meters across all of the channels is referred to as the meter
bridge. Three common meters that may be found on a mixer are
VU, Peak, and RMS.
The right mixer in the right hands can become powerful. Make
sure to practice and purchase wisely.
Digital mixers do not sum signal in the same way: they process
and add the signals numerically, as binary code (1’s and
0’s). The sonic texture of analog summing is popular enough that
devices called “summing mixers” have become common in
home and small recording studios.
There’s a lot of debate online about the right way to level audio,
but there’s one thing that everyone can agree on: you should
never go above 0 dB. Even though your NLE may allow you to
raise your audio levels all the way up to +6 dB, it’s advised by
virtually everyone that you keep your audio below 0 dB.
This is because your audio will begin to clip above 0 dB, resulting in
horrible distortion.
Think about it as a ceiling. As soon as your audio peaks and hits
the ceiling, it will begin to get distorted. This is why explosions
in Hollywood movies sound awesome and explosions in bad
YouTube videos sound like muffled junk. Most major NLEs and
audio editing software will show red indicators anytime
your audio goes above 0 dB. If you’re getting this warning, then your
audio is too loud. For safety, no peaks should be above -6 dB.
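Measuring a peak relative to the 0 dB ceiling (dB full scale) can be sketched like this; the sample values are made up for illustration.

```python
import math

def peak_dbfs(samples):
    """Peak level of normalized samples (-1.0..1.0) in dB full scale.
    0 dB means the loudest sample exactly hits the ceiling."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak) if peak > 0 else float("-inf")

quiet = [0.1, -0.25, 0.2]
hot = [0.9, -1.0, 0.7]

print(f"quiet peak: {peak_dbfs(quiet):6.1f} dB")  # comfortably under 0 dB
print(f"hot peak:   {peak_dbfs(hot):6.1f} dB")    # right at the ceiling

# Anything pushed past full scale cannot be represented and gets
# clipped flat -- this flattening is the distortion described above:
clipped = [max(-1.0, min(1.0, s * 2)) for s in hot]
```

This is what the red indicators in your NLE are reporting: a peak at or above 0 dB means at least one sample has hit the ceiling.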
Most video editors agree that the overall level of your audio
mix (all of your audio combined) should be normalized between
-10 dB and -20 dB. I personally level my videos around -12 dB with
occasional peaks to -8 dB. The trick here is to stay away from 0 dB
as best you can. Experts differ on the exact numbers — because
there are no exact numbers. Editing audio is a lot like editing
video. At the end of the day, your ears should be your guides if
you’ve adhered to the technical rules.
Some people say that if you are creating a video for the web, then
you should get closer to 0 dB… but it all depends on the video you
are producing. If you’re creating a film, you might want to have
more range between your average mix level and your peaks. If
you’re doing a simple web commercial, it might make more sense
to have less range and a higher overall average level. You’ll figure
out what works for you over time.
Recommended Levels
Here are a few of my recommended level suggestions.
Overall Mix Level: -10 dB to -14 dB
Dialogue: -12 dB to -15 dB
Music: -18 dB to -22 dB
Sound Effects: -10 dB to -20 dB with occasional spikes up to -8 dB.
These recommendations are a little on the quiet side when
compared to some industry experts, but I find it much better for an
audience to turn up the volume than run the risk of peaking or
distortion.
Before you finalize your film or video, take off your fancy studio
headphones and go play your video on the platform in which
your audience will be watching it. For example, if you’re
working on a TV commercial, try watching your video on a TV to
see how it sounds . It’s not uncommon for music producers to
listen to their tracks in a car before they sign off on distribution.
The same should be true for you. Whether it’s on a laptop,
iPhone, or TV, listen to your video on the medium in which your
audience will be watching and you’ll quickly find audio quirks.
It should also be noted that film distributors and TV broadcasters
often have their own standards for audio levels, so you should
absolutely consult with them before setting your audio levels. In
NTSC countries, for example, peak audio levels must not exceed
-10 dB with occasional -8 dB spikes, whereas in PAL countries
peaks must not exceed -9 dB.
Do I Need to Use Bars and Tone?
Video and Audio File Formats
These are the most common digital video formats and their most
frequent uses.
MP4
MP4 (MPEG-4 Part 14) is the most common type of video file
format. Apple’s preferred format, MP4 can play on most other
devices as well. It uses the MPEG-4 encoding algorithm to store
video, audio and text, but it offers lower definition than
some others. MP4 works well for videos posted on YouTube,
Facebook, Twitter and Instagram.
MOV
MOV (QuickTime Movie) stores high-quality video, audio and
effects, but these files tend to be quite large. Developed for
QuickTime Player by Apple, MOV files use MPEG-4 encoding to
play in QuickTime for Windows. MOV is supported by Facebook
and YouTube and it works well for TV viewing.
WMV
WMV (Windows Media Video) files offer good video quality and
large file size like MOV. Microsoft developed WMV for Windows
Media Player. YouTube supports WMV and Apple users can view
these videos, but they must download Windows Media Player for
Apple. Keep in mind you can’t select your own aspect ratio in
WMV.
AVI
AVI (Audio Video Interleave) works with nearly every web browser
on Windows, Mac and Linux machines. Developed by Microsoft,
AVI offers the highest quality but also large file sizes. It is
supported by YouTube and works well for TV viewing.
AVCHD
Advanced Video Coding High Definition is specifically for high-
definition video. Built for Panasonic and Sony digital camcorders,
these files compress for easy storage without losing definition.
WEBM or HTML5
These formats are best for videos embedded on your personal or
business website. They are small files, so they load quickly and
stream easily.
MPEG-2
If you want to burn your video to a DVD, MPEG-2 with an H.262
codec is the way to go.
MP3
MP3 (MPEG-1 Audio Layer III) is the most popular of the lossy
formats. MP3 files work on most devices and the files can be as
small as one-tenth the size of lossless files. MP3 is fine for the
consumer, since most of the sound it drops is inaudible, but that’s
not the case when it comes to bit depth. “MP3 files can only be up
to 16-bit, which is not what you want to be working in,” says
producer, mixer and engineer Gus Berry. “You want to be working
in at least 24-bit or higher when recording and mixing.”
AAC
Advanced Audio Coding or AAC files (also known as MPEG-4
AAC), take up very little space and are good for streaming,
especially over mobile devices. Requiring less than 1 MB per
minute of music and sounding better than MP3 at the same
bitrate, the AAC format is used by iTunes/Apple Music, YouTube
and Android.
Ogg Vorbis
Ogg Vorbis is the free, open-source audio codec that Spotify
uses. It’s great for streaming, but the compression results in some
data loss. Experts consider it a more efficient format than MP3,
with better sound at the same bitrate.
Lossless formats.
These files decompress back to their original size, keeping sound
quality intact. Audio professionals want all of the original sound
waves, so they prefer lossless. These files can be several times
larger than MP3s. Lossless bitrates depend on the volume and
density of the music, rather than the quality of the audio.
FLAC
Free Lossless Audio Codec offers lossless compression and it’s
free and open-source.
ALAC
Apple’s Lossless Audio Codec allows for lossless compression,
but it works only on Apple devices.
Uncompressed formats.
These files remain the same size from origin to destination.
WAV
WAV (Waveform Audio File) retains all the original data, which
makes it the ideal format for sound engineers. “WAV has greater
dynamic range and greater bit depth,” creative producer and
sound mixer Lo Boutillette says of her preferred format. “It’s the
highest quality,” Berry agrees. “It can be 24-bit, 32-bit, all the way
up to 192kHz sample rate and even higher these days.” If you’re
collaborating and sending files back and forth, WAV holds its time
code. This can be especially useful for video projects in which
exact synchronization is important.
AIFF
Originally created by Apple, AIFF (Audio Interchange File Format)
files are like WAV files in that they retain all of the original sound
and take up more space than MP3s. They can play on Macs and
PCs, but they don’t hold time codes, so they’re not as useful for
editing and mixing.
DSD
Direct Stream Digital is an uncompressed, high-resolution audio
format. These files encode sound using pulse-density modulation.
They are very large, with a sample rate as much as 64 times that
of a regular audio CD, so they require top-of-the-line audio
systems.
PCM
Pulse-Code Modulation, used for CDs and DVDs, captures
analogue waveforms and turns them into digital bits. Until DSD,
this was thought to be the closest you could get to capturing
complete analogue audio quality.
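The core quantization step that PCM performs can be sketched like this; a real encoder also handles anti-alias filtering, dithering and byte packing, which are omitted here, and the 8 kHz sample rate is chosen only to keep the example short.

```python
import math

def to_16bit_pcm(samples):
    """Quantize normalized floats (-1.0..1.0) to signed 16-bit
    integers, the word size used on audio CDs."""
    out = []
    for s in samples:
        s = max(-1.0, min(1.0, s))         # clamp to full scale
        out.append(int(round(s * 32767)))  # 2**15 - 1 steps each way
    return out

# One cycle of a sine wave, captured at 8 samples per cycle:
sine = [math.sin(2 * math.pi * n / 8) for n in range(8)]
print(to_16bit_pcm(sine))
```

Each analogue amplitude is snapped to the nearest of 65,536 levels; the tiny rounding error this introduces is quantization noise, which shrinks as the bit depth grows.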
BMP
The simplest way to define a raster graphic image is by using
color-coded information for each pixel on each row. This is the
basic bit-map format used by Microsoft Windows. The
disadvantage of this type of image is that it can waste large
amounts of storage. Where there’s an area with a solid color, for
example, we don’t need to repeat that color information for every
new contiguous pixel. Instead, we can instruct the computer to
repeat the current color until we change it. This type of space-
saving trick is the basis of compression, which allows us to store
the graphic using fewer bytes. Most Web graphics today are
compressed so that they can be transmitted more quickly. Some
compression techniques will save space yet preserve all the
information that’s in the image. That’s called “lossless”
compression. Other types of compression can save a lot more
space, but the price you pay is degraded image quality. This is
known as “lossy” compression.
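The "repeat the current color until we change it" idea described above is run-length encoding, and a minimal lossless sketch of it looks like this:

```python
def rle_encode(pixels):
    """Run-length encode a row of pixels into (count, value) pairs,
    so a long stretch of one color is stored just once with a count."""
    runs = []
    for p in pixels:
        if runs and runs[-1][1] == p:
            runs[-1][0] += 1     # extend the current run
        else:
            runs.append([1, p])  # start a new run
    return [(count, value) for count, value in runs]

row = ["blue"] * 6 + ["white"] * 2 + ["blue"] * 4
print(rle_encode(row))  # [(6, 'blue'), (2, 'white'), (4, 'blue')]
```

Twelve pixels collapse to three pairs with no information lost, which is why this counts as lossless compression; it pays off on images with large flat areas and poorly on noisy photographic images.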
TIFF
Most graphics file formats were created with a particular use in
mind, although most can be used for a wide variety of image
types. Another common bit-mapped image type is Tagged Image
File Format, which is used in faxing, desktop publishing and
medical imaging. TIFF is actually a “container” that can hold bit
maps and JPEGs and allows (but doesn’t require) various types
of compression.
PDF
PDF is an abbreviation that stands for Portable
Document Format. It's a versatile file format created by
Adobe that gives people an easy, reliable way to present
and exchange documents - regardless of the software,
hardware, or operating systems being used by anyone
who views the document.