Web Audio API
Getting started
The Web Audio API is one of two new audio APIs (the other being the Audio Data API) designed to make creating, processing and controlling audio within web applications much simpler. The two APIs aren't exactly competing, as the Audio Data API allows more low-level access to audio data, although there is some overlap.

At the moment, the Web Audio API is a WebKit-only technology while the Audio Data API is a Mozilla thing. It was recently announced that iOS 6 will have Web Audio API support, however, so there's mobile support on the way.
In this page, we will start at the very beginning and work through the basic concepts until we have a working example.
Audio routing graphs
The Web Audio API is an extremely powerful tool for controlling audio in the browser. It is based around the concept of Audio Routes, which are a common tool in sound engineering. This is a simple but powerful way of representing the connections between a sound source and a destination, in this case your speakers. Between these two end points, you can connect any number of nodes which take the audio data passed in, manipulate it in some way and output it to whichever nodes are connected next in the chain.
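As a rough sketch of that idea (using the variable names we will set up later in this page, with someNode standing in for whatever processing node you place in between):

// Source -> processing node -> speakers
soundSource.connect(someNode);
someNode.connect(context.destination);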
There can be only one!
AudioContext, that is. Unlike canvases and canvas contexts, there can be only one AudioContext per page. This doesn't prove to be a limitation as you can easily create multiple, completely separate audio graphs within the context. Essentially, the context object acts as a holder for the API calls and provides the abstraction required to keep the process simple.

Even though this is only supported in WebKit at the moment, this snippet will ensure we're prepared for future developments.
var context;

if (typeof AudioContext !== "undefined") {
    context = new AudioContext();
} else if (typeof webkitAudioContext !== "undefined") {
    context = new webkitAudioContext();
} else {
    throw new Error('AudioContext not supported. :(');
}
Create a sound source
Unlike working with audio elements, you can't simply set the source and have it load. Most often, you will load the audio file with an XMLHttpRequest and an asynchronous callback.
var request = new XMLHttpRequest();
request.open("GET", audioFileUrl, true);
request.responseType = "arraybuffer";

// Our asynchronous callback
request.onload = function () {
    var audioData = request.response;
    createSoundSource(audioData);
};

request.send();
The AudioContext provides useful methods to simplify downloading remote resources via stream buffers. Use the received audioData to create the full sound source. We'll look at the makeMono parameter later.
// Create a sound source
soundSource = context.createBufferSource();

// The AudioContext handles creating source
// buffers from raw binary data
context.decodeAudioData(audioData, function (soundBuffer) {
    // Add the buffered data to our object
    soundSource.buffer = soundBuffer;
});
See this on JSFiddle.
Connect the source to the destination
This is where we start to create our Audio Routing Graphs. We have our sound source and the AudioContext has its destination which, in most cases, will be your speakers or headphones. We now want to connect one to the other. This is essentially nothing more than taking the cable from the electric guitar and plugging it into the amp. The code to do this is even simpler.
soundSource.connect(context.destination);
That's it. Assuming you're using the same variable names as above, that's all you need to write and suddenly your sound source is coming out of the computer. Neat.
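If you don't hear anything straight away, remember that a buffer source also has to be told to start playing. A minimal sketch, assuming the soundSource created above (older WebKit builds use noteOn(0) instead of start(0)):

// Start playback immediately
soundSource.start(0);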
Create a node
Of course, if it were simply connecting a sound to a speaker, we wouldn't have any control over it at all. Along the way between start and end, we can create and insert nodes into the chain. There are many different kinds of nodes. Each node either creates or receives an audio signal, processes the data in some way and outputs the new signal. The most basic is a GainNode, used for volume.
// Create a volume (gain) node
volumeNode = context.createGain();

// Set the volume
volumeNode.gain.value = 0.1;
Chain everything together
We can now put our Gain in the chain by connecting the sound source to the Gain, then connecting the Gain to the destination.
soundSource.connect(volumeNode);
volumeNode.connect(context.destination);
See this on JSFiddle.
Lengthy chains
Another common type of node is the BiquadFilter. This is a common feature of sound engineering which, through some very impressive mathematical cleverness, provides a lot of control over the audio signal by exposing only a few variables.

This is not necessarily the best place to go into detail, but here's a quick summary of the available filters. Each of them takes a frequency value, and they can optionally take a Q factor or a gain value, depending on the type of filter.
Lowpass
Sounds below the supplied frequency are let through, sounds above are quietened. The higher, the quieter.

Highpass
Sounds above the supplied frequency are let through, sounds below are quietened. The lower, the quieter.

Bandpass
Sounds immediately above and below the supplied frequency are let through. Sounds higher and lower than a certain range (specified by the Q factor) are quieter.

Lowshelf
All sounds are let through; those below the given frequency are made louder.

Highshelf
All sounds are let through; those above the given frequency are made louder.

Peaking
All sounds are let through; those in a range around the given frequency are made louder.

Notch
Opposite of Bandpass. Sounds immediately above and below the supplied frequency are made quieter. Sounds higher and lower than a certain range (specified by the Q factor) are louder.

Allpass
Changes the phase between different frequencies. If you don't know what it is, you probably don't need it.
Connecting these filter nodes is as simple as any other.
filterNode = context.createBiquadFilter();

// Specify this is a lowpass filter
filterNode.type = 'lowpass';

// Quieten sounds over 220Hz
filterNode.frequency.value = 220;

soundSource.connect(volumeNode);
volumeNode.connect(filterNode);
filterNode.connect(context.destination);
See this on JSFiddle.
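The Q factor and gain mentioned above are set in exactly the same way as the frequency. A quick sketch, assuming a bandpass filter (the values are purely illustrative):

var bandpassNode = context.createBiquadFilter();
bandpassNode.type = 'bandpass';

// Centre the band on 440Hz and narrow it with a higher Q
bandpassNode.frequency.value = 440;
bandpassNode.Q.value = 5;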
Done
By now, you should have a working sample of the Web Audio API in front of you. Nice job. We have, however, only scratched the surface of the API. We'll go into that more soon.
Attributions
Hello, Hello, Hello sample from freesound.org.
Speaker symbol by Okan Benn, from thenounproject.com.