
Web Audio API

Getting started

The Web Audio API is one of two new audio APIs (the other being the Audio Data API) designed to make creating, processing and controlling audio within web applications much simpler. The two APIs aren't exactly competing, as the Audio Data API allows more low-level access to audio data, although there is some overlap.
At the moment, the Web Audio API is a WebKit-only technology, while the Audio Data API is a Mozilla thing. It was recently announced that iOS 6 will have Web Audio API support, however, so there's mobile support on the way.
In this page, we will start at the very beginning and work through the basic concepts until we have a working example.

Audio routing graphs

The Web Audio API is an extremely powerful tool for controlling audio in the browser. It is based around the concept of audio routes, which are a common tool in sound engineering. This is a simple but powerful way of representing the connections between a sound source and a destination, in this case your speakers. Between these two end points, you can connect any number of nodes, which take the audio data passed in, manipulate it in some way and output it to whichever nodes are connected next in the chain.

There can be only one!

An AudioContext, that is. Unlike canvases and canvas contexts, there can be only one AudioContext per page. This doesn't prove to be a limitation, as you can easily create multiple, completely separate audio graphs within the context. Essentially, the context object acts as a holder for the API calls and provides the abstraction required to keep the process simple.
Even though this is only supported in WebKit at the moment, this snippet will ensure we're prepared for future developments.

var context;

if (typeof AudioContext !== "undefined") {
    context = new AudioContext();
} else if (typeof webkitAudioContext !== "undefined") {
    context = new webkitAudioContext();
} else {
    throw new Error('AudioContext not supported. :(');
}

Create a sound source

Unlike working with audio elements, you can't simply set the source and have it load. Most often, you will load the audio file with an XMLHttpRequest and an asynchronous callback.

var request = new XMLHttpRequest();
request.open("GET", audioFileUrl, true);
request.responseType = "arraybuffer";

// Our asynchronous callback
request.onload = function () {
    var audioData = request.response;
    createSoundSource(audioData);
};

request.send();

The AudioContext provides useful methods to simplify downloading remote resources via stream buffers. Use the received audioData to create the full sound source. We'll look at the makemono parameter later.

// Create a sound source
soundSource = context.createBufferSource();

// The AudioContext handles creating source
// buffers from raw binary data
context.decodeAudioData(audioData, function (soundBuffer) {
    // Add the buffered data to our object
    soundSource.buffer = soundBuffer;
});

See this on JSFiddle.
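The request and decode steps above can be folded into a single helper. This is just a sketch: loadSound and its onLoaded callback are names invented here for illustration, not part of the API.

```javascript
// Sketch: wrap the XHR + decodeAudioData steps in one function.
// "loadSound" and "onLoaded" are hypothetical names, not API methods.
function loadSound(context, url, onLoaded) {
    var request = new XMLHttpRequest();
    request.open("GET", url, true);
    request.responseType = "arraybuffer";
    request.onload = function () {
        context.decodeAudioData(request.response, function (soundBuffer) {
            // Build the buffer source once the data is decoded
            var soundSource = context.createBufferSource();
            soundSource.buffer = soundBuffer;
            onLoaded(soundSource);
        });
    };
    request.send();
}
```

You would then call something like loadSound(context, audioFileUrl, function (source) { ... }) and do the connecting inside the callback, since the decode is asynchronous.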

Connect the source to the destination

This is where we start to create our audio routing graphs. We have our sound source, and the AudioContext has its destination which, in most cases, will be your speakers or headphones. We now want to connect one to the other. This is essentially nothing more than taking the cable from the electric guitar and plugging it into the amp. The code to do this is even simpler.

soundSource.connect(context.destination);

That's it. Assuming you're using the same variable names as above, that's all you need to write, and suddenly your sound source is coming out of the computer. Neat.
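One thing worth noting: connecting a buffer source doesn't make it play by itself; the source also has to be started. In current implementations that's start(), while older WebKit builds used noteOn() instead. A feature check in the same spirit as the AudioContext one earlier covers both (this snippet is a sketch added here, not from the original article):

```javascript
// Sketch: start playback on a buffer source, covering both the
// current start() method and the older WebKit noteOn() method.
function playSource(soundSource) {
    if (typeof soundSource.start === "function") {
        soundSource.start(0);  // play immediately
    } else {
        soundSource.noteOn(0); // legacy WebKit name
    }
}
```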

Create a node

Of course, if it were simply connecting a sound to a speaker, we wouldn't have any control over it at all. Along the way between start and end, we can create and insert nodes into the chain. There are many different kinds of nodes. Each node either creates or receives an audio signal, processes the data in some way and outputs the new signal. The most basic is a GainNode, used for volume.

// Create a volume (gain) node
volumeNode = context.createGain();

// Set the volume
volumeNode.gain.value = 0.1;
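Note that gain.value is a linear amplitude multiplier, not decibels: 1 is unchanged, 0.1 is much quieter. If you prefer to think in dB, a small conversion helper does the translation (the dbToGain function below is invented here for illustration; it is not part of the API):

```javascript
// Convert decibels to the linear gain value the API expects.
// (Helper invented for illustration; not part of the Web Audio API.)
function dbToGain(db) {
    return Math.pow(10, db / 20);
}
```

For example, dbToGain(0) is 1 (no change) and dbToGain(-20) is 0.1, the same attenuation as the snippet above.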

Chain everything together

We can now put our Gain in the chain by connecting the sound source to the Gain, then connecting the Gain to the destination.

soundSource.connect(volumeNode);
volumeNode.connect(context.destination);

See this on JSFiddle.

Lengthy chains

Another common type of node is the BiquadFilter. This is a common feature of sound engineering which, through some very impressive mathematical cleverness, provides a lot of control over the audio signal by exposing only a few variables.
This is not necessarily the best place to go into detail, but here's a quick summary of the available filters. Each of them takes a frequency value, and they can optionally take a Q factor or a gain value, depending on the type of filter.

Lowpass
Sounds below the supplied frequency are let through; sounds above are quietened. The higher, the quieter.

Highpass
Sounds above the supplied frequency are let through; sounds below are quietened. The lower, the quieter.

Bandpass
Sounds immediately above and below the supplied frequency are let through. Sounds higher and lower than a certain range (specified by the Q factor) are quieter.

Lowshelf
All sounds are let through; those below the given frequency are made louder.

Highshelf
All sounds are let through; those above the given frequency are made louder.

Peaking
All sounds are let through; those in a range around the given frequency (specified by the Q factor) are made louder.

Notch
Opposite of Bandpass. Sounds in a range around the supplied frequency (specified by the Q factor) are quietened; sounds outside that range are let through.

Allpass
Changes the phase relationship between different frequencies. If you don't know what it is, you probably don't need it.

Connecting these filter nodes is as simple as any other.

filterNode = context.createBiquadFilter();

// Specify this is a lowpass filter
filterNode.type = "lowpass";

// Quieten sounds over 220Hz
filterNode.frequency.value = 220;

soundSource.connect(volumeNode);
volumeNode.connect(filterNode);
filterNode.connect(context.destination);

See this on JSFiddle.
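To see the Q factor and gain value from the filter list above in action, here's a sketch of a peaking filter. The frequency, Q and gain properties are the real AudioParams on a BiquadFilterNode; the specific numbers and the makePeakingFilter name are just examples invented here.

```javascript
// Sketch: a peaking filter that boosts a band around 1 kHz.
// "makePeakingFilter" is a name invented for this example.
function makePeakingFilter(context) {
    var filter = context.createBiquadFilter();
    filter.type = "peaking";       // boost/cut a band around `frequency`
    filter.frequency.value = 1000; // centre of the band, in Hz
    filter.Q.value = 2;            // higher Q means a narrower band
    filter.gain.value = 6;         // boost by 6 dB (negative would cut)
    return filter;
}
```

Drop it into the chain exactly like the lowpass example: soundSource into the filter, filter into the destination (or into the volume node first).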

Done

By now, you should have a working sample of the Web Audio API in front of you. Nice job. We have, however, only scratched the surface of the API. We'll go into that more soon.

Attributions

Hello, Hello, Hello sample from freesound.org.
Speaker symbol by Okan Benn, from thenounproject.com.
