How to play audio files synchronously in JavaScript?
I am working on a program to convert text into Morse code audio.
Say I type in sos. My program will turn this into the array [1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], where s = dot dot dot (or 1,1,1), and o = dash dash dash (or 2,2,2). This part is quite easy.
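A minimal sketch of how that conversion might look (the lookup table here is hypothetical and shortened to just s and o):
  // Hypothetical sketch of the text-to-array step (table shortened to s and o).
  // 1 = dot, 2 = dash, 0 = pause between letters.
  const MORSE = { s: [1, 1, 1], o: [2, 2, 2] };

  function textToMorseArr(text) {
    return text
      .toLowerCase()
      .split('')
      .flatMap((ch, i) => (i === 0 ? MORSE[ch] : [0, ...MORSE[ch]]));
  }

  console.log(textToMorseArr('sos')); // [1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1]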
Next, I have 2 sound files:
var dot = new Audio('dot.mp3');
var dash = new Audio('dash.mp3');
My goal is to have a function that will play dot.mp3 when it sees a 1, play dash.mp3 when it sees a 2, and pause when it sees a 0.
The following sort of/ kind of/ sometimes works, but I think it's fundamentally flawed and I don't know how to fix it.
function playMorseArr(morseArr) {
  for (let i = 0; i < morseArr.length; i++) {
    setTimeout(function() {
      if (morseArr[i] === 1) {
        dot.play();
      }
      if (morseArr[i] === 2) {
        dash.play();
      }
    }, 250 * i);
  }
}
The problem:
I can loop over the array and play the sound files, but timing is a challenge. If I don't set the setTimeout() interval just right — if the 250ms has elapsed but the last audio file is not done playing — the next element in the array gets skipped. And dash.mp3 is longer than dot.mp3, so if my timing is too short I might hear [dot dot dot pause dash dash pause dot dot dot], or something to that effect.
The effect I want
I want the program to go like this (in pseudocode):
- look at the ith array element
- if 1 or 2, start playing the corresponding sound file, or else create a pause
- wait for the sound file or pause to finish
- increment i and go back to step 1
What I have thought of, but don't know how to implement
So the pickle is that I want the loop to proceed synchronously. I've used promises in situations where I had several functions that I wanted executed in a specific order, but how would I chain an unknown number of functions?
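For context, the kind of fixed-order chaining I mean looks roughly like this (the step functions are hypothetical placeholders, each returning a Promise):
  // Hypothetical example of chaining a *known* number of promise-returning steps.
  stepOne()
    .then(() => stepTwo())
    .then(() => stepThree())
    .catch(console.error);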
I also considered using custom events, but I have the same problem.
Tags: javascript, html5-audio, synchronous, playback
3 Answers
Audio elements have an ended event that you can listen for, so you can await a Promise that resolves when that event fires:
const audios = [undefined, dot, dash];

async function playMorseArr(morseArr) {
  for (let i = 0; i < morseArr.length; i++) {
    const item = morseArr[i];
    await new Promise((resolve) => {
      if (item === 0) {
        // insert desired number of milliseconds to pause here
        setTimeout(resolve, 250);
      } else {
        audios[item].onended = resolve;
        audios[item].play();
      }
    });
  }
}
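Since playMorseArr is declared async, it returns a Promise, so a caller can wait for the whole sequence to finish — a small usage sketch:
  // Usage sketch: play SOS, then log once the whole sequence has finished.
  playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1])
    .then(() => console.log('finished playing'));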
What is the purpose of undefined at audios? – guest271314
Just a placeholder, since OP's morseArr has 1 corresponding to the dot audio, and 2 corresponding to the dash audio (but no audio corresponding to 0). Could've also done const item = morseArr[i] - 1 and had only two elements in the audios array. – CertainPerformance
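A minimal sketch of the variant described in that comment, assuming the same dot and dash Audio objects — shifting the index by one removes the need for the undefined placeholder:
  // Variant from the comment: two-element array, index shifted by 1.
  const audios2 = [dot, dash];

  async function playMorseArrShifted(morseArr) {
    for (const code of morseArr) {
      await new Promise((resolve) => {
        if (code === 0) {
          setTimeout(resolve, 250); // pause
        } else {
          audios2[code - 1].onended = resolve;
          audios2[code - 1].play();
        }
      });
    }
  }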
I would use a recursive approach that listens for the audio ended event: every time the currently playing audio stops, the method is called again to play the next one.
function playMorseArr(morseArr, idx)
{
  // Finish condition.
  if (idx >= morseArr.length)
    return;

  let next = function() { playMorseArr(morseArr, idx + 1); };

  if (morseArr[idx] === 1) {
    dot.onended = next;
    dot.play();
  }
  else if (morseArr[idx] === 2) {
    dash.onended = next;
    dash.play();
  }
  else {
    setTimeout(next, 250);
  }
}
You can start the procedure by calling playMorseArr() with the array and the start index:
playMorseArr([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0);
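An optional extension, not part of the original answer: threading a completion callback through the recursion lets the caller know when the whole sequence has finished — a sketch built on the same dot and dash objects:
  // Sketch: same recursive idea, with a default index and an onDone callback.
  function playMorseArrWithDone(morseArr, idx = 0, onDone = () => {}) {
    if (idx >= morseArr.length) {
      onDone(); // whole sequence finished
      return;
    }
    const next = () => playMorseArrWithDone(morseArr, idx + 1, onDone);
    if (morseArr[idx] === 1) {
      dot.onended = next;
      dot.play();
    } else if (morseArr[idx] === 2) {
      dash.onended = next;
      dash.play();
    } else {
      setTimeout(next, 250); // pause for a 0
    }
  }

  playMorseArrWithDone([1, 1, 1, 0, 2, 2, 2, 0, 1, 1, 1], 0, () => console.log('done'));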
add a comment |
Do not use HTMLAudioElement for that kind of application.
HTMLMediaElements are asynchronous by nature: everything from the play() method to pause(), through the obvious resource fetching and the less obvious currentTime setting, happens asynchronously.
This means that for applications that need precise timing (like a Morse-code reader), these elements are simply unreliable.
Instead, use the Web Audio API and its AudioBufferSourceNode objects, which you can control with µs precision.
First fetch all your resources as ArrayBuffers, then, when needed, generate and play AudioBufferSourceNodes from these ArrayBuffers.
You'll be able to start playing these synchronously, or to schedule them with higher precision than setTimeout will offer you (AudioContext uses its own clock).
Worried about the memory impact of having several AudioBufferSourceNodes playing your samples? Don't be. The data is stored only once in memory, in the AudioBuffer; AudioBufferSourceNodes are just views over this data and take up almost no space.
// I use a lib for Morse encoding, haven't tested it much though
// https://github.com/Syncthetic/MorseCode/
const morse = Object.create(MorseCode);
const ctx = new AudioContext();

(async function initMorseData() {
  // our AudioBuffer objects
  const [short, long] = await fetchBuffers();
  btn.onclick = e => {
    let time = 0; // a simple time counter
    const sequence = morse.encode(inp.value);
    console.log(sequence); // dots and dashes
    sequence.split('').forEach(type => {
      if (type === ' ') { // space => 0.5s of silence
        time += 0.5;
        return;
      }
      // create an AudioBufferSourceNode
      let source = ctx.createBufferSource();
      // assign the correct AudioBuffer to it
      source.buffer = type === '-' ? long : short;
      // connect to our output audio
      source.connect(ctx.destination);
      // schedule it to start at the end of the previous one
      source.start(ctx.currentTime + time);
      // increment our timer with our sample's duration
      time += source.buffer.duration;
    });
  };
  // ready to go
  btn.disabled = false;
})()
.catch(console.error);

function fetchBuffers() {
  return Promise.all(
    [
      'https://dl.dropboxusercontent.com/s/1cdwpm3gca9mlo0/kick.mp3',
      'https://dl.dropboxusercontent.com/s/h2j6vm17r07jf03/snare.mp3'
    ].map(url => fetch(url)
      .then(r => r.arrayBuffer())
      .then(buf => ctx.decodeAudioData(buf))
    )
  );
}
<script src="https://cdn.jsdelivr.net/gh/Syncthetic/MorseCode@master/morsecode.js"></script>
<input type="text" id="inp" value="sos"><button id="btn" disabled>play</button>
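One practical caveat (my addition, not from the answer): in some browsers an AudioContext starts in a suspended state until a user gesture, so resuming it inside the click handler can be necessary — a sketch assuming the same ctx and btn as above:
  // Autoplay policies may keep the AudioContext suspended until a user gesture;
  // resuming it inside the click handler is a common workaround.
  btn.addEventListener('click', () => {
    if (ctx.state === 'suspended') {
      ctx.resume();
    }
  });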