vgmplay | VGM file command-line player and Winamp plugin | Runtime Environment library
kandi X-RAY | vgmplay Summary
The official and always up-to-date player for all VGM files. In the future, the existing VGMPlay will be replaced by libvgm, which is currently in development.
Community Discussions
QUESTION
Since the old Web Audio ScriptProcessorNode was deprecated in 2014 and AudioWorklets arrived in Chrome 64, I decided to give them a try. However, I'm having difficulties porting my application. I'll give two examples from a nice article to illustrate my point.
First, the ScriptProcessorNode way:
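The question's original code was not captured here; as a stand-in, this is a minimal sketch of the deprecated ScriptProcessorNode pattern described (a noise generator, matching the AudioWorklet port linked in the answer below; the buffer size and structure are illustrative, not the article's exact code):

```javascript
// A minimal sketch of the deprecated ScriptProcessorNode approach
// (illustrative only; not the article's exact code).
const ctx = new AudioContext();
const node = ctx.createScriptProcessor(4096, 0, 1); // 0 inputs, 1 output channel
node.onaudioprocess = (event) => {
  const out = event.outputBuffer.getChannelData(0);
  for (let i = 0; i < out.length; i++) {
    out[i] = Math.random() * 2 - 1; // white noise
  }
};
node.connect(ctx.destination);
```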
...
ANSWER
Answered 2018-Feb-21 at 20:22
So my question basically is how to do the above example in an AudioWorklet,
For your first example, there is already an AudioWorklet version for it: https://github.com/GoogleChromeLabs/web-audio-samples/blob/gh-pages/audio-worklet/basic/js/noise-generator.js
I do not recommend the second example (aka buffer stitching), because it creates lots of source nodes and buffers, and thus can trigger GC that will interfere with other tasks in the main thread. Also, a discontinuity can happen at the boundary of two consecutive buffers if the scheduled start time does not fall exactly on a sample. That said, you won't be able to hear glitches in this specific example because the source material is noise.
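For reference, buffer stitching is the pattern where each generated chunk becomes its own AudioBufferSourceNode scheduled back to back; a hypothetical sketch (names are illustrative) makes the boundary problem visible:

```javascript
// Hypothetical buffer-stitching sketch (the pattern discouraged above).
const ctx = new AudioContext();
let playTime = ctx.currentTime;
function enqueueChunk(samples) { // samples: Float32Array of generated audio
  const buf = ctx.createBuffer(1, samples.length, ctx.sampleRate);
  buf.copyToChannel(samples, 0);
  const src = ctx.createBufferSource();
  src.buffer = buf;
  src.connect(ctx.destination);
  src.start(playTime); // if playTime does not land exactly on a sample, a click can occur
  playTime += buf.duration;
}
```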
when the data is generated continuously in the main thread in some array and the playback of that data is happening in the Web Audio thread.
The first thing you should do is separate the audio generator from the main thread. The audio generator must run on the AudioWorkletGlobalScope. That's the whole purpose of the AudioWorklet system: lower latency and better audio rendering performance.
In your code, VGMPlay_WebAudio.generateBuffer() should be called in the AudioWorkletProcessor.process() callback to fill the output buffer of the processor. That roughly matches what your onaudioprocess callback does.
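A minimal sketch of that shape, assuming a VGMPlay_WebAudio class that can be loaded into the worklet scope and a generateBuffer() that fills one render quantum (both are guesses from the question, not a confirmed API):

```javascript
// vgm-processor.js — runs on the AudioWorkletGlobalScope, off the main thread.
class VGMProcessor extends AudioWorkletProcessor {
  constructor() {
    super();
    // Hypothetical: the generator is constructed inside the worklet scope.
    this.vgm = new VGMPlay_WebAudio();
  }
  process(inputs, outputs, parameters) {
    const output = outputs[0]; // typically 128 frames per channel
    for (let ch = 0; ch < output.length; ch++) {
      // Assumed signature: fills the given Float32Array with the next samples.
      this.vgm.generateBuffer(output[ch]);
    }
    return true; // keep the processor alive
  }
}
registerProcessor('vgm-processor', VGMProcessor);
```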
I've been reading about the MessagePort thing, but I'm not sure that's the way to go either. The examples don't point me in that direction, I'd say. What I might need is the proper way to provide the process function in my AudioWorkletProcessor-derived class with my own data.
I don't think your use case requires MessagePort. I've seen other methods in the code, but they really don't do much other than starting and stopping the node. That can be done by connecting/disconnecting the AudioWorkletNode in the main thread. No cross-thread messaging is necessary.
The code example at the end can serve as the setup for the AudioWorklet. I am well aware that the separation between the setup and the actual audio generation can be tricky, but it will be worth it.
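A sketch of that setup on the main thread (file and node names match the processor sketch above and are assumptions):

```javascript
// main.js — main-thread setup; start/stop via connect/disconnect, no MessagePort.
const ctx = new AudioContext();
ctx.audioWorklet.addModule('vgm-processor.js').then(() => {
  const node = new AudioWorkletNode(ctx, 'vgm-processor');
  node.connect(ctx.destination); // "start"
  // node.disconnect();          // "stop" — no cross-thread messaging needed
});
```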
A few questions for you:
- How does the game graphics engine send messages to the VGM generator?
- Can the VGMPlay class live on the worker thread without any interaction with the main thread? I don't see any interaction in the code except for starting and stopping.
- Is XMLHttpRequest essential to the VGMPlay class? Or can that be done somewhere else?
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported