
Minimal latency is only really needed for live performance and monitoring, though those tend to be crucial requirements in most cases. Another major problem for browsers is poor support for multichannel devices.

They do plan to have a "native wrapper like tauri" in the future. I've played around with node-web-audio-api for low-latency multichannel in Electron, but it wasn't a great success, mostly because Rust audio backends (and almost all audio backends in general) don't handle that kind of usage very well.

https://github.com/ircam-ismm/node-web-audio-api
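
For reference, the package mirrors the browser's Web Audio surface, so asking the backend for a small buffer and opening all available output channels looks roughly like this. This is a sketch, assuming the current release implements the standard AudioContextOptions (latencyHint, sampleRate) and the destination node's maxChannelCount; how well either is honoured depends on the underlying Rust backend.

```typescript
// Sketch only: assumes node-web-audio-api follows the Web Audio spec here.
import { AudioContext } from 'node-web-audio-api';

// Ask the backend for the smallest buffer it is willing to give us.
const ctx = new AudioContext({ latencyHint: 'interactive', sampleRate: 48000 });

// Multichannel: try to open as many output channels as the device reports.
const dest = ctx.destination;
dest.channelCount = dest.maxChannelCount;   // e.g. 8 on a typical interface
dest.channelInterpretation = 'discrete';    // don't up/down-mix between channels

// baseLatency is the graph's processing latency; outputLatency adds the
// device/driver side. Both are in seconds (if the implementation exposes them).
console.log('base latency (s):', ctx.baseLatency);
console.log('output latency (s):', ctx.outputLatency);

// Quick smoke test: a short beep on the default output.
const osc = ctx.createOscillator();
osc.frequency.value = 440;
osc.connect(dest);
osc.start();
osc.stop(ctx.currentTime + 0.25);
```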



If you're just picking up samples from Splice or whatever and arranging them, sure, latency means nothing, but it becomes pretty crucial when you're recording an instrument.


Why would recording be the latency-sensitive part? Wouldn't it be the playback?


Delays can be compensated for during either recording or playback; the problem is when you need both at the same time. It just so happens that recording is typically done "in the loop", with the backing track being played through headphones during the session.
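
To illustrate the recording-side compensation: if the total round-trip latency is known, the recorded take can simply be shifted back by that many samples so it lines up with the backing track. This fixes the file's alignment but does nothing for the delay the musician hears while playing. A sketch with made-up numbers, not any particular DAW's algorithm:

```typescript
// Align a freshly recorded take with the backing track by removing the
// known round-trip latency. Values here are illustrative only.
function compensateRecording(
  recorded: Float32Array,       // raw samples as captured from the input
  roundTripLatencySec: number,  // measured input + output latency
  sampleRate: number,
): Float32Array {
  const offset = Math.round(roundTripLatencySec * sampleRate);
  // Drop the first `offset` samples: what the musician heard (and played to)
  // reached the file that many samples late.
  return recorded.subarray(Math.min(offset, recorded.length));
}

// e.g. a 12 ms round trip at 48 kHz means discarding ~576 samples.
const aligned = compensateRecording(new Float32Array(48000), 0.012, 48000);
console.log(aligned.length); // 48000 - 576 = 47424
```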


I'm guessing they mean the monitoring latency while recording. For me, that's one of the most vital parts of a recording environment. If I hit a key or a drum and hear it come through the headphones 50 ms later, it's completely unusable.
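
For context on where numbers like 50 ms come from: software monitoring pays for both the input and the output buffer, so buffer size largely decides whether the feel is playable. A back-of-the-envelope sketch, ignoring converter and driver overhead:

```typescript
// Rough per-direction buffer latency in milliseconds.
function bufferLatencyMs(bufferSizeFrames: number, sampleRate: number): number {
  return (bufferSizeFrames / sampleRate) * 1000;
}

// Software monitoring pays for both directions (input + output buffer).
for (const frames of [64, 256, 1024, 2048]) {
  const roundTrip = 2 * bufferLatencyMs(frames, 48000);
  console.log(`${frames} frames @ 48 kHz: ~${roundTrip.toFixed(1)} ms round trip`);
}
// 64   -> ~2.7 ms   (fine)
// 256  -> ~10.7 ms  (borderline but workable)
// 1024 -> ~42.7 ms  (already distracting)
// 2048 -> ~85.3 ms  (unplayable, in the ballpark the parent describes)
```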


Not if you're multitrack recording. If you record tracks with different latencies, you're toast.


Not if you know the latencies.



