Posted by ctoth 10/28/2024
> compilation of the libraries associated with handling audio and video in FFmpeg — libavformat, libavcodec, libavfilter, libavutil and libswresample — for WebAssembly
https://github.com/Yahweasel/libav.js/
EDIT: Looking closer at the above, I don't see image formats, only audio and video. Maybe this gets closer:
> Decodes images using Rust's image crate compiled to WebAssembly.
> It supports decoding PNG, JPEG, WEBP, GIF, BMP, ICO, TGA, and several others.
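For a rough idea of how such a WASM decoder gets used from the page, here is a hedged sketch; the module path and the decode_to_rgba export are assumptions for illustration, not the linked project's actual API.

```js
// Hedged sketch: assumes a wasm-bindgen-style module exposing a hypothetical
// decode_to_rgba(bytes) returning { width, height, data } for any supported format.
import init, { decode_to_rgba } from "./image_decoder.js"; // hypothetical module path

async function drawDecodedImage(url, canvas) {
  await init(); // instantiate the WASM module
  const bytes = new Uint8Array(await (await fetch(url)).arrayBuffer());
  const { width, height, data } = decode_to_rgba(bytes); // hypothetical export
  canvas.width = width;
  canvas.height = height;
  canvas.getContext("2d").putImageData(
    new ImageData(new Uint8ClampedArray(data), width, height), 0, 0);
}
```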
A media library compiled to WASM with its own codecs would not have that limitation: you could have an equivalent of VLC directly in the browser, with immediate playback of any video/audio codec/container supplied in the WASM payload. The major advantage of a browser-based application over a desktop app is immediate playback of media via hyperlink, without the soul-crushing limitations Apple imposes on iOS.
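As a hedged sketch of the playback half of that idea: once a WASM-supplied codec has produced raw interleaved PCM (pcmFloat32 below stands in for that output; the decoder itself is not shown), the Web Audio API can play it immediately with no native codec support.

```js
// Play raw interleaved Float32 PCM produced by a WASM decoder (hypothetical input).
function playDecodedPcm(audioCtx, pcmFloat32, sampleRate, channels) {
  const frames = pcmFloat32.length / channels;
  const buffer = audioCtx.createBuffer(channels, frames, sampleRate);
  for (let ch = 0; ch < channels; ch++) {
    const channelData = new Float32Array(frames);
    for (let i = 0; i < frames; i++) channelData[i] = pcmFloat32[i * channels + ch];
    buffer.copyToChannel(channelData, ch); // de-interleave into the AudioBuffer
  }
  const node = audioCtx.createBufferSource();
  node.buffer = buffer;
  node.connect(audioCtx.destination);
  node.start();
}
```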
I am trying to synchronize multiple video files that have uneven framerates. I have separate metadata containing the frame timing.
The video files are .ts (MPEG transport stream) files.
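For reference, the kind of approach I'm considering (an untested sketch; timingOffsetSeconds is a stand-in for whatever my frame-timing metadata yields for each file):

```js
// Keep a follower video loosely locked to a leader by nudging playbackRate.
function syncVideos(leader, follower, timingOffsetSeconds = 0) {
  const tick = () => {
    const target = leader.currentTime + timingOffsetSeconds;
    const drift = target - follower.currentTime;
    if (Math.abs(drift) > 0.25) {
      follower.currentTime = target; // hard seek on large drift
      follower.playbackRate = 1;
    } else {
      // gentle correction: speed up or slow down slightly to close small drift
      follower.playbackRate = 1 + Math.max(-0.05, Math.min(0.05, drift));
    }
    requestAnimationFrame(tick);
  };
  requestAnimationFrame(tick);
}
```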
I suspect there exist filters in the browser APIs to adjust playback timing and rate. I have not attempted to modify media that way myself, but I did see API filters to alter stereo panning, tonal quality, wave shaping, biquad filtering, delay, and more.
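Those are Web Audio nodes. A hedged sketch of wiring the ones named above onto a media element, with the timing/rate adjustment itself done on the element rather than in the graph:

```js
const ctx = new AudioContext(); // may need a user gesture to start, especially on iOS
const video = document.querySelector("video");

const source = ctx.createMediaElementSource(video); // tap the element's audio
const panner = ctx.createStereoPanner();            // stereo panning
const biquad = ctx.createBiquadFilter();            // tonal quality (e.g. lowpass)
const delay  = ctx.createDelay(1.0);                // up to 1 s of delay

panner.pan.value = -0.3;
biquad.type = "lowpass";
biquad.frequency.value = 4000;
delay.delayTime.value = 0.05;

source.connect(panner).connect(biquad).connect(delay).connect(ctx.destination);

// Playback rate is adjusted on the element, not in the audio graph:
video.playbackRate = 0.98;
```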
The only time I had to resort to those canvas/WASM hacks was to force autoplay video for advertising, against all users' best interests.
I just wanted to listen to music I already have, played from my phone, while I do the dishes.
In my case I just wanted a media player that could play MP3s and feature a playlist dynamically populated from either the local device file system or, ideally, a shared network location. Restricted. There were some media apps that kind of, in a really shitty way, figured this out. Most of those apps, even the shitty ones, cost money in the App Store.
Fuck that stupidity.
I wrote my own solution in JavaScript that runs in the browser, and it does almost everything I want. The playlist data must be statically written into the page and cannot be dynamically populated from any file system, because Apple disables that part of the browser's FileSystem API on iOS, even though it works correctly in desktop Safari. I host the page from a shared network location so that I can access the same content the same way on any device with access to the network. I adapted this home-grown media player to play video as well, but it can only handle codecs/containers the browser already supports.
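The core of that kind of player is small. A hedged sketch of the static-playlist approach (the track URLs are placeholders, not my actual files):

```js
// Static playlist baked into the page; one <audio> element walks through it.
const playlist = [
  "music/track-01.mp3",
  "music/track-02.mp3",
  "music/track-03.mp3",
];

const audio = new Audio();
let index = 0;

function playTrack(i) {
  index = i % playlist.length;
  audio.src = playlist[index];
  audio.play();
}

audio.addEventListener("ended", () => playTrack(index + 1)); // advance automatically
playTrack(0);
```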
... OK, I get your point. We are all clowns.
Edit: I should note that our decoder was developed in-house and I have nothing to do with the project OP linked to.
Opus is a descendant of CELT (which never caught on, for various reasons), an open, patent-unencumbered, low-latency lossy codec suitable for VoIP and live streaming. It's relatively recent, but those use cases have also only recently become popular in the browser.
And to go even more extreme, some applications really care about bit-exactness in their decoders, which you don't get by relying on other vendors.
https://github.com/livebassmusicrightnow/even-nicercast/blob...
https://github.com/TooTallNate/node-icy
WASM would be total overkill. The MPEG stream plays natively in the browser; that's how I served the client when I was running livebassmusicrightnow.
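A hedged sketch of what that amounts to (the stream URL is a placeholder, not the actual livebassmusicrightnow endpoint):

```js
// An MP3/Icecast-style stream plays natively in the element, no WASM needed.
const audio = new Audio("https://example.com/stream.mp3");
audio.play().catch(() => {
  // Autoplay policies usually require a user gesture first.
  document.addEventListener("click", () => audio.play(), { once: true });
});
```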