In Chrome and Firefox, we can use HTMLMediaElement.captureStream() (mozCaptureStream() in Firefox) to capture the element's output as a MediaStream. But Safari does not support captureStream().
The source of the video tag is HLS data received in segments.
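For reference, a minimal sketch of what we do in Chrome and Firefox (the feature detection here is illustrative):

```ts
// Chrome exposes captureStream(); Firefox only ships the moz-prefixed variant.
const video = document.querySelector('video')!;
const stream: MediaStream =
  'captureStream' in video
    ? (video as any).captureStream()
    : (video as any).mozCaptureStream();
// `stream` mirrors the element's playback and can be recorded or sent over WebRTC.
```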
Is there any other way to capture media data as a MediaStream in Safari?
Safari does not support captureStream(), at least for now, but one way to achieve this is to run ffmpeg in the browser.
In the past, running ffmpeg in a browser was impractical for most tasks because it was too large and too slow, but recent WASM (WebAssembly) builds of ffmpeg have made it much more usable.
You can build your own ffmpeg-based solution on top of ffmpeg.wasm (https://github.com/ffmpegwasm/ffmpeg.wasm), as in the sketch below.
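As a rough sketch of the approach, using the ffmpeg.wasm 0.11-style API (the file names and the segment-fetching step are placeholders you would adapt to your player):

```ts
import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

const ffmpeg = createFFmpeg({ log: true });

async function remuxSegment(segmentUrl: string): Promise<Blob> {
  if (!ffmpeg.isLoaded()) await ffmpeg.load(); // downloads the WASM core

  // Write the fetched TS segment into ffmpeg's in-memory filesystem.
  ffmpeg.FS('writeFile', 'input.ts', await fetchFile(segmentUrl));

  // Remux (stream copy, no re-encode) to MP4.
  await ffmpeg.run('-i', 'input.ts', '-c', 'copy', 'output.mp4');

  const data = ffmpeg.FS('readFile', 'output.mp4');
  return new Blob([data.buffer], { type: 'video/mp4' });
}
```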
Be aware that ffmpeg.wasm needs SharedArrayBuffer support, which is still not universal: https://caniuse.com/sharedarraybuffer
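SharedArrayBuffer is only exposed on cross-origin-isolated pages (served with COOP/COEP headers), so a quick runtime check is worth adding; a minimal sketch:

```ts
// SharedArrayBuffer requires the page to be cross-origin isolated,
// i.e. served with:
//   Cross-Origin-Opener-Policy: same-origin
//   Cross-Origin-Embedder-Policy: require-corp
if (!self.crossOriginIsolated || typeof SharedArrayBuffer === 'undefined') {
  console.warn('SharedArrayBuffer unavailable - multithreaded ffmpeg.wasm will not run');
}
```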
I've found that it works well for many common ffmpeg tasks, but it can struggle with larger video files. I haven't tried HLS-to-file conversion myself, so you'll need to experiment to see whether it meets your needs. There is a demo page where you can test your use case: https://ffmpegwasm.netlify.app/#demo
The exact ffmpeg command you need will depend on the video and audio encodings, but it will most likely be something like:
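A likely invocation, adapted from the linked question below and shown here as arguments to ffmpeg.wasm's run (the file names are placeholders):

```ts
// Assumes the playlist and its segments have already been written
// into ffmpeg's in-memory filesystem.
await ffmpeg.run(
  '-i', 'input.m3u8',
  '-c', 'copy',               // copy streams, no re-encode
  '-bsf:a', 'aac_adtstoasc',  // fix up ADTS AAC audio when going TS -> MP4
  'output.mp4'
);
```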
(See this question and answer for details: Convert HLS (m3u8) to MP4)
There are also some open-source, ready-made solutions you can check out, for example: