Transcoding and Dash support #440
That would be absolutely fantastic! I was thinking of implementing a node-js runner that uses idle system resources to convert files in the background and make them available to Streama when ready; otherwise they are queued. But immediate transcoding would be even better. It's just not something that I know very much about. |
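A minimal sketch of what such a background runner could look like, assuming a Node/TypeScript process with ffmpeg installed on the host; the queue shape, file paths, and H.264/AAC target settings are illustrative assumptions, not existing Streama code:

```typescript
import { spawn } from "child_process";
import { once } from "events";

// Hypothetical job: convert one source file into a browser-friendly MP4.
interface TranscodeJob {
  input: string;   // e.g. "/media/raw/movie.mkv"
  output: string;  // e.g. "/media/ready/movie.mp4"
}

const queue: TranscodeJob[] = [];
let busy = false;

export function enqueue(job: TranscodeJob): void {
  queue.push(job);
  void processNext();
}

async function processNext(): Promise<void> {
  if (busy) return;               // only one conversion at a time
  const job = queue.shift();
  if (!job) return;
  busy = true;
  try {
    // Run ffmpeg at low priority so it mostly soaks up idle CPU.
    const ffmpeg = spawn("nice", [
      "-n", "19",
      "ffmpeg",
      "-i", job.input,
      "-c:v", "libx264",          // H.264 video for broad browser support
      "-preset", "slow",
      "-c:a", "aac",
      "-movflags", "+faststart",  // allow playback to start before the file is fully read
      job.output,
    ], { stdio: "inherit" });
    const [code] = await once(ffmpeg, "close");
    if (code === 0) {
      console.log(`ready: ${job.output}`); // here Streama could be notified
    } else {
      console.error(`ffmpeg exited with code ${code} for ${job.input}`);
    }
  } finally {
    busy = false;
    void processNext();           // pick up the next queued file
  }
}
```

The `nice -n 19` wrapper is one way to approximate "uses idle system resources": the OS will prefer other work over the encode.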
On-the-fly transcoding is something I've been thinking about for a while, but have been working on other issues. My plan was to create a wrapper around FFMPEG to handle the transcoding. When the video player requests the file, return the transcode. |
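As a hedged illustration of "return the transcode when the player requests the file", here is a rough Node/TypeScript sketch that pipes ffmpeg output straight into the HTTP response; the path, port, and codec choices are assumptions, and a real implementation would still need to handle seeking (for example via HLS/DASH or byte ranges):

```typescript
import { createServer } from "http";
import { spawn } from "child_process";

// Very rough sketch: transcode on request and stream the result as it is produced.
createServer((req, res) => {
  const source = "/media/example.mkv"; // hypothetical path; would come from the request in practice

  res.writeHead(200, { "Content-Type": "video/mp4" });

  const ffmpeg = spawn("ffmpeg", [
    "-i", source,
    "-c:v", "libx264",
    "-c:a", "aac",
    "-f", "mp4",
    "-movflags", "frag_keyframe+empty_moov", // fragmented MP4 so the output is pipeable
    "pipe:1",                                // write to stdout instead of a file
  ]);

  ffmpeg.stdout.pipe(res);                   // stream transcoded bytes to the player
  ffmpeg.stderr.on("data", () => { /* ffmpeg logs progress on stderr; ignored here */ });

  req.on("close", () => ffmpeg.kill("SIGKILL")); // stop transcoding if the client disconnects
}).listen(8080);
```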
@Jeronimo95 |
@dularion what are the possibilities for technical integration? Another option would be to have 100% pure Java code, relying on ffmpeg or gstreamer under the hood for transcoding. Let me know what would work best. |
We could build the transcoder to do both pre-transcoding and live-transcoding, then make it selectable for the user, by default live-transcoding the files so users don't have to think about it on a new install. We could have options like "Live Transcoding", "Transcode on upload" and "No Transcoding".
Then the user would have the power to select what they wanted. For something like a small VPS, use "Transcode on upload" or "No Transcoding". For a home server with a powerful CPU, use "Live Transcoding". If you take a look at the processes when running a Plex transcode, it has options that look similar to FFMPEG's; I wouldn't be surprised if they were using it as a base.
From this we can sort of tell that it passes the video/segments/progress/etc. back to Plex via http://127.0.0.1:32400/video/:/transcode/. As for system requirements, it depends on how many streams the user wants to do at once. A Core 2 Duo 2.0 GHz should be able to do a single 720p transcode; I previously had a home server running a
We should make it as easy for the end-user as possible; I think a solution that is contained within Streama would be best and make it the most flexible. |
Will this transcoder support remuxing (re-multiplexing) video? There's no need to actually transcode a file that is already H.264; sometimes an MKV container just needs to be changed over to MP4. As far as I understand it, this takes way less time and CPU power, changing the file type without reducing the video quality. |
Yeah, will definitely do that. If it's just the container that needs to be changed, there's no sense in completely decoding and re-encoding the data. |
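For that container-only case, ffmpeg can copy the streams without re-encoding; a small sketch, assuming the MKV already holds H.264 video and MP4-compatible audio:

```typescript
import { spawn } from "child_process";

// Remux MKV -> MP4 without touching the video/audio data: "-c copy" copies the
// streams as-is, so this is fast and does not reduce quality.
const remux = spawn("ffmpeg", [
  "-i", "input.mkv",        // hypothetical input
  "-c", "copy",             // no decode/encode, just rewrap
  "-movflags", "+faststart",
  "output.mp4",
]);

remux.on("close", (code) => {
  console.log(code === 0 ? "remux finished" : `ffmpeg failed with code ${code}`);
});
```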
Please note that if we want to support adaptive streaming (DASH/HLS) to improve users' quality of experience, it is necessary to down-scale the videos to generate low-def chunks. This requires a transcoding pass.
|
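To make the down-scaling point concrete, here is a hedged sketch of producing a small HLS ladder (1080p/720p/480p) with a single ffmpeg invocation; the resolutions, bitrates, and segment length are illustrative, not a recommendation:

```typescript
import { spawn } from "child_process";

// One ffmpeg run that scales the source into three renditions and writes an HLS
// playlist per rendition plus a master playlist. All values are illustrative.
const hls = spawn("ffmpeg", [
  "-i", "input.mkv",
  // split and scale the video into three sizes
  "-filter_complex",
  "[0:v]split=3[v1][v2][v3];" +
  "[v1]scale=w=1920:h=1080[v1out];" +
  "[v2]scale=w=1280:h=720[v2out];" +
  "[v3]scale=w=854:h=480[v3out]",
  // encode each rendition
  "-map", "[v1out]", "-c:v:0", "libx264", "-b:v:0", "5000k",
  "-map", "[v2out]", "-c:v:1", "libx264", "-b:v:1", "2800k",
  "-map", "[v3out]", "-c:v:2", "libx264", "-b:v:2", "1400k",
  // one audio track per rendition
  "-map", "a:0", "-map", "a:0", "-map", "a:0", "-c:a", "aac",
  // HLS muxer settings
  "-f", "hls",
  "-hls_time", "6",
  "-hls_playlist_type", "vod",
  "-master_pl_name", "master.m3u8",
  "-var_stream_map", "v:0,a:0 v:1,a:1 v:2,a:2",
  "stream_%v.m3u8",
]);

hls.on("close", (code) => console.log(`ffmpeg exited with ${code}`));
```

A DASH packaging pass would look very similar, just using ffmpeg's dash muxer and an .mpd manifest instead of .m3u8 playlists.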
Also, a heavily requested feature lacking from Plex and other platforms is proper ordered-chapters support. This allows TV series that ship with the same intro to save space by only storing the intro/ending sequence once and playing it at a set time in the other MKVs. It also allows easily auto-skipping intros, or playing only the intros. |
anything new in transcoding support? |
Nothing really "new", but I really want to focus on figuring it out. I think what I'm going to try is to spawn an FFMPEG process to write a temp file in the correct format when the file is requested and can't be directly played by the device. That way we can support multiple devices and qualities and don't have to store the same video multiple times. You'll just need a few gigs of space to hold the temp file. |
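A rough sketch of that temp-file approach, assuming Node/TypeScript and ffmpeg on the host; the naming scheme and codec choices are hypothetical:

```typescript
import { spawn } from "child_process";
import { once } from "events";
import { existsSync } from "fs";
import { createHash } from "crypto";
import { tmpdir } from "os";
import { join } from "path";

// Transcode to a temp file the first time a non-playable file is requested,
// then reuse that temp file for later requests for the same source.
export async function ensurePlayable(source: string): Promise<string> {
  const key = createHash("sha1").update(source).digest("hex");
  const tempFile = join(tmpdir(), `streama-${key}.mp4`); // hypothetical naming scheme
  if (existsSync(tempFile)) return tempFile;             // already transcoded earlier

  const ffmpeg = spawn("ffmpeg", [
    "-i", source,
    "-c:v", "libx264", "-c:a", "aac",
    "-movflags", "+faststart",
    tempFile,
  ]);
  const [code] = await once(ffmpeg, "close");
  if (code !== 0) throw new Error(`ffmpeg failed with code ${code}`);
  return tempFile;                                       // serve this file to the player
}
```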
Okay, so what I'm looking at at the moment is integrating hls.js and using ffmpeg to generate an HLS stream. This will be compatible with:
I also want to do detection for direct-play videos, so we don't have to create an HLS stream if we don't need to. When I have a working alpha I'll push it to a branch. |
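Direct-play detection could be done with ffprobe (shipped with ffmpeg); the sketch below is an assumption about what such a whitelist might look like (MP4 container, H.264 video, AAC audio), not what Streama actually checks:

```typescript
import { execFile } from "child_process";
import { promisify } from "util";

const execFileAsync = promisify(execFile);

// Probe a file and decide whether a typical browser can play it directly,
// or whether an HLS stream needs to be generated first.
export async function canDirectPlay(path: string): Promise<boolean> {
  const { stdout } = await execFileAsync("ffprobe", [
    "-v", "error",
    "-show_entries", "stream=codec_type,codec_name:format=format_name",
    "-of", "json",
    path,
  ]);
  const info = JSON.parse(stdout);

  // Assumed whitelist: MP4 container, H.264 video, AAC audio.
  const containerOk = String(info.format?.format_name ?? "").includes("mp4");
  const streams: Array<{ codec_type: string; codec_name: string }> = info.streams ?? [];
  const videoOk = streams.some(s => s.codec_type === "video" && s.codec_name === "h264");
  const audioOk = streams.some(s => s.codec_type === "audio" && s.codec_name === "aac");

  return containerOk && videoOk && audioOk;
}
```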
Any update on this? Happy to help test it out if there is. |
At the risk of being annoying: is there any news on this? I'd really LOVE to see this feature in Streama! |
I know this is a very highly sought-after feature. I had some tests I was working on back in March, nothing close to working. Unfortunately, due to some bad hardware I don't have those anymore, not that they were incredibly useful. If anyone wants to help with this feature specifically, get in touch with me. |
Netflix uses this, maybe it's worth something: https://github.com/Netflix/vmaf |
VMAF is used to validate media visual quality; Photon, from my understanding, is for validating IMF, which is a containerized media format and probably isn't what most use cases of this software need. |
Hi all, I have some ideas regarding transcoding. I think that we have to be realistic and admit that implementing transcoding as part of Streama is a bad idea because:
So instead of a transcoding feature, I suggest having the ability to upload multiple videos of the same movie/episode but with different size and quality. For instance, you could upload 3 files for just one movie so you have high-, medium- and low-quality uploads for the same movie. It's much easier to implement, it doesn't choke low-end VPS instances, and you have full control of the encoder and encoding settings. What do you think? |
I'm not a Streama contributor by any means, but I'm a user and have followed this issue for a while. I do not wish to add noise to the issue but I thought @IvanBernatovic's comment was worth responding to since I believe my use case can't be that uncommon.
Valid point, but not an issue that I believe should remove the item from the roadmap altogether, since someone might come by and want to scratch an itch. The bulk of the work is probably already handled by third-party libraries such as ffmpeg, so it would mostly be about integrating it into Streama in a user-friendly fashion and not breaking current file handling/serving code (I assume, since I'm not familiar with the code base).
I was actually surprised to hear that someone would use a VPS for hosting. I'd assume that the extreme amounts of storage and throughput would make cloud hosting very cost-inefficient. I run it on an old server with tons of storage in a closet, and my assumption would be that most people would have their media stored on a NAS or something.
Of course the suggested approach would be the optimal one quality-wise, but I don't think anyone wishing for this feature wants it to provide a top-notch streaming service; it's a convenience feature, otherwise they'd already have converted the files into the correct file format and codec. I have a big old DVD library that I digitized, all in MKV, that I'll never have the time or interest to manually transcode and re-upload. It's a convenience feature: I don't care about possibly squeezing out a higher-quality stream, I just want A stream.
Pretty sure all you really need to add is ffmpeg if you want to go for a straightforward implementation. I know this is ultimately up to the devs, but I thought this alternative use case, where hardware performance isn't necessarily an issue, should be presented as well. I believe there are good reasons to support built-in transcoding. |
something new here? |
I noticed that Streama doesn't support file transcoding.
Is it on your backlog? If not, can you elaborate on how to integrate it from a technical point of view?
I have a project that does transcoding and generates DASH segments that work well with dash.js; maybe I can work on a way to integrate it.
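For reference, generating DASH output with plain ffmpeg looks roughly like the sketch below (a generic illustration, not the commenter's project); dash.js can then play the resulting manifest.mpd in the browser:

```typescript
import { spawn } from "child_process";

// Illustrative only: produce a DASH manifest (.mpd) plus segments with ffmpeg's
// dash muxer. Input path, codecs, and segment length are assumptions.
const dash = spawn("ffmpeg", [
  "-i", "input.mkv",
  "-map", "0:v", "-map", "0:a",
  "-c:v", "libx264", "-c:a", "aac",
  "-seg_duration", "6",     // 6-second segments
  "-use_template", "1",
  "-use_timeline", "1",
  "-f", "dash",
  "manifest.mpd",
]);

dash.on("close", (code) => console.log(`ffmpeg exited with ${code}`));
```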