Transcoding in 2018? What are the must-haves?
I know this question has been asked many times over the years, but I am going to ask anyway. In 2018, what is the benchmark for transcoding? If I wanted to reach the largest number of users across multiple delivery devices, what are the must-haves in terms of back-end and front-end servers/applications for 2018 and beyond?
I would love to hear people's opinions on the matter, and hopefully some concrete solutions: server setup, content management systems, online solutions, and so on. I am personally looking to see what I can do to transcode, manage, and deliver my existing video archive (20K+ files of varying bitrates and resolutions) in the most efficient manner.
Sell it for top dollar and let someone else deal with it.
Quote:
I don't have the time for a thorough response now, but I recommend this be a topic you discuss with our support manager when we have that call! Solutions can range the whole gamut from a VPS instance, entry dedicated, high-spec dedicated, or cloud instances. The decision on which is best varies based on multiple factors:
1) Size of the batch to be encoded
2) Desired timeline
3) What work is "ongoing" or regular after getting over the hump of a large re-encode of content
What is for certain is that you don't want to do the encoding on your active web server, as this will affect performance. If it's a small amount of regular encoding on a weekly basis, and if crons can be set for overnight, it is less likely to affect your surfer experience. With CMSes like MechBunny and ElevatedX it is easy to slave off your encoding work. There really isn't any ongoing management work involved when you have programmatic encoding, so clients with a moderate or high amount of encoding are typically just looking at a server rental without management and backups, which is pretty cost-effective.
Sincerely,
Brad
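For illustration only, here is a rough sketch of the kind of overnight batch job a cron entry could kick off. It assumes ffmpeg is installed on the encoding box; the watch-folder paths, schedule, and encoding settings are placeholders, not any particular CMS's real setup.

Code:
#!/usr/bin/env python3
"""Overnight batch transcode sketch: pick up new masters, write 480p MP4s.

Illustrative only -- /video/incoming, /video/encoded and the ffmpeg settings
are placeholders. Run it from cron, e.g.:
    0 1 * * * /usr/bin/python3 /opt/scripts/night_encode.py
"""
import pathlib
import subprocess

INCOMING = pathlib.Path("/video/incoming")   # where the CMS drops new masters
ENCODED = pathlib.Path("/video/encoded")     # where finished files go

for src in sorted(INCOMING.glob("*.mp4")):
    dst = ENCODED / src.name
    if dst.exists():                          # already done on a previous night
        continue
    # Single 480p H.264 rendition; tune preset/CRF to taste.
    subprocess.run([
        "ffmpeg", "-y", "-i", str(src),
        "-vf", "scale=-2:480",
        "-c:v", "libx264", "-preset", "medium", "-crf", "23",
        "-c:a", "aac", "-b:a", "128k",
        str(dst),
    ], check=True)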
There is no silver bullet for this. The amount of hardware needed depends entirely on the time you are willing to spend transcoding. The thing is that while CPUs continue to advance quite fast with every generation, transcoding remains an extremely taxing and slow process.
GPU transcoding, on the other hand, continues to be a poor alternative, mainly due to quality/bitrate limitations: the files transcoded with a GPU remain too big for online use. NVENC (Nvidia) does it better than QuickSync (Intel). However, Nvidia puts an artificial limitation on their consumer-grade hardware so that only 2 simultaneous transcodes can be run at any given moment. Their pro cards don't have this limit, but at prices of a few thousand dollars per card it is simply not a viable technology to use.
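To make that trade-off concrete, here is a rough sketch (not a recommendation) of sending the same clip through libx264 versus NVENC with ffmpeg. It assumes an ffmpeg build with h264_nvenc enabled and an Nvidia GPU present; the file name, CRF, and bitrate are illustrative.

Code:
import subprocess

SRC = "master.mp4"  # placeholder input file

# CPU route: libx264 -- slower, but better quality per bit, no session limits.
subprocess.run([
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "libx264", "-preset", "slow", "-crf", "22",
    "-c:a", "copy", "out_x264.mp4",
], check=True)

# GPU route: NVENC -- much faster, but needs a higher bitrate for similar
# quality, and consumer GeForce cards cap concurrent encode sessions.
subprocess.run([
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "h264_nvenc", "-preset", "slow", "-b:v", "4M",
    "-c:a", "copy", "out_nvenc.mp4",
], check=True)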
Modern devices all support HLS now. That is the adaptive bitrate streaming thing that auto-adjusts the quality depending on the bandwidth available to the device, so that playback interruptions are eliminated as much as possible. It's H.264 + AAC inside, which is playable on pretty much everything produced in the past few years. Alternatively, to reach as many devices as possible, you can go with plain MP4, keeping an eye on the so-called H.264 levels to ensure maximum device compatibility. The output would at best be in 3 resolutions: 360p/480p, probably at Main profile level 3.1 for older devices, and then 720p and 1080p, which should do well at Main profile level 4.1.
H.265 is what should come next, after H.264 is no longer the deal. However, at present, if you try to encode your library to H.265, it will take literally forever. On top of that, H.265 is not yet supported by the hardware of most lower-end devices, so they will not be able to decode it for smooth playback.
It's up to you to decide how you will deliver the content (static MP4s or HLS ABR), but in both cases the process of transcoding the whole library will be very, very lengthy. It will require multiple servers and some way for you to manage which one transcodes what, so you can spread the load across them. Intel E3s are your friend here. Due to limitations in libx264, the more cores you have for transcoding, the less gain per core you get in terms of speed; there is an overhead on the whole multi-threading process. On the other hand, x264 benefits from faster cores, and E3s are fast, so on price/performance they are a great choice.
In any case, it will take long. We have a customer who came to us with 160GB of 720p MP4s they wanted 480p versions of. It took 26 hours on an E3-1245 v2 or v3 (not sure). I believe we could have lowered it to ~20 hours if we were applying less compression, but then the whole idea of 480p nowadays is to use less bandwidth so...
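As a rough illustration of that rendition ladder, here is a sketch driving ffmpeg from Python. The profile/level choices mirror the post above, but the bitrates, file names, and HLS settings are assumptions you would verify against your own device targets.

Code:
import subprocess

SRC = "master.mp4"  # placeholder master file

# (height, H.264 profile, level, video bitrate) -- roughly the ladder above.
LADDER = [
    (480,  "main", "3.1", "1500k"),
    (720,  "main", "4.1", "3000k"),
    (1080, "main", "4.1", "6000k"),
]

for height, profile, level, bitrate in LADDER:
    subprocess.run([
        "ffmpeg", "-y", "-i", SRC,
        "-vf", f"scale=-2:{height}",
        "-c:v", "libx264", "-profile:v", profile, "-level", level,
        "-b:v", bitrate, "-maxrate", bitrate, "-bufsize", bitrate,
        "-c:a", "aac", "-b:a", "128k",
        f"out_{height}p.mp4",
    ], check=True)

# Each rendition can then be repackaged as HLS without re-encoding, e.g.:
#   ffmpeg -i out_720p.mp4 -c copy -f hls -hls_time 6 \
#          -hls_playlist_type vod out_720p.m3u8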
You should look into cloud-based solutions. They're fast, cheap, and the quality is amazing.
I use zencoder.com. You can just use their default H.264 profiles for MP4, which will work on 99% of browsers and devices. It's very cheap, extremely fast, and no initial payment is needed. You can easily include your logo and output in various bitrates. You just make a simple JSON job file to do the encoding, and you can input/output from FTP, Amazon, etc.
Here is a simple sample that I use to get the master .mov file from our FTP, encode it with our logo into 2160p, 1080p, 720p, 480p, 360p and 240p, and then output it back to our FTP. A programmer can easily make a simple script to batch everything and customize it to your liking. Code:
{
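As a rough, hedged sketch only, submitting that kind of job from Python could look something like the following. The endpoint, field names, FTP paths, logo URL, and API key are illustrative assumptions and should be checked against Zencoder's own API documentation rather than taken as their exact schema.

Code:
import json
import urllib.request

API_KEY = "YOUR_ZENCODER_API_KEY"  # placeholder

# Illustrative job: pull a master from FTP, stamp a logo, and write several
# MP4 renditions back to the same FTP. Field names are indicative only.
job = {
    "input": "ftp://user:pass@ftp.example.com/masters/scene001.mov",
    "outputs": [
        {
            "label": f"{height}p",
            "url": f"ftp://user:pass@ftp.example.com/encoded/scene001_{height}p.mp4",
            "height": height,
            "watermarks": [{"url": "http://example.com/logo.png"}],
        }
        for height in (2160, 1080, 720, 480, 360, 240)
    ],
}

req = urllib.request.Request(
    "https://app.zencoder.com/api/v2/jobs",
    data=json.dumps(job).encode(),
    headers={"Zencoder-Api-Key": API_KEY, "Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())

From there, batching is just looping that request over your file list.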
Thank you for everyone's replies. I will look into each post and come back if I have further questions.