Review: Capella Systems Cambria FTC Offers Per-Title Encoding
Back on point, you control which measure the encoder uses to adjust the encoding ladder. A conservative approach might use the 10-second peak as the measure, pushing the data rate up even though it might only affect one highly complex region in the file. This would generally result in files with few encoding artifacts, but with some bandwidth wasted in other areas.
Alternatively, you could base the decision on the 30-second peak, which would result in a lower overall data rate, but perhaps visible artifacts in some shorter regions. You could also set the measure to any arbitrary length.
Once this value is returned, the script shown in Figure 3 manages the adjustment. That is, if the encoding complexity was more than 7,000Kbps, the encoder would adjust the data rate of all streams in the adaptive group by a factor of 1.5, boosting the target data rate by 50 percent. If the complexity value was 2,000Kbps or less, the encoder would adjust the data rate by a factor of 0.6, dropping the data rate of all streams by 40 percent.
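The adjustment logic just described can be sketched as a simple function. This is illustrative pseudologic, not Cambria's actual script; the thresholds and multipliers mirror the example above, and all of them would be configurable in practice.

```python
def adjust_ladder(streams_kbps, complexity_kbps,
                  high_threshold=7000, low_threshold=2000,
                  boost=1.5, cut=0.6):
    """Scale every rung of an adaptive ladder based on measured complexity.

    streams_kbps: target data rates (Kbps) of the adaptive group.
    complexity_kbps: complexity estimate returned by the analysis pass.
    Thresholds and multipliers follow the article's example; all are
    meant to be user-configurable.
    """
    if complexity_kbps > high_threshold:
        factor = boost      # complex content: raise all rates by 50 percent
    elif complexity_kbps <= low_threshold:
        factor = cut        # simple content: cut all rates by 40 percent
    else:
        factor = 1.0        # moderate content: leave the ladder alone
    return [round(rate * factor) for rate in streams_kbps]
```

For example, a three-rung ladder of 4,300/3,000/1,800Kbps measured at 8,000Kbps complexity would become 6,450/4,500/2,700Kbps, while a 2,000Kbps measurement would cut 4,300Kbps to 2,580Kbps.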
All these adjustments are totally configurable. For example, a conservative encoding shop could adjust the data rate upward for complex clips but never adjust the data rate downward for simple clips, maintaining a very high quality level. Or, you could adjust the data rates for some, but not all, of the streams. Of course, you select both the encoding thresholds and the percentage adjustments to each level.
How We Tested
Intuitively, the goal of any per-title optimization technique would be to boost the data rate and quality of a file when necessary, but only when the improvement would be noticeable to the viewer. Otherwise, bandwidth would be wasted. Conversely, you would want the encoder to drop the data rate and quality when possible, but only when it wouldn’t result in visible artifacts that would degrade the viewer’s quality of experience.
How do we measure these concepts? In the “Per-Title Encode Optimization” blog post mentioned previously, Netflix made several general observations about the peak signal-to-noise ratio (PSNR) metric that originally powered its analysis (Netflix has since transitioned to its own video multimethod assessment fusion [VMAF] metric). First, Netflix stated that PSNR values in excess of 45 are seldom noticeable by the viewer (“for encodes with PSNR 45 dB or above, the distortion is perceptually unnoticeable”). At the other end of the quality spectrum, the researchers also stated that PSNR values below 35 are often accompanied by visible artifacts (“35 dB will show encoding artifacts”). Let’s agree that these are very rough metrics, but they’re a useful yardstick for assessing Cambria’s SABL-related performance.
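Those rough thresholds can be expressed as a one-glance classifier. A minimal sketch, assuming only the 35 dB and 45 dB cutoffs from Netflix's observations; the verdict strings are illustrative, not from either vendor.

```python
def psnr_verdict(psnr_db):
    """Classify a PSNR reading against Netflix's rough perceptual thresholds."""
    if psnr_db >= 45:
        return "distortion perceptually unnoticeable"
    if psnr_db < 35:
        return "visible encoding artifacts likely"
    return "in between; acceptability depends on the content"
```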
Table 2 shows the results of our tests. The SABL data rate is the rate after the analysis; as you’ll recall, the original rate was 4,300Kbps (Figure 2). The bandwidth figure is the percentage reduction between the original and SABL rates, with the PSNR values calculated for both the original and SABL streams.
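The bandwidth column is straightforward arithmetic against the 4,300Kbps original target; a quick sketch (the function name is ours, not Cambria's):

```python
def bandwidth_savings(original_kbps, sabl_kbps):
    """Percentage reduction between the original target and the SABL-adjusted rate."""
    return round(100 * (original_kbps - sabl_kbps) / original_kbps, 1)
```

A SABL rate of 2,150Kbps against the 4,300Kbps original works out to a 50 percent reduction.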
In these 10 tests, the results are all positive (in some cases, extremely positive). For example, with the cartoon El Ultimo, Cambria dropped the data rate by 50 percent while keeping the PSNR at 45.24 dB. This is quite a significant data rate drop with a quality delta that should be imperceptible to viewers. We see similar results in the Screencam and Tutorial clips, where 50 percent data rate reductions still left the SABL PSNR well above 45 dB. At no point did a drop in data rate push the PSNR value anywhere close to 35 dB. In the two clips where Cambria increased the data rate, the music video “Freedom” and the short Zoolander clip, the increase was clearly beneficial and not wasteful (i.e., it didn’t push the PSNR above 45 dB, where the added quality would be imperceptible).
Let me reiterate that virtually all the parameters used in these tests are configurable. If I wanted to create another level at the bottom that dropped the data rate by 60 percent, I could. I could also create another level at the top, or push the data rate for existing levels higher for complex videos. You control all basic parameters so you can dial in the desired level of risk and reward.
How long will this analysis take? This depends upon a number of factors, including the workstation that you’re testing on, content resolution, and the number of simultaneous encodes you’ve configured in Cambria Manager. We tested on an HP Z840 workstation with two 2.6GHz Xeon E5-2690 v3 CPUs running Windows 10. We created six copies of the same 1:45 (hours:mins) 1080p feature film and ran the analysis phase with five simultaneous encodes, which pushed CPU utilization up to about 96 percent. The total time to process the five movies was just under 2 hours, which would be in addition to any encoding time. At the other end of the spectrum, with two simultaneous encodes, it took 4:07 (min:sec) to process the 4:25 (min:sec) 1080p “Freedom” music video clip.
What’s the bottom line? Fixed encoding ladders are history for the reasons discussed in the Netflix blog post and the text around Table 1. Assuming that you don’t have the development budgets of Netflix and Google, the solution offered by Cambria looks very promising. If you’re locked into another encoding platform, don’t be shy about pressing for a per-title option, since the basic building blocks are available to all encoding vendors, cloud and on-premises, that use a CRF-compatible codec.
This article originally ran in the Autumn 2016 issue of Streaming Media European Edition as “Review: Capella Systems Cambria FTC.”