
Post a Comment On: cbloom rants

"02-10-10 - Some little image notes"

1 Comment
Blogger ryg said...

On 1): Another interesting thing in the same vein is the choice of binarization scheme when you're using binary arithmetic coding. The obvious effect is that it determines your prior model as long as you don't have sufficient statistics (and that way you could do "perceptual preconditioning"); but there are also secondary effects, because the binarization makes some contexts much easier to capture than others (context within a single binarized symbol is trivial to include). Together with the prior expectation and RD optimization, this forms a positive feedback loop (you tend to favor small motion vectors because they initially take fewer bits, so they're estimated as more likely, so they take even fewer bits, and so on) - it's self-reinforcing in a way. That's not just in the beginning - at least some of the bits are going to be pretty random, so the shorter binarizations tend to be cheaper to code than the long ones. Whatever binarization you pick actually biases your coding choices towards its own distribution, even after the model has been trained on your data.
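The feedback loop above can be sketched with a toy model (this is an illustrative sketch, not CABAC itself: the unary binarization, the per-position adaptive bins, and all names here are assumptions for the example). With all bin probabilities initialized at 0.5, a value's cost is exactly the length of its binarization, so short codes win from the start; after training on small values, the per-position models reinforce that preference:

```python
import math

def unary_bins(v):
    # Unary binarization: v ones followed by a terminating zero.
    return [1] * v + [0]

class AdaptiveBin:
    """Adaptive binary model: estimated P(bit=1), updated per coded bit."""
    def __init__(self):
        self.ones = 1
        self.total = 2  # Laplace-style init -> P(1) = 0.5 before any data

    def cost(self, bit):
        # Ideal code length in bits for this bit under the current estimate.
        p1 = self.ones / self.total
        p = p1 if bit == 1 else 1.0 - p1
        return -math.log2(p)

    def update(self, bit):
        self.ones += bit
        self.total += 1

def code_value(v, models):
    """Ideal cost in bits of value v under unary binarization, with one
    adaptive model per bin position; the models are updated as we code."""
    bits = 0.0
    for pos, b in enumerate(unary_bins(v)):
        while len(models) <= pos:
            models.append(AdaptiveBin())
        bits += models[pos].cost(b)
        models[pos].update(b)
    return bits

models = []
# Untrained: every bin costs 1 bit, so value v costs v+1 bits - small
# values are cheap purely because their binarization is short.
first = code_value(3, models)        # 4 bins at P=0.5 -> 4.0 bits
# Train on many small values; each position's model learns the stats...
for _ in range(1000):
    code_value(1, models)
cheap = code_value(1, models)        # now well under 1 bit
# ...and a large value is now penalized hard at the positions where the
# trained model has seen almost nothing but terminating zeros.
expensive = code_value(8, models)
```

The point the comment makes shows up directly: `expensive` stays far above `cheap` even after training, because the binarization itself decides which statistics the bins can learn.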

February 10, 2010 at 4:10 PM
