Check whether an image has an alpha channel, and if so, analyze it: is it a simple yes/no alpha channel (only fully opaque or fully transparent values), or does it contain alpha values in between?
This is quite useful for automatically detecting how alpha textures should be displayed: for simple yes/no alpha, the OpenGL alpha test is a simple solution; for full-range alpha, OpenGL blending should be used. Blending is a little problematic, since it doesn't cooperate nicely with the Z-buffer and therefore requires a special rendering order. That's why we try to detect simple yes/no alpha textures, so that we can use the simpler alpha test for them.
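To illustrate, here is a minimal, hypothetical sketch of how the detected alpha kind could drive fixed-function OpenGL state. The enumeration TAlphaChannelType and the values atSimpleYesNo and atFullRange are names assumed for this example (only atNone is mentioned in this documentation), and SetupAlphaRendering is not part of the engine.

program AlphaSetupSketch;

{ Hypothetical sketch: pick OpenGL state from the detected alpha kind.
  TAlphaChannelType, atSimpleYesNo and atFullRange are assumed names;
  only atNone appears in the documentation above. }

uses GL;

type
  TAlphaChannelType = (atNone, atSimpleYesNo, atFullRange);

procedure SetupAlphaRendering(const AlphaType: TAlphaChannelType);
begin
  case AlphaType of
    atSimpleYesNo:
      begin
        { Simple yes/no alpha: the alpha test is enough and keeps the
          Z-buffer fully usable, no special rendering order needed. }
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GEQUAL, 0.5);
      end;
    atFullRange:
      begin
        { Full-range alpha: blending, which requires drawing transparent
          objects after opaque ones, sorted back to front. }
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
      end;
    { atNone: opaque image, nothing special to set up. }
  end;
end;

begin
  { SetupAlphaRendering would be called inside an active OpenGL context. }
end.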
This method analyzes every pixel. A pixel's alpha is considered "simple" if it is <= AlphaTolerance or >= 255 - AlphaTolerance. So for the default AlphaTolerance = 0, "simple" alpha means exactly 0 or exactly 255 (the minimum and maximum Byte values). The alpha channel is considered simple yes/no if the ratio of non-simple pixels is <= WrongPixelsTolerance. For example, the default WrongPixelsTolerance = 0 means that every pixel must have a "simple" alpha value. Greater WrongPixelsTolerance values allow some tolerance: for example, WrongPixelsTolerance = 0.01 allows 1 percent of pixels to fail the "simple alpha" test and the image can still be considered to have a simple yes/no alpha channel.
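As a concrete illustration of this rule, here is a minimal sketch that applies it to a flat array of alpha bytes. IsSimpleYesNoAlpha is a hypothetical helper written for this example, not the engine's actual implementation.

program SimpleAlphaSketch;

{ Hypothetical helper: returns True if the alpha values qualify as
  "simple yes/no" under the given tolerances. Not the engine's code. }
function IsSimpleYesNoAlpha(const AlphaValues: array of Byte;
  const AlphaTolerance: Byte;
  const WrongPixelsTolerance: Single): Boolean;
var
  I, WrongPixels: Integer;
begin
  WrongPixels := 0;
  for I := 0 to High(AlphaValues) do
    { "Simple" alpha: close enough to fully transparent or fully opaque. }
    if not ((AlphaValues[I] <= AlphaTolerance) or
            (AlphaValues[I] >= 255 - AlphaTolerance)) then
      Inc(WrongPixels);
  { Accept when the ratio of non-simple pixels does not exceed the tolerance. }
  Result := (Length(AlphaValues) = 0) or
    (WrongPixels / Length(AlphaValues) <= WrongPixelsTolerance);
end;

begin
  { With zero tolerances, 128 is not a "simple" value, so this prints FALSE. }
  WriteLn(IsSimpleYesNoAlpha([0, 255, 128], 0, 0.0));
  { With AlphaTolerance = 5, the values 250 and 5 still count as "simple": TRUE. }
  WriteLn(IsSimpleYesNoAlpha([0, 255, 250, 5], 5, 0.0));
end.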
In summary, the default tolerance values are 0, so all pixels must have either exactly full or exactly no alpha. Increasing the tolerance values (for example, AlphaTolerance = 5 and WrongPixelsTolerance = 0.01 may be a good start: still conservative, yet tolerating small deviations) allows you to accept more images as simple yes/no alpha. Of course, excessively large tolerance values make no sense: AlphaTolerance >= 128 or WrongPixelsTolerance >= 1.0 will cause all images to be accepted as "simple yes/no alpha".
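To make these suggested values concrete with a small worked example: with AlphaTolerance = 5, "simple" covers alpha values 0..5 and 250..255; and with WrongPixelsTolerance = 0.01, a 256 x 256 texture (65536 pixels) may contain up to 655 non-simple pixels (655 / 65536 is about 0.00999, which is still <= 0.01) and be classified as having a simple yes/no alpha channel.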
Notes for descendant implementors: in this class, this simply always returns atNone. Descendants that have an alpha channel should implement it, honouring AlphaTolerance and WrongPixelsTolerance as described above.
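Below is a minimal, self-contained sketch of that override pattern. TMyImage, TMyRGBAImage, AlphaChannelType, RawPixels, Width, Height and the values atSimpleYesNo and atFullRange are stand-ins invented for this example, not the engine's actual declarations (only atNone is named above).

program DescendantSketch;

{$mode objfpc}

type
  TAlphaChannelType = (atNone, atSimpleYesNo, atFullRange);

  { Minimal stand-ins; the real engine classes look different. }
  TMyImage = class
  public
    function AlphaChannelType(const AlphaTolerance: Byte;
      const WrongPixelsTolerance: Single): TAlphaChannelType; virtual;
  end;

  TMyRGBAImage = class(TMyImage)
  public
    Width, Height: Cardinal;
    RawPixels: Pointer; { Width * Height RGBA pixels, 4 bytes each }
    function AlphaChannelType(const AlphaTolerance: Byte;
      const WrongPixelsTolerance: Single): TAlphaChannelType; override;
  end;

function TMyImage.AlphaChannelType(const AlphaTolerance: Byte;
  const WrongPixelsTolerance: Single): TAlphaChannelType;
begin
  { As documented: the base class has no alpha channel. }
  Result := atNone;
end;

function TMyRGBAImage.AlphaChannelType(const AlphaTolerance: Byte;
  const WrongPixelsTolerance: Single): TAlphaChannelType;
var
  I, WrongPixels, PixelCount: Cardinal;
  AlphaPtr: PByte;
begin
  PixelCount := Width * Height;
  if PixelCount = 0 then
    Exit(atNone); { degenerate case: no pixels to analyze }
  WrongPixels := 0;
  AlphaPtr := PByte(RawPixels);
  Inc(AlphaPtr, 3); { point at the alpha byte of the first RGBA pixel }
  for I := 1 to PixelCount do
  begin
    { Count pixels whose alpha is neither nearly 0 nor nearly 255. }
    if not ((AlphaPtr^ <= AlphaTolerance) or
            (AlphaPtr^ >= 255 - AlphaTolerance)) then
      Inc(WrongPixels);
    Inc(AlphaPtr, 4); { advance to the next pixel's alpha byte }
  end;
  if WrongPixels / PixelCount <= WrongPixelsTolerance then
    Result := atSimpleYesNo
  else
    Result := atFullRange;
end;

begin
end.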