# Stencil Test Explained Using Code
I must admit I have never used the stencil buffer in my personal code. I know it has been available in GPUs and all graphics APIs for years and that it is useful for many things, but somehow I never needed it. Recently I realized that I don't fully understand it. There are many descriptions of the stencil test on the Internet, but none of them answered my questions in quite the way I wanted. I thought that a piece of pseudocode would explain it better than words, so here is my explanation of the stencil test.
Let's choose Direct3D 11 as our graphics API; other APIs have similar sets of parameters. D3D11 offers the following configuration parameters for the stencil test:

| Parameter | Passed to |
|---|---|
| StencilEnable, StencilReadMask, StencilWriteMask | D3D11_DEPTH_STENCIL_DESC, used in CreateDepthStencilState |
| FrontFace, BackFace (each a D3D11_DEPTH_STENCILOP_DESC with StencilFailOp, StencilDepthFailOp, StencilPassOp, StencilFunc) | D3D11_DEPTH_STENCIL_DESC, used in CreateDepthStencilState |
| StencilRef | OMSetDepthStencilState |
How does it work? If you render pixel (x, y) and the current value of the stencil buffer for that pixel is available as:

```
UINT8 Stencil[x, y]
```
then the pseudocode for the stencil test and write could look like below. First, one of the two sets of parameters is selected, depending on whether the current primitive is front-facing or back-facing:

```
if(primitive has no front and back face, e.g. points, lines)
    StencilOpDesc = FrontFace
else if(primitive is front facing)
    StencilOpDesc = FrontFace
else
    StencilOpDesc = BackFace
```
Then, a test is performed:

```
(StencilRef & StencilReadMask) StencilOpDesc.StencilFunc
    (Stencil[x, y] & StencilReadMask)
```
StencilOpDesc.StencilFunc is a comparison operator that can be one of the D3D11_COMPARISON_FUNC enum values: NEVER, LESS, EQUAL, LESS_EQUAL, GREATER, NOT_EQUAL, GREATER_EQUAL, ALWAYS. I think this is quite self-explanatory. It's worth noting that:

- StencilRef is on the left side of the comparison operator and the current stencil buffer value is on the right.
- StencilRef and the current stencil buffer value are both ANDed with StencilReadMask before the comparison.
Next, based on the result of this test, as well as the result of the depth test aka Z-test (which is out of scope of this article), an operation is selected:

```
if(stencil test passed and depth test passed)
    Op = StencilOpDesc.StencilPassOp
else if(stencil test passed and depth test failed)
    Op = StencilOpDesc.StencilDepthFailOp
else // stencil test failed
    Op = StencilOpDesc.StencilFailOp
```
Op is another enum that controls the new value to be written to the stencil buffer. It can be one of:

```
switch(Op)
{
case D3D11_STENCIL_OP_KEEP:     NewValue = Stencil[x, y]
case D3D11_STENCIL_OP_ZERO:     NewValue = 0
case D3D11_STENCIL_OP_REPLACE:  NewValue = StencilRef
case D3D11_STENCIL_OP_INCR_SAT: NewValue = min(Stencil[x, y] + 1, 0xFF)
case D3D11_STENCIL_OP_DECR_SAT: NewValue = max(Stencil[x, y] - 1, 0)
case D3D11_STENCIL_OP_INVERT:   NewValue = ~Stencil[x, y]
case D3D11_STENCIL_OP_INCR:     NewValue = Stencil[x, y] + 1 // with 8-bit wrap-around
case D3D11_STENCIL_OP_DECR:     NewValue = Stencil[x, y] - 1 // with 8-bit wrap-around
}
```
Finally, the new value is written to the stencil buffer. Notice how only the bits included in StencilWriteMask are changed; the others remain unchanged.

```
Stencil[x, y] =
    (Stencil[x, y] & ~StencilWriteMask) |
    (NewValue & StencilWriteMask)
```
Now that we have all of this explained in a very strict way using code, let me answer the doubts I had before I understood it, in the form of a FAQ.
Q: Are there no separate flags to enable stencil test and stencil write?
A: No. There is only one flag, StencilEnable, that enables all this functionality.
Q: So how do you use only one and not the other?
A: You can achieve that with a specific set of settings.

To perform only the stencil test and not the write, set StencilEnable to true, set StencilFunc to the comparison function you need, and set all three Ops to KEEP, or alternatively set StencilWriteMask to 0 to disable any modifications to the stencil buffer.

To perform only the stencil write and not the test, set StencilEnable to true, set all the Ops and StencilWriteMask to the values you need, and set StencilFunc to ALWAYS to make the stencil test always pass.
Q: Is the StencilRef value also masked by StencilReadMask?
A: Yes. As you can see in the code, it is also ANDed with StencilReadMask, just like the previous value from the stencil buffer, so you don't need to provide it "pre-masked". (A comparison to "premultiplied alpha" comes to my mind...) In other words, we could say that only the bits indicated by StencilReadMask, on both sides, participate in the comparison.
Q: What are the stencil value bits replaced with in the REPLACE Op mode?
A: They are replaced with the StencilRef value - the same one that was used in the comparison.
Q: Why is it the same StencilRef value as used for the comparison, and not a separate one?
A: I don't know. There is a separate StencilWriteMask, so they could have provided separate "StencilReadRef" and "StencilWriteRef" values - but for some reason they didn't :P
Q: What value is incremented/decremented when Op is one of INCR*, DECR*?
A: It's the original value read from the stencil buffer, not masked or shifted in relation to StencilWriteMask. This means it doesn't make much sense to use these ops if your StencilWriteMask masks out the least significant bits, e.g. 0xF0.
Q: Is the depth buffer updated when the stencil test fails?
A: No. Failing the stencil test means that the pixel is discarded, so the Z-buffer is not updated and the color is not written or blended to the render targets. On the other hand, failing the Z-test can cause the stencil buffer to be updated when you use a StencilDepthFailOp other than KEEP.
If I have misunderstood something and some of the information in this article is wrong, please let me know by e-mail or in a comment below.