I have an application that renders multiple textured quads (images) in an essentially 2D context, which has worked fine. However, after modifying it so that portions of some textures are transparent, I've ground to a halt trying to get it to behave in a seemingly standard, theoretically simple fashion: I just want it to draw the textures sequentially (as it has been doing) and, when a texture has transparent pixels, to show whatever was previously drawn in those spots.
What it is doing instead is showing a scaled version of each previously drawn texture behind the transparent sections, rather than the previously rendered portion of the render target. For instance, if I draw an opaque background texture and then a smaller, entirely transparent texture, the background draws fine, but the transparent image shows the entire background scaled down to the size/location of the new transparent quad.
Subsequent textures continue in this fashion, each showing whatever the previously rendered texture ended up as (including elements from the textures before it).
I'm obviously missing something fundamental about how textures/pixel shaders work in DirectX (no surprise, since I'm relatively new to it), but after reading everything I could scrounge up online and experimenting in countless ways, I still can't figure out what I need to do.
I'm using a single texture slot in the pixel shader, which may or may not be part of the problem. Each time I render the scene, I loop through all the textures I want to draw, calling PSSetShaderResources() to bind a different texture to that slot on each iteration, and call DrawIndexed() after each change. This seemed like a reasonable way to proceed, since it doesn't make sense to declare a ton of shader textures when the pixel shader can't be made to use an arbitrary one (it needs to be precompiled, no?).
At any rate, I'm hoping the symptoms will be sufficient for someone more knowledgeable than I to immediately recognize the mistake I'm making. The code is pretty simple in these areas, but I might as well include a couple of sections:
Every scene, for each shaderRV:
m_pd3d11ImmDevContext->PSSetShaderResources(0, 1, &shaderRV);
m_pd3d11ImmDevContext->DrawIndexed( ... );
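In case it helps, here is a condensed sketch of what the frame loop as a whole does (member names like m_textureSRVs, m_renderTargetView, and m_indexCount are placeholders for my actual members):

```cpp
// Condensed sketch of the per-frame loop: clear the render target once,
// then bind each texture's SRV to slot t0 and draw its quad on top of
// whatever has been drawn so far.
float clearColor[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
m_pd3d11ImmDevContext->ClearRenderTargetView(m_renderTargetView, clearColor);

for (ID3D11ShaderResourceView* shaderRV : m_textureSRVs)
{
    m_pd3d11ImmDevContext->PSSetShaderResources(0, 1, &shaderRV);
    m_pd3d11ImmDevContext->DrawIndexed(m_indexCount, 0, 0);
}
```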
Shader:
Texture2D aTexture : register(t0);
SamplerState samLinear : register(s0);
struct VS_INPUT
{
    float3 position  : POSITION;
    float3 texcoord0 : TEXCOORD0;
};

struct VS_OUTPUT
{
    float4 hposition : SV_POSITION;
    float3 texcoord0 : TEXCOORD0;
};

struct PS_OUTPUT
{
    float4 color : COLOR;
};

// vertex shader
VS_OUTPUT CompositeVS( VS_INPUT IN )
{
    VS_OUTPUT OUT;
    float4 v = float4( IN.position.x,
                       IN.position.y,
                       0.1f,
                       1.0f );
    OUT.hposition = v;
    OUT.texcoord0 = IN.texcoord0;
    OUT.texcoord0.z = IN.position.z;
    return OUT;
}

// pixel shader
PS_OUTPUT CompositePS( VS_OUTPUT IN ) : SV_Target
{
    PS_OUTPUT ps;
    ps.color = aTexture.Sample( samLinear, IN.texcoord0 );
    return ps;
}
Blend description settings (I don't think the problem is here):

blendDesc.RenderTarget[0].BlendEnable    = true;
blendDesc.RenderTarget[0].SrcBlend       = D3D11_BLEND_SRC_ALPHA;
blendDesc.RenderTarget[0].DestBlend      = D3D11_BLEND_INV_SRC_ALPHA;
blendDesc.RenderTarget[0].BlendOp        = D3D11_BLEND_OP_ADD;
blendDesc.RenderTarget[0].SrcBlendAlpha  = D3D11_BLEND_ZERO;
blendDesc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO;
blendDesc.RenderTarget[0].BlendOpAlpha   = D3D11_BLEND_OP_ADD;
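And roughly how that blend state gets created and bound, in case the problem is in the surrounding setup rather than the description itself (condensed; error handling omitted, and m_pd3d11Device is a placeholder for my device pointer):

```cpp
// Remaining setup around the blend description above: the write mask,
// blend state creation, and binding it to the output-merger stage.
blendDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* blendState = nullptr;
m_pd3d11Device->CreateBlendState(&blendDesc, &blendState);

// Bound before drawing; null blend factor, sample mask of all ones.
m_pd3d11ImmDevContext->OMSetBlendState(blendState, nullptr, 0xffffffff);
```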
Please let me know if any other code segments would be useful!