Sat 09 Jan 2016
The concept of "stride" or "pitch" - a step (in bytes) to be taken to proceed to the next element of a data structure - is brilliant, because it gives great flexibility. In 3D graphics, for example, the explicitly specified number of bytes between vertices in a vertex buffer or rows in a texture can be:

1. Greater than the size of a single element - e.g. to step over padding added for alignment, or over other data interleaved in the same buffer.
2. Equal to the size of a single element - the usual case of tightly packed data.
3. Zero - to read the same element over and over again.
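As an illustration of case 1, here is a minimal sketch (the names and sizes are made up for this example) of describing source pixels whose rows are padded in system memory, so the pitch is larger than the packed row size:

// Hypothetical source image: WIDTH x HEIGHT pixels, 4 bytes each, but every row
// occupies SRC_ROW_PITCH bytes in system memory because of alignment padding.
const UINT WIDTH = 64, HEIGHT = 64;
const UINT SRC_ROW_PITCH = 256; // > WIDTH * 4 - the extra bytes are padding

std::vector<uint8_t> srcPixels(SRC_ROW_PITCH * HEIGHT); // filled elsewhere

D3D11_SUBRESOURCE_DATA initData = {};
initData.pSysMem = srcPixels.data();
initData.SysMemPitch = SRC_ROW_PITCH; // case 1: pitch greater than the packed row size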
That's the theory, anyway, because I just discovered that option 3 doesn't work in Direct3D 11 when passing pInitialData to a created texture. I can see no reason why specifying D3D11_SUBRESOURCE_DATA::SysMemPitch == 0 should be considered invalid, other than trying to save the developer from a possibly unintended mistake. I think it would actually be pretty useful for initializing a texture with the same data in each row - it would be enough to allocate and fill the data for just one row instead of the full texture. And yet, the following code fails on the call to CreateTexture2D:
CD3D11_TEXTURE2D_DESC textureDesc = CD3D11_TEXTURE2D_DESC(
    TEXTURE_FORMAT,             // format
    (UINT)TEXTURE_SIZE.x,       // width
    (UINT)TEXTURE_SIZE.y,       // height
    1,                          // arraySize
    1,                          // mipLevels
    D3D11_BIND_SHADER_RESOURCE, // bindFlags
    D3D11_USAGE_DYNAMIC,        // usage
    D3D11_CPU_ACCESS_WRITE);    // cpuaccessFlags

std::vector<uint32_t> initialTextureRow(TEXTURE_SIZE.x);
ZeroMemory(&initialTextureRow[0], TEXTURE_SIZE.x * sizeof(uint32_t));

D3D11_SUBRESOURCE_DATA textureInitialData = {
    &initialTextureRow[0], // pSysMem
    0,                     // SysMemPitch
    0 };                   // SysMemSlicePitch

ID3D11Texture2D *texture = nullptr;
ERR_GUARD_DIRECTX( m_Dev->CreateTexture2D(&textureDesc, &textureInitialData, &texture) );
The DirectX debug layer reports an error:
D3D11 ERROR: ID3D11Device::CreateTexture2D: pInitialData[0].SysMemPitch cannot be 0 [ STATE_CREATION ERROR #100: CREATETEXTURE2D_INVALIDINITIALDATA]
Dear Microsoft: Why? :)
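For completeness, one possible workaround - just a sketch, reusing the names from the snippet above and assuming m_Ctx is the immediate device context - is to create the texture with no initial data and fill it through Map, copying the single prepared row into every destination row:

ID3D11Texture2D *texture = nullptr;
ERR_GUARD_DIRECTX( m_Dev->CreateTexture2D(&textureDesc, nullptr, &texture) );

D3D11_MAPPED_SUBRESOURCE mapped = {};
ERR_GUARD_DIRECTX( m_Ctx->Map(texture, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped) );
// Replicate the single row into each row of the texture,
// respecting the row pitch chosen by the driver.
for(UINT y = 0; y < (UINT)TEXTURE_SIZE.y; ++y)
{
    memcpy(
        (char*)mapped.pData + (size_t)mapped.RowPitch * y,
        &initialTextureRow[0],
        TEXTURE_SIZE.x * sizeof(uint32_t));
}
m_Ctx->Unmap(texture, 0);

This costs an extra copy per row, which is exactly what SysMemPitch == 0 could have avoided.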