Tag: rendering

Entries for tag "rendering", ordered from most recent. Entry count: 165.

Warning! Some information on this page is now older than 5 years. I keep it for reference, but it probably doesn't reflect my current knowledge and beliefs.

Pages: > 1 2 3 4 5 ... 21 >

# DirectX 11 Renderer - a Screenshot

Wed
16
Jan 2013

Here is what I've been working on in my free time recently. It's a renderer for PC, Windows, using DirectX 11.

It may not look spectacular here because I've just quickly put random stuff into this scene, but I already have lots of code that can do useful things, like deferred shading with dynamic directional and point lights, a bunch of material parameters, mesh processing and loading from the OBJ file format, heightmap generation, particle effects and postprocessing (including bloom, of course :)

In the next posts I will describe some pieces of my technology and share some C++ code.

Comments | #rendering #productions #directx Share

# Particle System - How to Store Particle Age?

Wed
12
Dec 2012

Particle effects are nice because they are simple and look interesting. Besides, coding them is fun, so I'm coding them again :) Particle systems can be stateful (when the parameters of each particle are calculated based on their previous values and the time step) or stateless (when the parameters are always recalculated from scratch using a fixed function and the current time). My current particle system is stateful.

Today a question came to my mind about how to store the age of a particle, so it can be deleted after some expiration time, determined by the emitter and unique for each particle. First, let's think for a moment about the operations we need to perform on this data. We need to: 1. increment the age by the time step, 2. check whether the particle has expired and should be deleted.

If that was all, the solution would be simple. It would be enough to store just one number, let's call it TimeLeft. Initialized to the particle lifetime duration at the beginning, it would work like this:

Step with dt: TimeLeft = TimeLeft - dt
Delete if: TimeLeft <= 0

But what if we additionally want to determine the progress of the particle's lifetime, e.g. to interpolate its color or other parameters depending on it? The progress can be expressed in seconds (or whatever time unit we use) or as a percentage (0..1). My first idea was to simply store two numbers, expressed in seconds: Age and MaxAge. Age would be initialized to 0 and MaxAge to the particle lifetime duration. Then:

Step with dt: Age = Age + dt
Delete if: Age > MaxAge
Progress: Age
Percent progress: Age / MaxAge

Looks OK, but it involves a costly division. So I came up with the idea of pre-dividing everything here by MaxAge, thus defining new parameters: AgeNorm = Age / MaxAge (which goes from 0 to 1 during the particle lifetime) and AgeIncrement = 1 / MaxAge. Then it gives:

Step with dt: AgeNorm = AgeNorm + dt * AgeIncrement
Delete if: AgeNorm > 1
Progress: AgeNorm / AgeIncrement
Percent progress: AgeNorm

This needs an additional multiplication during the time step and a division when we want to determine the progress in seconds. But as I consider progress as a percentage more useful than the absolute progress value in seconds, that's my solution of choice for now.

Comments | #math #rendering Share

# Half-Pixel Offset in DirectX 11

Fri
09
Nov 2012

There is a problem graphics programmers often have to face, known as the half-pixel/half-texel offset. In Direct3D 9, probably the most official article about it was Directly Mapping Texels to Pixels. In Direct3D 10/11 they changed the way it works, so it could be said that the problem is gone now. But the entry about SV_Position in the Semantics article does not explain it clearly, so here is my short explanation.

A pixel on a screen or a texel on a texture can be seen as a matrix cell, visualized as a square filled with some color. That's the way we treat it in 2D graphics, where we index pixels from the top-left corner using integer (x, y) coordinates.

But in 3D graphics, a texture can be sampled using floating-point coordinates, and interpolation between texel colors can be performed. Texture coordinates in DirectX also start from the top-left corner, but position (0, 0) means the very corner of the texture, NOT the center of the first texel! Similarly, position (1, 1) on a texture means its bottom-right corner. So to get exactly the color of the second texel of an 8 x 8 texture, we have to sample with coordinates (1.5 / 8, 0.5 / 8).

Now, if we have rendered a 3D scene onto a texture to perform e.g. deferred shading or some other screen-space postprocessing, and we want to redraw it to the target back buffer, how do we determine the coordinates for sampling this texture based on the position of the rendered pixel? There is a system value semantic available as pixel shader input called SV_Position, which gives the (x, y) pixel position. It is expressed in pixels, so it goes up to e.g. (1920, 1080) and not to (1, 1) like texture coordinates. But it turns out that in Direct3D 10/11, the value of SV_Position for the first pixels on the screen is not (0, 0), (1, 0), but (0.5, 0.5), (1.5, 0.5)!

Therefore, to sample a texture based on pixel position, it's enough to divide it by the screen resolution. There is no need to perform any half-pixel offset - to add or subtract (0.5, 0.5) - like it was done in Direct3D 9.

Texture2D g_Texture : register(t0);
SamplerState g_SamplerState : register(s0);

cbuffer Constants : register(b0) {
    float2 g_ScreenResolution;
};

void PixelShader(float4 Position : SV_Position, out float4 Color : SV_Target)
{
    Color = g_Texture.Sample(g_SamplerState, Position.xy / g_ScreenResolution);
}
}

Comments | #directx #rendering Share

# Flat Shading in DirectX 11

Fri
02
Nov 2012

I'm coding terrain heightmap rendering, and today I wanted to make it look like in Darwinia. After preparing an appropriate texture, it turned out that the normals of my mesh used for lighting are smoothed. That's not a surprise - after all, there is a single vertex in every particular place, shared by the surrounding triangles and referred to multiple times from the index buffer.

How to make the triangles flat shaded? Make every vertex unique to its triangle, so each can have a different normal? In Direct3D 9, the fixed function pipeline had a render state you could set: device->SetRenderState(D3DRS_SHADEMODE, D3DSHADE_FLAT); What about Direct3D 10/11?

A simpler solution is to use interpolation modifiers. Fields of the structure passed from the vertex shader to the pixel shader can have these modifiers to tell the GPU how to interpolate (or not interpolate) a particular value over the surface of a triangle. nointerpolation means the value will be taken from just one of the vertices. That solves the problem.

struct VS_TO_PS
{
    float4 Pos : SV_Position;
    nointerpolation float3 Normal : NORMAL;
    float2 TexCoord : TEXCOORD0;
};

Comments | #rendering #directx Share

# Developing Graphics Driver

Thu
12
Jul 2012

Want to know what I do at Intel? Obviously all the details are secret, but generally, as a Graphics Software Engineer, I code the graphics driver for our GPU. What does this software do? When you write a game these days, you usually use some game engine, but I'm sure you know that at a lower level, everything ends up as a bunch of textured 3D triangles rendered with hardware acceleration by the GPU. To render them, the engine uses one of the standard graphics APIs. On Windows it can be DirectX or OpenGL, on Linux and Mac it is OpenGL, and on mobile platforms it is OpenGL ES. On the other side, there are many hardware manufacturers - like NVIDIA, AMD, Intel or Imagination Technologies - that make discrete or embedded GPUs. These chips have different capabilities and instruction sets. So a graphics driver is needed to translate API calls (like IDirect3DDevice9::DrawIndexedPrimitive) and shader code into a form specific to the hardware.

Want to know more? Intel recently published the documentation of the GPU from the new Ivy Bridge processor - see this news. You can find this documentation on the intellinuxgraphics.org website. It consists of more than 2000 pages in 17 PDF files. For example, in the last volume (Volume 4 Part 3) you can see what the instructions of our programmable execution units look like. They are quite powerful :)

Comments | #intel #driver #rendering #hardware Share

# 4k Executable Graphics with Fractal

Fri
15
Jun 2012

There was a competition announced on forum.warsztat.gd in making a 4k executable graphics. In case you don't know what it is: it is a category well known on the demoscene where you have to write a program - a Windows executable in assembler, C, C++, Java, HTML or any other language - that fits into a 4 KB size limit and generates a static, nice-looking image. Obviously there is no way to store a bitmap inside such a size, so the graphics must be generated procedurally. For a list of the best prods in this category, see this search query on pouet.net.

I participated in this competition and here is my entry with source code: gfx_4k_Reg.zip. The EXE file itself has 1259 bytes. The archive has the password "demoscene" (needed because the server of my hosting company rejects archives containing unusual EXE files like this, thinking it is a virus :)

Not very impressive, but keep in mind that I coded it in about 2-3 hours, with little previous experience in this kind of coding. If you are interested in how such a program is made, I used:

Comments | #rendering #demoscene Share

# Direct2D Class Diagram

Sat
19
May 2012

I've made a class diagram for the Direct2D library. Enjoy :)

Direct2D Library Class Diagram

By the way, yEd is a free editor for all kinds of graphs, including UML diagrams like this one. It may not allow you to enter a list of fields and methods in classes like StarUML or Visio do. It may not allow you to freely draw shapes like Inkscape or OpenOffice Draw do. But for creating, editing and arranging graphs of different kinds of nodes, interconnected with different kinds of links, it is really great. I especially like the way it snaps all positions and sizes to the neighbouring nodes. The commands for automatic arrangement of nodes, called Layout, also look impressive. Plus, it supports different file formats, including XML-based ones, so you can easily generate such a file and then open it in yEd to visualize some data.

Comments | #yed #graphs #rendering #direct2d Share

# Direct2D - Unboxing

Mon
07
May 2012

Just as some bloggers record unboxings of products, I'd like to share my first impressions from playing around with the Direct2D library. What is it? As you can guess from its name, it is a library from Microsoft for efficient, hardware-accelerated drawing of immediate-mode 2D graphics. It is the successor of the old GDI, as well as GDI+. It is a native, object-oriented library based on COM. It works on Windows 7, as well as Windows Vista (after installing SP2 and the Platform Update) and Windows 2008. The API seems modern, clean and very elegant, much like Direct3D 10/11 - but of course that could just be a matter of taste. You can find its documentation, as well as headers and libraries, in the DirectX SDK. The docs say something about the Windows 7 SDK, so it's probably there too.

What can it interop with? You can create a render target that will send the graphics to either a window (using an HWND), a GDI canvas (HDC), a DXGI surface (interop with Direct3D 10/11) or a WIC bitmap. Interop with GDI is also possible in the opposite direction - you can obtain an HDC from a D2D render target. Microsoft advises using WIC (Windows Imaging Component) for loading bitmap files. Another important library that cooperates with Direct2D is DirectWrite, which enables text rendering.

I didn't find any good tutorial about Direct2D, but I haven't looked for one very hard either. I think the official documentation is good enough to teach all aspects of the library. After reviewing the documentation, I'm very impressed not only by the looks of the API, but also by the list of features it supports, like:

Things I don't like so far about the library:

My first test, drawn using the following code:

using namespace D2D1;
const D2D1_SIZE_F rtSize = m_RenderTarget->GetSize();
m_RenderTarget->BeginDraw();

m_RenderTarget->Clear(ColorF(ColorF::White));

m_Brush->SetColor(ColorF(ColorF::Silver));
const float gridStep = 24.f;
for(float x = gridStep; x < rtSize.width; x += gridStep)
    m_RenderTarget->DrawLine(Point2F(x, 0.f), Point2F(x, rtSize.height), m_Brush, 1.f);
for(float y = gridStep; y < rtSize.height; y += gridStep)
    m_RenderTarget->DrawLine(Point2F(0.f, y), Point2F(rtSize.width, y), m_Brush, 1.f);

m_Brush->SetColor(ColorF(ColorF::Navy));
m_RenderTarget->DrawRectangle(RectF(200.f, 400.f, 400.f, 550.f), m_Brush);
m_RenderTarget->DrawRectangle(RectF(240.f, 450.f, 280.f, 500.f), m_Brush);
m_RenderTarget->DrawRectangle(RectF(320.f, 450.f, 360.f, 550.f), m_Brush);
m_RenderTarget->DrawLine(Point2F(200.f, 400.f), Point2F(300.f, 300.f), m_Brush, 1.f);
m_RenderTarget->DrawLine(Point2F(300.f, 300.f), Point2F(400.f, 400.f), m_Brush, 1.f);

m_RenderTarget->EndDraw();

Comments | #rendering #direct2d Share


Copyright © 2004-2020