Tag: gpu

Entries for tag "gpu", ordered from most recent. Entry count: 12.

Warning! Some information on this page is now more than 5 years old. I keep it for reference, but it probably doesn't reflect my current knowledge and beliefs.

Pages: > 1 2

# Lower-Level Graphics API - What Does It Mean?

Sat 06 Jun 2015

They say that the new, upcoming generation of graphics APIs (like DirectX 12 and Vulkan) will be lower-level, closer to the GPU. You may wonder what exactly that means, or what its purpose is. Let me explain it with a picture that I made a few months ago and have already shown during two of my presentations.

Row 1: Back in the early days of computer graphics (like on the Atari or Commodore 64), there were only applications (green rectangle), communicating directly with the graphics hardware (e.g. by setting hardware registers).

Row 2: Hardware and software became more complicated. Operating systems started to separate applications from direct access to hardware. To make applications work on the variety of devices available on the market, some standards had to be defined. Device drivers appeared as a separate layer (red rectangle).

A graphics API (Application Programming Interface), like every interface, is just a means of communication - a standardized, documented definition of functions and other constructs that are used on the application's side and implemented by the driver. The driver translates these calls into commands specific to the particular hardware.

Row 3: As games became more complex, it was no longer convenient to call the graphics API directly from game logic code. Another layer appeared, called the game engine (yellow rectangle). It is essentially a comprehensive library that provides higher-level objects (like an entity, asset, material, camera, or light) and implements them (in its graphical part) using the lower-level objects and commands of the graphics API (like a mesh, texture, or shader) - see the sketch below.
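
To make that layering concrete, here is a minimal, hypothetical sketch (the class and member names are mine, not taken from any real engine), assuming Direct3D 11 as the underlying API: the game code thinks in terms of a "material", while the engine's implementation of it holds and binds raw graphics API objects.

```cpp
#include <d3d11.h>

// Hypothetical engine-level object: game code sets a "material" on an entity,
// the engine translates that into graphics API bindings.
class Material
{
public:
    // Translate the high-level concept into lower-level API calls.
    void Bind(ID3D11DeviceContext* ctx) const
    {
        ctx->VSSetShader(m_VertexShader, nullptr, 0);
        ctx->PSSetShader(m_PixelShader, nullptr, 0);
        ctx->PSSetShaderResources(0, 1, &m_AlbedoTexture);
        ctx->PSSetSamplers(0, 1, &m_Sampler);
    }

private:
    // Lower-level graphics API objects hidden behind the engine abstraction.
    ID3D11VertexShader*       m_VertexShader  = nullptr;
    ID3D11PixelShader*        m_PixelShader   = nullptr;
    ID3D11ShaderResourceView* m_AlbedoTexture = nullptr;
    ID3D11SamplerState*       m_Sampler       = nullptr;
};
```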

Row 4: This is where we are now. Games, as well as game engines, constantly become more complex and expensive to make. Fewer and fewer game development studios build their own engine technology; more prefer to use existing, universal engines (like Unity or Unreal Engine) and just focus on gameplay. These engines have recently become available for free and on very attractive licenses, so this trend affects AAA as well as indie and amateur game developers.

Graphics drivers have become incredibly complex programs as well. You may not see it directly, but just take a look at the size of their installers. They are not games - they don't contain tons of graphics and music assets. So guess what is inside? A lot of code! They have to implement all the APIs (DirectX 9, 10, 11, OpenGL). In addition, these APIs have to be backward compatible and do not necessarily reflect how modern GPUs work, so the additional logic needed for that can introduce performance overhead or contain bugs.

Row 5: The future, with the new generation of graphics APIs. Note that the total width of the bars is not smaller than in the previous row. (Maybe it should be a bit smaller - see the comment below.) That is because, according to the concept of accidental complexity and essential complexity from the famous essay No Silver Bullet, the work that is really necessary has to be done somewhere anyway. So a lower-level API just means that the driver can be smaller and simpler, while the upper layers take on more responsibility for managing things manually instead of relying on automatic facilities provided by the driver (for example, there is no longer a DISCARD or NOOVERWRITE flag when mapping a resource in DirectX 12). It also means the API is again closer to the actual hardware. Thanks to all that, GPU usage can be optimized better, because the engine knows all the higher-level details about the specific application.
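
As an illustration of that shift of responsibility, here is a rough sketch (simplified, with a hypothetical UploadRing helper; resource creation and error handling omitted) of updating a dynamic buffer: in Direct3D 11 the DISCARD flag asks the driver to handle buffer renaming behind the scenes, while in Direct3D 12 the application keeps its own set of upload buffers and synchronizes with a fence manually.

```cpp
#include <windows.h>
#include <d3d11.h>
#include <d3d12.h>
#include <cstring>

// D3D11: the driver manages buffer versioning for us. DISCARD says
// "give me a fresh copy, keep the old one alive as long as the GPU needs it".
void UpdateBufferD3D11(ID3D11DeviceContext* ctx, ID3D11Buffer* buf,
                       const void* data, size_t size)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    ctx->Map(buf, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);
    memcpy(mapped.pData, data, size);
    ctx->Unmap(buf, 0);
}

// D3D12: no DISCARD flag - the application keeps several copies of the buffer
// itself (a hypothetical UploadRing) and uses a fence to make sure the copy it
// is about to overwrite is no longer in use by the GPU.
struct UploadRing
{
    static constexpr UINT kFrameCount = 3;
    ID3D12Resource* buffers[kFrameCount] = {};     // one upload-heap buffer per frame in flight
    UINT64          fenceValues[kFrameCount] = {}; // fence value signaled when that frame completed
    UINT            current = 0;
};

void UpdateBufferD3D12(UploadRing& ring, ID3D12Fence* fence, HANDLE fenceEvent,
                       const void* data, size_t size)
{
    const UINT i = ring.current;

    // Manual synchronization: wait until the GPU has finished with this copy.
    if (fence->GetCompletedValue() < ring.fenceValues[i])
    {
        fence->SetEventOnCompletion(ring.fenceValues[i], fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }

    void* mapped = nullptr;
    D3D12_RANGE noRead = { 0, 0 }; // we only write, never read this memory
    ring.buffers[i]->Map(0, &noRead, &mapped);
    memcpy(mapped, data, size);
    ring.buffers[i]->Unmap(0, nullptr);

    ring.current = (i + 1) % UploadRing::kFrameCount;
}
```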

The question is: will that make graphics programming more difficult? Yes, it will, but these days it mostly affects the small group of programmers working directly on game engines or just passionate about this stuff (like myself), not the rest of game developers. Similarly, there may be a concern about potential fragmentation. Time will show which APIs are more successful than others, but if none of them becomes a standard across all platforms (Vulkan is a good candidate) and GPU/OS vendors succeed in convincing developers to use their platform-specific ones, that too will complicate life only for those engine developers. Successful games have to be multiplatform anyway, and modern game engines do a good job of hiding many differences between platforms, so they can do the same with graphics.

Comments | #gpu #rendering #directx Share

# Lectures on ETI, Gdańsk University of Technology

Thu 08 Jan 2015

Employees of Intel Technology Poland are visiting Gdańsk University of Technology, Faculty of Electronics, Telecommunications and Informatics (known as ETI). On Thursdays - 8, 15, and 22 January 2015 - there will be lectures as part of the "Computer Graphics" course. Time: 11:15 - 13:00, place: the new ETI building, room NE AUD1L. It's a lecture for computer science students, but anyone who is interested can come and listen.

Together with Piotr Kozioł, I will be presenting on January 22nd. Our presentation is titled "Shaders and their compilation".

During the 2 hours we will cover lots of topics - basically everything that happens to a shader after it is written in a high-level language and passed to the graphics API: how it is processed by the driver and executed by the GPU.
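
As a small taste of that journey, here is a minimal sketch, assuming Direct3D 11 and the d3dcompiler library (function names other than the API calls are mine): the HLSL source is first compiled to hardware-independent bytecode, and only that bytecode is handed to the driver, which translates it further into the GPU's native instructions.

```cpp
#include <d3d11.h>
#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

// Step 1: compile HLSL source to DXBC bytecode - hardware-independent.
ID3DBlob* CompilePixelShader(const char* source, size_t sourceSize)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    D3DCompile(source, sourceSize, nullptr, nullptr, nullptr,
               "main", "ps_5_0", 0, 0, &bytecode, &errors);
    if (errors) { errors->Release(); } // real code would inspect the messages
    return bytecode;
}

// Step 2: hand the bytecode to the driver, which translates it into the
// GPU-specific instruction set - the part the lecture looks at more closely.
ID3D11PixelShader* CreatePixelShader(ID3D11Device* device, ID3DBlob* bytecode)
{
    ID3D11PixelShader* shader = nullptr;
    device->CreatePixelShader(bytecode->GetBufferPointer(),
                              bytecode->GetBufferSize(), nullptr, &shader);
    return shader;
}
```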

Comments | #events #teaching #intel #gpu Share

# What do we have from benchmarks?

Sun 27 Apr 2014

There was a case some time ago of some graphics vendors cheating in a Futuremark benchmark (see this). They basically detected this particular application and raised the clock frequency to increase performance and gain a higher score. As a result, some devices were delisted from the Best Mobile Devices list for cheating, and Futuremark published this document: Benchmark Rules and Guidelines.

My first thought was: good, they just want everyone to play fair. But then I read the rules again, especially this one: "The platform may not replace or remove any portion of the requested work even if the change would result in the same output." And I said: wait, what? Isn't that a generic definition of every optimization? If a developer writes 2+2 in GLSL and the platform just uses 4, is it cheating because it removed requested work (the addition, in this case), even though the result is the same?

And then I started thinking: what do we get from benchmarks after all? Is their importance a good thing for gamers and other customers of graphics technology? In theory, benchmarks should mimic some aspect of real applications to measure and compare how different hardware performs in that type of application (e.g. games). But it may be that decision makers just want to see good scores in benchmarks (bosses generally like numbers and bars and graphs :), so engineers implement optimizations or even cheats just for these benchmarks. And then the media notice it, devices get delisted, benchmark creators write such rules... and gamers just want to play games.

If performance were measured only in real games, and platform vendors optimized or even cheated for a particular title, then at least we would get a better-performing game. Just my personal opinion :)

Comments | #hardware #gpu Share

# Recent Demoscene Parties

Sun 04 Oct 2009

There have been several interesting demoscene parties recently. The first one was RiverWash in Warsaw, Poland (I was there myself :) The next one was Function in Budapest, Hungary. And finally, today was the end of MAiN in Arles, France. After each party it's nice to download the prods from the compos (productions from competitions) from the Pouet.net website (see the prods from this year's RiverWash, Function and Main). Demos and intros are worth watching!

A different topic: the new Nvidia Fermi architecture looks very promising. Just have a look at the White Paper PDF and watch Next Generation GPU Fluids on YouTube. This architecture is going to be released soon as the GT300 graphics card. Intel Larrabee is no longer a revolutionary idea :)

By the way, the whole GPU Technology Conference (San Jose, USA, Sep 30 - Oct 2) looks like an interesting event. I wonder if any of the papers will be available for download...

Comments | #demoscene #gpu Share

