The Importance of Good Debugging Tools


Thu, 28 May 2015

Robert L. Glass wrote an interesting article, "Frequently Forgotten Fundamental Facts about Software Engineering". While I am aware that he is far more experienced than me and has sources to back up his opinions, I dare to disagree with the following paragraph:

T1. Most software tool and technique improvements account for about a 5- to 30-percent increase in productivity and quality. But at one time or another, most of these improvements have been claimed by someone to have "order of magnitude" (factor of 10) benefits. Hype is the plague on the house of software.

I think that having good tools is one of the most important (and sometimes underrated) factors in a programmer's efficiency. From a comfortable chair, through a fast desktop PC, big monitors, and a fast Internet connection, to good software, including an IDE (editor, compiler, debugger, etc.) and auxiliary applications (like Total Commander for file management) - they can all make a big difference in how developers feel about their job and how fast the work goes.

Of course, "tools" is a broad term. If by choosing good tools we mean changing the programming language or libraries in the middle of a project, then sure, that is usually a bad idea. Writing a script to automate a task that can be done manually in just a few minutes, or even an hour, is usually not worth it either.

But what is always worth investing time and money in - whether buying them, developing your own, or learning to use them - are tools that help with debugging. Debugging is the hardest and most time-consuming part of programming, so any improvement in that process can make a difference as big as the one between minutes and weeks - often at the most critical time, close to the deadline.

So in my opinion, being able to interactively debug an executing program (or a trace of it) - to set a breakpoint and preview the current values of variables (as opposed to relying only on debug text prints, analyzing logs, or command-line tools) - is the absolute minimum for calling any programming environment reasonable, mature, and eligible for writing serious software. If you are the one who wrote a program and you still have to treat it as a black box while it is running, with no way to peek inside, then something is wrong. AFAIK that is the case with some technologies that deploy a program to a server or an embedded device.
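
To illustrate the difference with a trivial, hypothetical C++ sketch of mine (the data, the function, and the MSVC-specific __debugbreak() intrinsic are all just assumptions for the example): with only text output you reconstruct the program state from whatever you remembered to print, while with an interactive debugger you can stop at the suspicious moment and inspect everything.

    #include <cstdio>
    #include <vector>
    #include <intrin.h>   // for __debugbreak() on MSVC

    struct Object { int id; };           // hypothetical data
    std::vector<Object> g_Objects;

    void UpdateFrame(unsigned frameIndex)
    {
        // "Black box" style: sprinkle prints and reconstruct what happened later.
        printf("frame=%u objectCount=%u\n", frameIndex, (unsigned)g_Objects.size());

        // Interactive style: stop right here and inspect any variable you like.
        if (g_Objects.empty())
            __debugbreak();              // breaks into an attached debugger
    }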

How do things look in graphics programming? John Carmack once tweeted:

gl_FragColor.x = 1.0; is the printf of graphics debugging. Stone knives and bearskins.
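
To make the comparison concrete, here is roughly what such "printf debugging" looks like in practice - a sketch of mine, not from the tweet, assuming a hypothetical GLSL fragment shader (embedded as a C++ string) where we suspect the interpolated normal is wrong:

    // Hypothetical example: instead of inspecting the variable in a debugger,
    // we remap it into the output color and judge its value by looking at pixels.
    const char* debugFragShaderSrc = R"(
        varying vec3 v_normal;              // the value we want to "print"

        void main()
        {
            // Components of a unit vector are in [-1, 1]; remap them to [0, 1]
            // so they show up as visible red/green/blue intensities.
            gl_FragColor = vec4(v_normal * 0.5 + 0.5, 1.0);
        }
    )";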

While I share his frustration with such primitive methods of debugging, the reality of today's GPU programming is not so bad that rendering red pixels is the only way to see any debug information. NVIDIA Nsight offers GPU debugging, and so do the Intel Graphics Performance Analyzers (GPA). Finally, there is a vendor-agnostic debugger for DirectX developed by Microsoft. PIX originated on the Xbox (that is where its name comes from - Performance Investigator for Xbox) and used to be available for free as a standalone application. Recently, they have integrated it into Visual Studio as a feature called Graphics Diagnostics. It was available only in the commercial versions of Visual Studio and not in the free Express edition (very bad news for all amateur DirectX game developers), but they finally shipped it for free together with the new, fully-functional Visual Studio Community edition.

With these tools, there is no excuse for whining "nothing renders and I don't know why" - just go and debug it! :) The future also looks bright. DirectX 12 will offer a Debug layer. Vulkan also promises to support debugging tools and layers, and the LunarG company has even started working on such a tool - GLAVE.
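
For DirectX 12, enabling that layer amounts to a few lines of code. Below is a minimal sketch of mine (not from the article), assuming the d3d12.h headers and WRL's ComPtr; with the debug layer active, incorrect API usage is reported as readable messages instead of silent misrendering.

    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")

    using Microsoft::WRL::ComPtr;

    void EnableD3D12DebugLayer()
    {
        ComPtr<ID3D12Debug> debugController;
        if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debugController))))
        {
            // Must be called before D3D12CreateDevice for the layer to take effect.
            debugController->EnableDebugLayer();
        }
    }

Vulkan takes a similar approach: its validation and debugging functionality is exposed as layers that the application opts into when creating the instance.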
