Debugging horror
Programs written to process the pixels of an image are among the last things you would ever want to debug. There is just no simple way of debugging them. What makes it really tough is that your program touches every single pixel (assuming you wrote a program which processes each pixel), and when something goes wrong you don't know which of the 360x240 = 86,400 pixels (assuming you are working on 360-by-240 images) is giving you the trouble. It's simpler if you are working on a single image, since you can try to pinpoint the offending pixel visually, but imagine working on a video where 10 frames come in every second: now multiply that 86,400 by 10, making it even worse. On a 10-second video, you are pretty much screwed trying to figure out what the heck is going wrong.
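One thing that takes some of the sting out of this is instrumenting the per-pixel loop so the program itself tells you which frame and which pixel broke, instead of leaving you to stare at the output. Here is a minimal sketch of that idea; the frame size, the toy `process_pixel` routine, and the deliberately bad pixel are all made up for illustration.

```python
# Hypothetical sketch: guard a per-pixel routine so that, when a value goes
# out of range, we learn exactly which frame and which pixel caused it,
# instead of hunting through 86,400 pixels per frame by eye.

WIDTH, HEIGHT = 360, 240

def process_pixel(value):
    # Placeholder for whatever per-pixel computation the real program does.
    return value * 2  # e.g. a contrast stretch that can overflow

def process_frame(frame, frame_index):
    for y in range(HEIGHT):
        for x in range(WIDTH):
            result = process_pixel(frame[y][x])
            # Guard: report the exact frame and pixel the moment something
            # goes wrong, rather than letting a bad value propagate silently.
            if not (0 <= result <= 255):
                raise ValueError(
                    f"frame {frame_index}, pixel ({x}, {y}): "
                    f"input {frame[y][x]} produced out-of-range value {result}"
                )
            frame[y][x] = result
    return frame

if __name__ == "__main__":
    # A synthetic 360x240 "frame" with one deliberately bad pixel.
    frame = [[100] * WIDTH for _ in range(HEIGHT)]
    frame[120][42] = 200  # 200 * 2 = 400, outside the 0-255 range
    process_frame(frame, frame_index=0)
```

The check costs a little speed per pixel, but on a misbehaving video it turns "somewhere in 10 seconds of frames" into a single frame number and coordinate.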
These are some of the horrors of writing machine vision programs. Some day you are bound to get an error of this nature, and then it's just you and your intuition.
I proclaim that debugging is an art. It is an indispensable skill for any programmer. Computer science courses at universities should formally set out some debugging principles and spend a substantial amount of time on good debugging practices. I have never come across a single book on debugging. Perhaps it's time for someone to start writing one.