Why isn't newer firmware better?
May 26, 2008 1:13 AM

Quite often when debugging hardware, such as a router, a recommended solution is to downgrade the firmware to a version that is generally agreed to be the most "stable". What is it about firmware that often leads to the latest released firmware not being the best firmware?

This also happens with hardware drivers. Sometimes the recommended solution to a video problem in a game is to downgrade the video card driver to a previous version. I have always assumed that this is due to a lack of widespread compliance with some standardised interface. That is, one game might rely on a certain non-standard behaviour of a video card, while another game might rely on the opposite, "correct" behaviour. Does something similar happen with firmware, or are there other reasons?

I appreciate that this sometimes happens with general software too, but I feel like it's significantly more common with firmware. Am I wrong about that?
posted by hAndrew to Computers & Internet (14 answers total)
 
Firmware is much harder to debug. The platform you're writing the code on is not the platform it runs on, so you have to build your troubleshooting output into the firmware itself (blinking LEDs!). On something like a video card this can be very difficult: you need to write code just to send debug messages back to the main computer, and that code can itself have issues, and so on.
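
For example, on a bare-metal target your entire "debugger" might be a toy C sketch like this (the GPIO register address and pin number are invented for illustration; a real board's datasheet gives the actual values):

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO output register. The address and
     * pin number below are made up; look yours up in the datasheet. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)
    #define LED_PIN  5u

    static void delay(volatile uint32_t n) { while (n--) { } }

    /* Blink an error code: no printf, no debugger, just count flashes. */
    static void blink_error(uint32_t code)
    {
        for (uint32_t i = 0; i < code; i++) {
            GPIO_OUT |=  (1u << LED_PIN);   /* LED on  */
            delay(500000);
            GPIO_OUT &= ~(1u << LED_PIN);   /* LED off */
            delay(500000);
        }
    }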
posted by phrontist at 1:27 AM on May 26, 2008


(I should mention that for something as sophisticated as a video card, a lot of emulation is used in the development process)
posted by phrontist at 1:29 AM on May 26, 2008


Like phrontist said, firmware is much, much harder to debug. And all software tends to have bugs that only show up after extensive use, so some bugs might surface months after the firmware has been released. For example, the Tomato firmware for WRT routers shipped with a buggy version of Busybox whose problems weren't apparent at first, and only showed up after loads of users had run it on their routers for a while.

Same with hardware drivers, which tend to have bugs that only show up in certain rare circumstances; the same goes for hardware itself, like the Pentium FDIV floating-point bug.
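
A classic latent bug of exactly this kind is a timer wraparound. A sketch in C (millis() here is a stand-in I've made up for whatever 32-bit millisecond uptime counter the firmware uses; such a counter wraps to zero after about 49.7 days):

    #include <stdint.h>

    /* Stand-in for the firmware's millisecond uptime counter. */
    extern uint32_t millis(void);

    /* Buggy: passes every test that lasts less than 49.7 days. Once
     * the counter wraps, a deadline computed before the wrap looks
     * like it's "in the future" again, and the timeout silently never
     * fires on long-running routers. */
    int timeout_expired_buggy(uint32_t deadline)
    {
        return millis() >= deadline;
    }

    /* Correct: unsigned subtraction is immune to the wrap. */
    int timeout_expired(uint32_t start, uint32_t interval_ms)
    {
        return (uint32_t)(millis() - start) >= interval_ms;
    }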
posted by cyanide at 1:50 AM on May 26, 2008


In my experience, this is more frequently true of open source firmware than closed source firmware, and also more frequently true of open source software than closed source software.
posted by Your Time Machine Sucks at 3:18 AM on May 26, 2008


Programs that fail only in extreme situations are typically tripping over edge cases or corner cases.

Programmers try to test a significant number of edge and corner cases, but sometimes a case simply isn't thought of. And sometimes the conditions needed to trigger a given corner case are so obscure that an engineer never stumbles upon them in testing, and so never considers them.
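
A tiny, fabricated illustration of how a corner case hides (not from any real product):

    #include <stdio.h>
    #include <string.h>

    /* Stores a config value in a fixed buffer. Every input the
     * engineer happened to test was 15 characters or fewer, so this
     * passed QA. Feed it a value of exactly 16 characters and the
     * terminating '\0' lands one byte past the end of buf. */
    void store_value(const char *value)
    {
        char buf[16];
        if (strlen(value) <= sizeof(buf)) {   /* off by one: should be < */
            strcpy(buf, value);               /* 16 chars + '\0' = 17 bytes */
            printf("stored: %s\n", buf);
        }
    }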

You'll definitely see this in software a lot as well. Consider patches to Firefox. I've seen an update come down with another following it almost the next day (and in fact, I think that was the quickest turnaround) because a major security flaw was revealed only after it was pushed out en masse. Because software allows for rapid development and deployment, and because it's not really possible to "brick" software, developers will typically just increment the version, push the fix, and away you go.

When developing firmware, the "set it and forget it" mantra has to be considered. Incrementing and shipping a fix for an issue in a recently released firmware version can take considerably longer, both while the original problem is investigated and, especially, while the fix is checked to make sure it doesn't cause any additional problems. It's almost like chaos theory: any small change can ripple into other parts of the code as a negative externality, so something you never intended to touch may behave unexpectedly under certain conditions. Again, this isn't limited to hardware drivers and firmware; it's just more visible when it occurs there, and the need to be considerably more careful is ever-present because of the potential to brick a device.
posted by disillusioned at 3:39 AM on May 26, 2008


In my experience, this is more frequently true of open source firmware than closed source firmware, and also more frequently true of open source software than closed source software.

Ha, ha, ha. I'm glad you said in your experience. The closed source issues are often hidden, or just plain not supported. When 64-bit desktop CPUs shipped there were heaps of driver glitches. Open source got drivers out fast in most cases, even if they were somewhat unstable; better to have a solution for 80% or even 50% of users than nothing at all, which is what you get while waiting for commercial vendors to produce a version they think they can cost-effectively support.

As to why it's more often drivers and firmware: I think it's because drivers (think graphics cards) are really pushing the envelope on performance, while a new spreadsheet can be more conservative in how aggressively it behaves. For example, there are undocumented and unsupported ways to interact with hardware. I seem to remember cases of drivers that wouldn't properly zero out memory after releasing it, because doing so would waste clock cycles and was unnecessary for that iteration of hardware/OS/games. Later on, an OS upgrade or similar suddenly breaks that trick.
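
A made-up sketch of the kind of shortcut I mean (this is not real driver code):

    #include <stdlib.h>
    #include <string.h>

    enum { BUF_SIZE = 4096 };

    /* The "fast" path: skip scrubbing the buffer on release, because
     * on this generation of hardware/OS every client overwrites the
     * whole buffer before reading it, and the memset costs cycles. */
    void release_buffer_fast(unsigned char *buf)
    {
        /* memset(buf, 0, BUF_SIZE);    "wasteful", so it was removed */
        free(buf);
    }

    /* Then a later OS or application revision starts reading buffers
     * it hasn't fully written, and stale data leaks through an
     * interface that "worked fine" for years. The safe version
     * scrubs before handing the memory back. */
    void release_buffer_safe(unsigned char *buf)
    {
        memset(buf, 0, BUF_SIZE);
        free(buf);
    }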
posted by bystander at 4:14 AM on May 26, 2008


What is it about firmware that often leads to the latest released firmware not being the best firmware?

The latest firmware has not been used by bajillions of users in real-world situations.
posted by zippy at 4:19 AM on May 26, 2008


Sturgeon's law
posted by pompomtom at 5:27 AM on May 26, 2008


Most of the time, development was done against the prior version of the firmware (say, for video games). Developers sometimes get a new release before you do... and often they don't. Then suddenly there's a flood of calls from people who upgraded the hardware before the developers had tested their software against it.
posted by filmgeek at 6:56 AM on May 26, 2008


I haven't read any of the responses, but here's the deal:

1. If it ain't broke don't fix it.

2. If it's broke, change it.

You can't upgrade to a future release that doesn't exist, but you can downgrade to a previous version to see if that fixes your problem.
posted by thomas144 at 7:27 AM on May 26, 2008


Stupid outsourcing. When a company's primary motivation for moving development work offshore is short-term financial savings, quality usually takes a big hit.

I was with a small company that wrote network management software that supported a large number of such devices (wireless access points, in this case, though many of the vendors we worked with built routers, too). We noticed a drop in firmware quality among some small- and mid-sized vendors after they had moved their firmware development offshore. New firmware releases would add more new bugs than new features, with our company doing involuntary QA. That also put us in the position of telling our customers to avoid certain firmware versions.
posted by dws at 8:05 AM on May 26, 2008


Ha, ha, ha. I'm glad you said in your experience. The closed source issues are often hidden, or just plain not supported. When 64-bit desktop CPUs shipped there were heaps of driver glitches. Open source got drivers out fast in most cases, even if they were somewhat unstable; better to have a solution for 80% or even 50% of users than nothing at all, which is what you get while waiting for commercial vendors to produce a version they think they can cost-effectively support.

Correct, bleeding-edge OSS will often have n%-working elements, while the equivalent closed-source software is often not released in that state, because the closed source company has to give support for that average-user-confusing situation and it would be expensive. I'm extremely pro-OSS, but I've installed plenty of open and closed router firmware, and in my experience, you roll back open source firmware to troubleshoot weirdness, while you live with the shortcomings of closed source firmware in order to not have weirdness. This applies to routers which have been out for a while; in my experience, early-version firmware of any kind is likely to suck pretty hard.
posted by Your Time Machine Sucks at 8:08 AM on May 26, 2008


Bad programming practices and bad assumptions:

- the firmware people screwed up with the new release, assuming that one change won't affect another. They forget the workaround they put into version one of the code to get around a compiler bug. Now they've got an updated compiler, and nobody re-checks the old code because it already works; the guy who wrote the workaround never documented it, or nobody checked the documentation. (Something like the sketch after this list.)

- other software has invoked its own workarounds for someone else's screwup. So when that someone fixes their mistake, your code gets messed up.
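
To make the first point concrete, here's a fabricated example; the compiler bug and version are invented:

    #include <stdint.h>

    /* Workaround: with compiler v1.2, the obvious "sum += data[i]"
     * loop miscompiled at -O2, so the code funnels the sum through a
     * volatile temporary to defeat the broken optimisation.
     *
     * Nobody documented why the volatile is there. When the compiler
     * gets updated, nobody re-checks this code because it already
     * works... until a later change interacts badly with the
     * undocumented hack, or someone "cleans it up" on a toolchain
     * that still needs it. */
    uint32_t checksum(const uint8_t *data, uint32_t len)
    {
        volatile uint32_t tmp = 0;
        uint32_t sum = 0;

        for (uint32_t i = 0; i < len; i++) {
            tmp = sum + data[i];   /* volatile round-trip: the workaround */
            sum = tmp;
        }
        return sum;
    }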

Thomas144 is right. I *hate* it when people blindly say "upgrade the firmware" to solve every problem under the sun. If it worked last week, figure out what changed/broke. Don't add more complexity to the system.
posted by gjc at 11:37 AM on May 26, 2008


There's also something of a "they haven't released new firmware in 3 months - OMFG, it's unsupported and out of date!" attitude prevalent, particularly amongst early-adopter users. This leads to manufacturers releasing what is effectively broken, half-tested alpha- or beta-level firmware, with incremental upgrades hyped as major "must-have" features in order to placate users.

Once one manufacturer starts doing this, all must do it, so they can be seen to be up with or ahead of their rivals. It's become almost expected that we do their testing for them. This is particularly common amongst video card manufacturers, though it's also becoming common in stand-alone devices. For example, I'm sticking with May 2005 firmware (with a couple of minor, non-critical cosmetic bugs) on my free-to-air PVR, because everything before it was still a bit flakey and lacked a couple of features, while everything since has at least one show-stopper bug (e.g. failing to record timers, 0-length files, remote control bugs, etc.). Nevertheless, people still must have the latest firmware, and whinge when there hasn't been a release in the last few months...

(Recently the local Pay-TV company did a forced firmware upgrade - in the middle of prime-time, no less! - that broke a large number of boxes. And still, people were defending them because they now had the latest and greatest firmware!)
posted by Pinback at 4:15 PM on May 26, 2008

