I reckon lower reliability in hardware these days is partly down to the components (and programming methods) being more computer/OS-like in the first place, with the OS itself often more complicated than in older gear, but it's also very much to do with the whole approach to firmware/software development.
Nowadays you can release with a "good enough" build and say you'll bug fix and add features later. Of course that ends up meaning bugs hang around like a bad smell for longer than they should, and rare or hard-to-reproduce bugs get left in because of the effort they would take to track down in internal QA, when it's easier to do with a much larger pool of testers: the customers using the product.
In the old days, when the firmware was locked down before it left the factory, it had to be as right as you could possibly make it because you couldn't fix it later. You could still make sure the OS and firmware are rock solid - and that's easier to do on a fixed hardware configuration than on a PC or Mac - but it would most likely mean longer development, fewer features, or a higher retail price. Imperfect reliability, where it occurs, exists because the manufacturer considers it an acceptable compromise.