Do you develop your apps for the least common denominator, or do you set absolute minimum requirements that users must meet before they can install your app?
I have an app that is installed in roughly 750 locations. Of those, nearly two-thirds of the users are still running Windows 98. Naturally, whenever I work on enhancements (and bug fixes) for this app, I have to develop to the least common denominator (in this case, Windows 98). As you can imagine, given the hardware most of these users have, I also have to be careful about memory, hard drive space, and dial-up connections versus cable/DSL.
Some developers I've worked with insist that the absolute minimum OS for their software is Windows 2000, preferably Windows XP, and ideally Windows XP Professional. I dunno. If I took that approach, I'd lose over 500 customers for this one app alone.
On the other hand, dropping support for the app on Windows 98 would save me a ton of time on testing. One less OS to worry about and, I have to admit, sometimes it would be nice to say, "You're running Windows 98? Sorry. Not supported on Windows 98."
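If you did draw that line, the installer could enforce it with a simple version cutoff. Here's a minimal sketch of the idea in Python; the cutoff value and the `is_supported` helper are my own illustration, not anyone's actual installer code. (Windows reports its version as major.minor: Windows 98 is 4.10, Windows 2000 is 5.0, and XP is 5.1.)

```python
# Hypothetical install-time check: refuse to install on anything
# older than an assumed cutoff of Windows 2000 (version 5.0).
MIN_SUPPORTED = (5, 0)

def is_supported(major, minor):
    """Return True if the reported Windows version meets the cutoff."""
    return (major, minor) >= MIN_SUPPORTED

# Windows 98 reports 4.10, so it fails the check;
# Windows 2000 (5.0) and XP (5.1) pass.
if not is_supported(4, 10):
    print("You're running Windows 98? Sorry. Not supported.")
```

In a real installer you'd get the version from the OS itself rather than hard-coding it, but the decision logic is just this one comparison.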
When I look at Microsoft's Product Lifecycle dates, I see that "Paid incident support is now available through 30-Jun-2006" for Windows 98, and I wonder if I should follow the same kind of timeline with the apps I'm developing today.
Do you use Microsoft's Product Lifecycle in determining your minimum requirements, or are your minimum requirements simply based on what's needed by the individual app?