November 29, 2011
Common mistakes as observed in the field
This series of posts aims to share our experience. We have detailed our own approach and how we deal with the different dimensions internally as an ISV.
But since we also have field experience advising software companies, I thought it would be worth gathering up a list of common mistakes we have observed that can severely impact the ability to master the product release management process.
- Linking product versions to a particular customer: In the field, we see many companies claiming to be ISVs but in fact strongly tying product versions to one or a few customers. In our experience, it is very hard to keep a genuinely product-oriented approach when the customer connection is too strong. It often ends up as a custom development that is too specific to that customer's context.
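One way to avoid drifting into per-customer forks is to keep a single product code line and push customer-specific behavior into configuration data. The sketch below is purely illustrative (the settings and function names are invented for this example):

```python
# Hypothetical sketch: one product code line, customer variation as data.
# Instead of branching the product per customer, customers supply overrides
# that are merged onto the product's own defaults.

CORE_DEFAULTS = {
    "invoice_rounding": "bankers",
    "max_attachments": 10,
}

def build_settings(customer_overrides):
    """Merge one customer's overrides onto the shared product defaults.

    The product ships as a single version; customers differ only in
    configuration, never in code.
    """
    settings = dict(CORE_DEFAULTS)
    settings.update(customer_overrides)
    return settings

# Two customers, one product version to maintain.
acme = build_settings({"max_attachments": 25})
globex = build_settings({})
```

The design point is that the delta per customer stays in data that the release process can ignore, so every customer upgrades along the same product roadmap.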
- Setting expectations too high for version 1: Some ISVs set very high expectations from the start, whether for an existing offering or for a brand-new product. Trying to address every need comprehensively in a first version is doomed to fail most of the time.
- Starting without a minimal vision of sellable product scope: At the opposite end of the spectrum, we also see teams that start without a real vision of what minimal scope can be achieved and sold as a product. Pushed by trendy messages about agile methodology, some people think you don't need to know where you are heading before starting your project. In most cases, this does not work well. In our view, agile methodologies work best when used as a delivery method inside a more macroscopic planning process at the vision level.
- Neglecting documentation: Because writing documentation is not much fun, especially for developer profiles, this area is rarely covered at the appropriate quality level. As an ISV, you should never develop a feature that you are not ready to fully document. Isn't it obvious when written that way?
- Underestimating tests: Testing and QA are critical to software success. The area encompasses many dimensions (see our previous post about software testing best practices) and should be allocated the right level of resources and attention.
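Even a very small automated suite raises the baseline considerably. As a minimal illustration using Python's built-in unittest module (the function under test, `apply_discount`, is invented for the example):

```python
import unittest

def apply_discount(price, rate):
    """Hypothetical product function: apply a fractional discount to a price."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(price * (1 - rate), 2)

class DiscountTests(unittest.TestCase):
    def test_nominal_case(self):
        # The everyday path: a 20% discount on 100.0.
        self.assertEqual(apply_discount(100.0, 0.2), 80.0)

    def test_boundary_values(self):
        # Edge rates: no discount and full discount.
        self.assertEqual(apply_discount(100.0, 0.0), 100.0)
        self.assertEqual(apply_discount(100.0, 1.0), 0.0)

    def test_invalid_rate_rejected(self):
        # Bad input should fail loudly, not corrupt a price.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 1.5)
```

Run with `python -m unittest`. Nominal, boundary, and error cases each get a test; that simple discipline catches a surprising share of regressions before customers do.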
- Considering coding as pure low-added-value production: We see some ISVs trying to apply industrial recipes through a simple scheme that says all the value is in the design and the code can be outsourced to low-cost service companies, whether local, near-shore or off-shore. This can be true for some very basic management software that is broad in coverage but very simple in its features. Still, most of the time, you can achieve far better results with one tenth of the investment by using software engineers of the right level and, above all, the right experience.
- Subcontracting what is not internally mastered: I am a big believer in specialization and subcontracting, which can prove very useful when done properly. But there is a golden rule that I learned over time: you cannot subcontract what you do not understand well. You need the ability to steer and control the work of your subcontractors, and you cannot do that if you lose the basic knowledge of the key elements. This applies strongly to every software dimension you might want to subcontract.
- Going for a technological leap without measuring the methodology impacts: I have audited dozens of projects that have fallen into this trap. Despite the platform vendors' pitch that everything gets simpler with new technologies such as .NET or Java, this is simply not true. These platforms offer new technical opportunities and connection capabilities, but they are bigger and more complex than ever. An ISV switching from a traditional CASE-tool-based approach (such as Progress, PowerBuilder or NSDK, to name a few) is more than likely to fail its new projects if it does not measure the impact of the technology change. The methodology needs to be adjusted, as these open platforms allow many implementation possibilities, and finding the right balance on many topics is more than tricky.
In conclusion, we hope that the experience we have shared proves useful to people in the field. These are only our views; we may have missed some points, and we are always eager to learn more, so we welcome any feedback on these elements.