Tips to Avoid Smoke-and-Mirror Software Demonstrations
It is well known that selecting the wrong software package can doom an ERP, CRM, or other software implementation before it gets started. The good news is that plenty of information, tools, and resources are available to aid in the selection process. Nevertheless, a failure to manage the software demonstrations can result in buying a lousy package, even if you do everything else right.
While planning the evaluation is important, the software “demos” are where the rubber meets the road. If not done correctly, the evaluation will cost more, take much longer, and turn into a software beauty contest. This is when bias, subjectivity, and vendor promises (vaporware) taint the team’s final decision.
Formal software demonstrations should begin with the short list of two or three packages. A list of detailed requirements and software demonstration scripts are prerequisites to starting the demos. The following are tips to manage the demos and prevent the vendors from hijacking the evaluation.
- Plan two rounds of software demonstrations (each lasting roughly 4 hours) for each module and vendor. This should be sufficient time to complete the functionality evaluation and tie up loose ends. When going beyond this, the law of diminishing returns quickly sets in.
- Establish a clear agenda for each session (including a timetable for each topic). Work with the vendors when establishing the agenda, but do not let them dictate it. The agenda should be identical for each vendor for a particular module, at least in the first round. In addition, focus first on the important requirements that drive the decision. These include perceived unique requirements and those weighted highest in priority.
- Hold the vendor introductory meeting and basic software navigation first. This initial meeting allows the vendor to get their canned sales pitch out of the way and demonstrate the navigational aspects of the software to the entire evaluation team (instead of redundantly covering these topics in each module demonstration).
- Create an equal forum. For a given round of demonstrations, the meeting place and delivery method (sales team on-site or remote) should be the same for each vendor. This is about leveling the playing field. If one vendor performs a demo on-site and another does the same module remotely, who do you think has the opportunity to leave the best impression? This impression may have little to do with the software. Encourage all vendors to be on-site for the demonstrations.
- Request that each vendor send their A-team (people who really understand the software and how to demonstrate it). If the demonstrator is not familiar with the software, he or she can make a good package appear inadequate. This is not just the vendor's problem, since it becomes yours if the wrong package is selected.
- Educate the team on how to apply the package scoring method. Most methods of evaluating software functionality include a scoring system that quantifies the extent to which the software addresses a particular business need. The team needs to understand how to apply the system in order to properly score what they see in the software and in the vendor's responses to specific questions. Regardless of the scoring method, functionality promised in a future release should get zero credit when the vendor has no design documentation to prove it. Even when the vendor can produce such documentation, never give full credit, since an upgrade is necessary to get the functionality.
- When one vendor uses sample data representative of your business for the demos, make sure the other vendors do the same (or that no vendor uses company data). Otherwise, one vendor gains an unfair advantage. All vendors know that using your company's data in a demo makes their software look "less foreign" and more appealing to the evaluation team. Remember, this has nothing to do with what the software can actually do.
- Make sure all vendors perform demonstrations with a copy of the production version of the software and with technologies typically deployed. We want to avoid demos using software, databases, and other technologies not representative of the actual product. Though not common, some vendors modify screens and programs prior to demonstrations to make the software appear to meet a requirement.
- Require the vendor to demonstrate the software release to be implemented (usually the current release). When a vendor wants to demonstrate a previous release, it could mean the current release is not ready for prime time.
- The software evaluation project manager must be an interpreter. Though the vendor and the team are speaking to each other during the demo, this does not mean they are communicating. Vendors and users come from different worlds, use different terminologies, and the words can get lost in translation. It is the job of the project manager to ensure that communication is actually occurring.
- The evaluation project manager must be a referee. This includes ensuring adherence to the agenda, participation of all evaluation team members, and that the vendor is truly answering the questions and demonstrating the software (not just talking about it).
- Conduct a team follow-up meeting immediately after each demonstration. Allow time at the end of each demonstration for the team to discuss what they learned, score the package, and document follow-up questions when the software capabilities are still unclear. Of course, this segment of the meeting does not include the vendor. When this is not performed immediately following each demo, the team will be hard-pressed to remember what they saw. In addition, two people attending the same demo can walk away with very different perceptions.
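The scoring method described above can be sketched as a simple weighted model. This is a minimal illustration, not a method from the original post: the 0–3 scale, the weights, and the names used here are all hypothetical assumptions, and the cap on future-release functionality reflects the "never give full credit" rule.

```python
# Hypothetical weighted demo-scoring sketch (scale and weights are assumptions).
# Raw scores: 0 = not addressed, 1 = partial, 2 = meets, 3 = exceeds.
# Rules from the tips above:
#   - functionality promised in a future release with no design
#     documentation scores 0 (vaporware);
#   - even documented future functionality never earns full credit,
#     since an upgrade is needed to get it.

def adjusted_score(raw, in_future_release=False, has_design_docs=False):
    """Apply the future-release rules to a raw 0-3 score."""
    if in_future_release:
        if not has_design_docs:
            return 0           # no proof: zero credit
        return min(raw, 2)     # documented but future: cap below full credit
    return raw

def weighted_total(requirements):
    """requirements: list of (weight, raw, in_future_release, has_design_docs)."""
    total = sum(w * adjusted_score(r, f, d) for w, r, f, d in requirements)
    max_total = sum(w * 3 for w, _, _, _ in requirements)
    return total, max_total

# Example scorecard for one vendor (illustrative numbers only):
vendor_a = [
    (5, 3, False, False),  # high-priority requirement, fully demonstrated
    (3, 3, True, True),    # promised next release, design docs shown (capped at 2)
    (2, 2, True, False),   # promised next release, no documentation -> 0
]
total, max_total = weighted_total(vendor_a)
print(f"Vendor A: {total}/{max_total}")  # 21/30
```

Recording scores this way right after each demo (during the team follow-up meeting) also makes it easy to compare vendors on identical requirements rather than on overall impressions.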
[This post originally appeared on IT Toolbox and is republished with permission.]
Looking for more insider tips on choosing software? Browse our entire archive of blog posts written by experts by visiting the selecting and buying software tips section of the Business-Software.com blog.