I want to challenge a vendor on their anticipated testing effort for a proposed software delivery project, which seems way too low in my experience. The project is to migrate a dialer system to the cloud, which includes a lot of custom functionality. What percentage of total capacity do you normally allocate to testing for such a cloud migration (test setup, test execution if not automated, etc.)? I am looking for a figure like "20% of all development effort should be allocated to testing." Any ideas?
I agree with John Fly, results are what matter. If the vendor knows their payments are tied to results, then the onus is on them to test enough to secure those results and the payment.
What we have used for acceptance is 80% success on the test cases, but in terms of execution we require that all functionality be tested as part of the cloud migration. We don't allocate a percentage of resources or time to testing; we just set the number of test cases and the time to perform the full test.
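To make that acceptance criterion concrete, here is a minimal sketch of how such a gate could be checked. The function name, thresholds, and numbers are hypothetical illustrations, not from any specific contract or tool.

```python
# Illustrative sketch of an acceptance gate along the lines described above:
# all defined test cases must be executed, and at least 80% must pass.

def acceptance_gate(passed: int, executed: int, defined: int,
                    pass_threshold: float = 0.80) -> bool:
    """Return True if every defined test case was executed and
    the pass rate meets or exceeds the agreed threshold."""
    full_coverage = executed == defined          # all functionality exercised
    pass_rate = passed / executed if executed else 0.0
    return full_coverage and pass_rate >= pass_threshold

# Example: 250 defined test cases, all executed, 212 passed -> 84.8%, gate passes
print(acceptance_gate(passed=212, executed=250, defined=250))  # True
```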
If the vendor has a % in the contract that is too low, then by all means push for more. I've been on the receiving end of projects w/o enough testing and it's a bad experience all around.
However, I recommend John Fly's results & milestones approach over a stated % of time. At the end of the day it's the results that matter, not how much time a bean counter recorded against the contract. And if progress comes in below the bar, the vendor would have to put in more testing time than you initially expected anyway.
I wouldn't try to frame it as a time split. You could have them do a provable 50/50 dev/test split and still not get the outcomes you are seeking.
My experience being on the vendor and purchasing side of efforts like this has me preferring a milestone payment schedule.
If you get the results you're after, it shouldn't matter what the % each focus area received in their building process.
If you are after a successful implementation, ensure that the project, SOW, milestones, and possibly payment are tied to a functional product.
If you need robust tests to continue to use after delivery, then ensure that is a stated outcome.
Cloud migration projects are high-risk IT projects, so testing and quality gates are required at each stage of the migration program and each stage of software development. A test strategy and test plans should be created at the beginning of the project; they will indicate the percentage of effort required from testing resources. Each application module needs to be tested in an isolated environment, and migration dry runs and tabletop exercises should be conducted on top of regular testing cycles, integrated testing, and UAT. Penetration testing is also required from a security perspective. These are my two cents.
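As a rough illustration of the stage-by-stage quality gates described above, here is a small sketch. The stage names, required activities, and the gate_passed helper are hypothetical examples, not a standard or any particular vendor's process.

```python
# Sketch: each migration stage has a gate that passes only when all of its
# required testing activities are complete.

MIGRATION_GATES = {
    "module isolation testing": ["functional tests in isolated environment"],
    "migration rehearsal":      ["dry run", "tabletop exercise"],
    "integration":              ["end-to-end integrated testing"],
    "acceptance":               ["UAT sign-off"],
    "security":                 ["penetration test report reviewed"],
}

def gate_passed(stage: str, completed: set) -> bool:
    """A stage gate passes only when every required activity is done."""
    return all(activity in completed for activity in MIGRATION_GATES[stage])

# Example: the rehearsal gate fails until both the dry run and tabletop are done
print(gate_passed("migration rehearsal", {"dry run"}))                       # False
print(gate_passed("migration rehearsal", {"dry run", "tabletop exercise"}))  # True
```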