Building unit tests is (or should be) part of "Build code", or should even precede it (as in XP). However, it is good to show that you spend separate resources on it. If you ask how much time it takes, the answer is: it depends on:
1) What code coverage you want to achieve
2) How complicated the code is
P.S. Actually, this question is related to how many test engineers you need for X developers. The answer is somewhere between X and X/10... quite a wide range, isn't it? The same applies to unit-testing resources.
?: the art of a constructive conflict perceived as a destructive diagnosis.
I agree with Ainars that unit testing is rightly part of "build code." This includes those rare cases where unit definitions (module specs) are well-defined.
Only people with source code level knowledge of a unit can have any reasonable hope of testing at the unit level. (Even then their vision is constrained by their perspective of the software and the inaccuracies inherent in requirements specifications.)
Coders have knowledge of *how* their modules satisfy the requirements. Non-coders have an idea of *what* the unit should *do* and can test only its external behaviors. (The exception is the combination of test-first programming and pair-programming as Ainars suggests. In that XP case, the tester knows the internals - and the developer knows the tests.)
You are right, Steve, to include some validation of the modules but to list it as "Integration Testing" in your project plans. You are interested in the external behaviors of the modules (e.g. how they work with other modules and with drivers and stubs).
Integration testing is often skipped except for the final part - system-level integration - which is a weak business decision. Budgeting for integration testing is a wise choice.
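To make the driver-and-stub idea concrete, here is a minimal sketch in Python. All names are hypothetical, not from any poster's project: a module's `convert` function depends on an external rate service, a *stub* stands in for that service, and a *driver* (the test function) exercises the module only through its external interface.

```python
# Hypothetical module under test: converts an amount using an
# exchange-rate service it does not own.
def convert(amount, fetch_rate):
    """Return `amount` converted with the rate supplied by `fetch_rate`."""
    rate = fetch_rate()
    if rate <= 0:
        raise ValueError("rate must be positive")
    return round(amount * rate, 2)

# Stub: stands in for the real rate service, so the test exercises
# only the module's external behavior, not the service itself.
def stub_rate():
    return 1.25

# Driver: test code that calls the module through its public interface.
def test_convert_uses_supplied_rate():
    assert convert(100, stub_rate) == 125.0

test_convert_uses_supplied_rate()
print("ok")
```

This is the external-behavior view described above: the driver never reaches into `convert`'s internals, and the stub removes the dependency on any real collaborating module.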
Now to answer your question of how much time to allocate for that testing: there are several factors, including the project goals for quality, schedule and resources. But, in my opinion, the most important determinant is the quality of the modules themselves. If they are clean, they need less testing. If they are target-rich, you may spend an inordinate amount of time in test-and-fix cycles.
The history of the development group is your best indicator of expected quality. You can also get a good indication of the expected quality from first-piece inspections (scrutinizing the early deliveries and extrapolating from those results).
If you have no historical data, you can expect to spend roughly as much time testing as you did in development (i.e. 50% for testing and 50% for Analysis, Design & Build). Spend more of the testing budget on up-front integration testing and less on system-level testing. The pre-release profile might look like this:
25% design & build
35% integration test
15% system test
If you expect to spend less than 50% of your project budget on testing, then budget for considerable fire-fighting, damage control, and lost opportunities after the system is deployed.
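As a quick sanity check on those ratios, here is a hedged sketch that turns a total budget in person-days into the 25/35/15 pre-release profile quoted above. Treating the remaining 25% as analysis plus post-release contingency is my assumption, not part of the original advice.

```python
# Split a total project budget (person-days) using the 25/35/15
# pre-release profile; the leftover 25% is parked as analysis /
# contingency (an assumption made for this sketch).
def split_budget(total_days):
    profile = {
        "design & build": 0.25,
        "integration test": 0.35,
        "system test": 0.15,
    }
    plan = {phase: round(total_days * share, 1)
            for phase, share in profile.items()}
    plan["unallocated (analysis / contingency)"] = round(
        total_days - sum(plan.values()), 1)
    return plan

print(split_budget(200))
# 200 person-days -> 50.0 build, 70.0 integration test, 30.0 system test
```

Note how integration testing gets the largest single share, matching the advice to front-load the testing budget.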
The relentless pursuit of perfection keeps things from getting done. Strive for excellence, not perfection.