6 ways your proposal can fail – and how to avoid them

I received a call from a mid-sized business that had submitted a proposal for IT services and had just learned their proposal did not make the competitive range. They were irate and wanted to protest, alleging that the government had not fairly evaluated their proposal.

They had hired a proposal consultant, spent lots of money developing their proposal, and were assured their proposal was professionally done. Before filing the protest, the company asked me to review their proposal. Here’s what I found when I did the review and what I told them.

Professionally developed proposals always have the same characteristics — they are compliant, responsive, compelling, and customer focused. They present a solution that is easy to evaluate and score well — and they are aesthetically attractive. I used each of these criteria while reviewing this company’s submission.

Compliant

The proposal’s structure is expected to follow the request for proposal’s (RFP) instructions (section L of this RFP) and also track with the evaluation criteria (section M).

Initially, this proposal followed section L, but then it departed and added sections not called for in sections L or M. It then skipped required section L topics. Finally, some evaluation criteria were never addressed in the proposal. The easiest way to lose points during an evaluation is to not follow the instructions or not address the evaluation criteria. Simply put, this proposal was non-compliant.

Responsive

The content of each proposal section must respond precisely to each topic prescribed in the RFP. The section headings should track to the RFP instructions, and the associated discussions should be consistent with the section headings. When proposal text fails to address the section’s heading, the section is a non sequitur; that is, an applicable response does not follow the section title.

Much of this proposal’s section text seemed to have been lifted from other proposals and pasted in. The responses were close, but not close enough. To the non-practitioner, much proposal text sounds alike. After all, if the RFP asks for a QA Plan and we give them a Configuration Management Plan, who would know the difference? This proposal team did just that. I scored some of the sections a zero because they failed the responsiveness test.

Compelling

This is a proposal term that describes how convincing or persuasive the proposal is. In government procurements, we expect the proposal to meet the solicitation requirements fully and exceed those requirements, where practical, in a way that is beneficial to the customer. There should be many features in the proposal that demonstrate a high likelihood of contract success or that exceed solicitation requirements. Assertions about company performance and claims about solution features should be substantiated by real evidence, not boastful rhetoric. Features with relevant and substantiated benefits, presented persuasively, provide the basis for selecting one bidder’s proposal over another.

In this proposal, as I read through 200 pages of humdrum technical prose, I found that features were few and benefits even fewer. There was no basis for differentiation and no compelling basis for selection. This was not the way to write a proposal.

Customer focused

Proposals are customer focused, and marketing brochures are company focused. A customer-focused proposal discusses how your company proposes to do the work and the benefits the customer will receive from your performance. If the proposal just brags about how good the company is and how outstanding its processes are, then the proposal is company-focused at best. Company-focused proposals cause evaluators to lose interest, whereas customer-focused proposals hold evaluators’ interest and score higher.

Slogging through 200 pages about how good this company is does not substitute for a cogent explanation of what the company planned to do and how it was going to do it. If I had read one more time that their processes were “best of breed” or “world class,” I think I would have just closed the book and quit reading.

Easy to evaluate

Evaluators generally start their review with the proposal evaluation criteria in section M of the RFP. They build an evaluation checklist, and then go looking through the proposal to find information that addresses the topics in the evaluation checklist. They search for only what they need to find to evaluate the proposal and write up their evaluation results. Call-out boxes, pull quotes, feature/benefit tables, section headings, and other techniques help draw the evaluator’s attention to the appropriate information. Every evaluator will tell you that if they can’t find it, they can’t score it. Professional proposals are structured so the key evaluation points are extremely easy to find and evaluate.

As you might expect, in this proposal, key evaluation points were missing or not readily found.

Aesthetically attractive

Proposals should be attractive and easy to read. They should have a consistent document style, an appropriate color palette, a paragraph labeling and numbering scheme traceable to the RFP, and an appropriate mix of text and supporting graphics. Single-column text is fine with half-page or quarter-page-size graphics positioned consistently on the page. Graphics should convey the intended message with the appropriate level of detail.

The proposal was attractive, and if you didn’t read the content, it looked like it would score pretty well. I gave them high marks for attractiveness and accolades to the desktop publishing team.

At the end of my review, I told the company executives to save their protest money. In this case, the government did them a favor by eliminating their proposal from the competitive range. This proposal was not professionally done, even though they thought it was, and it had no chance of winning. After the review, they agreed not to protest and resolved to do better next time.

About the Author

Bob Lohfeld is the chief executive officer of Lohfeld Consulting Group. E-mail your comments to RLohfeld@LohfeldConsulting.com.

This article was originally published October 5, 2011 in WashingtonTechnology.com.