Q&A – Doing more with less – and winning more! Part 2

The typical company spends, on average, 10% of its revenue target on Bids and Proposals (B&P). Risks that increase the B&P budget include poor bid decisions, an immature solution, insufficient training and tools, large review teams producing comments that are not actionable, and lack of executive support. With constrained budgets and increased competition for smaller work share, contractors cannot afford to waste B&P dollars.

Lohfeld Consulting Group’s Principal Consultants Brenda Crist, CPF APMP, and Lisa Pafe, CPP APMP, recently presented an interactive webinar, Doing more with less – and winning more, which highlighted how to increase productivity and win rates.

Click to view the webinar replay and download the presentation.

Here are some questions submitted by webinar participants and the answers Lisa and Brenda provided.

Q:  Will you please describe techniques or tools you have found successful to get color review teams to score rather than read?

A:  We recommend developing a scoring template that can be tailored to each bid. Our best practice is to incorporate Lohfeld Consulting Group’s seven quality measures into the score sheet. Assign review team members to review and score the proposal against specific measures. For example, one review team member might review and score for compliance and another for visual appeal. Base the scoring both on the RFP’s evaluation methodology and on a simple four-color scale for more qualitative areas such as visual appeal.
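For teams that track color-review scores in a spreadsheet or a simple script, here is one way such a tailorable score sheet could be structured. This is a minimal sketch under assumptions: only compliance and visual appeal come from the answer above, and the reviewer names, point values, and color scale mapping are placeholders, not Lohfeld Consulting Group’s actual seven quality measures or scoring method.

```python
# Illustrative sketch of a tailorable color-review score sheet.
# Only "compliance" and "visual appeal" come from the answer above; the
# point values and reviewer names are assumptions for demonstration only.
from dataclasses import dataclass, field

# Hypothetical point values for a simple four-color scale.
COLOR_SCALE = {"blue": 4, "green": 3, "yellow": 2, "red": 1}


@dataclass
class MeasureScore:
    measure: str                  # quality measure, e.g., "compliance"
    reviewer: str                 # team member assigned to score this measure
    color: str = ""               # blue / green / yellow / red
    comments: list[str] = field(default_factory=list)  # actionable comments only

    @property
    def points(self) -> int:
        return COLOR_SCALE.get(self.color, 0)


def build_score_sheet(assignments: dict[str, str]) -> list[MeasureScore]:
    """Create one score-sheet row per quality measure, tailored to this bid."""
    return [MeasureScore(measure=m, reviewer=r) for m, r in assignments.items()]


if __name__ == "__main__":
    sheet = build_score_sheet({
        "compliance": "Reviewer A",     # scored against the RFP's evaluation methodology
        "visual appeal": "Reviewer B",  # scored on the qualitative four-color scale
    })
    sheet[1].color = "yellow"
    sheet[1].comments.append("Replace the dense text on page 12 with a call-out box.")
    for row in sheet:
        print(f"{row.measure:<15} {row.reviewer:<12} {row.color or 'unscored':<10} {row.points} pts")
```

A structure like this keeps each reviewer focused on scoring their assigned measures and producing actionable comments, rather than reading the proposal end to end.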


Q:  B&P and IR&D share the same indirect cost account as far as DCAA is concerned. How do your B&P metrics apply when trying to create budget differentiation between these two types of indirect costs?

A:  Technical effort expended in developing and preparing technical data specifically to support bid or proposal activities is considered B&P, rather than IR&D. Administrative costs related to B&P activities are also covered under the B&P definition. B&P does not include the costs of efforts sponsored by a grant or cooperative agreement or required in the performance of a contract.


Q:  Do you have any suggestions for where to obtain a good example for conducting a proposal debrief or an internal “lessons learned”?

A:  Develop a simple template for internal lessons learned based on people, processes, and technology/tools. How did the team perform (writers, reviewers, etc.), and who needs additional training? Did our processes work well, or do we need to change them? Did our tools, templates, and technology (collaborative software, online meeting tools) perform to expectations?

For each area of weakness, develop a plan to get better. For each area of strength, figure out how to leverage high performers. Also, make sure to hold the lessons-learned session very soon after the proposal submission and communicate results to all stakeholders.
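As a rough illustration of that people/processes/technology-and-tools breakdown, the sketch below shows one hypothetical way to capture lessons learned in a structured form. The three category names follow the answer above; the fields and sample entries are assumptions, not a prescribed template.

```python
# Illustrative lessons-learned template; the three categories mirror the
# people / processes / technology-and-tools breakdown above, while the
# fields and sample entries are assumptions for demonstration only.
from dataclasses import dataclass, field


@dataclass
class LessonArea:
    category: str                                              # "people", "processes", or "technology/tools"
    strengths: list[str] = field(default_factory=list)         # what to leverage next time
    weaknesses: list[str] = field(default_factory=list)        # what fell short
    improvement_plan: list[str] = field(default_factory=list)  # one action per weakness


def new_template() -> list[LessonArea]:
    """Return an empty lessons-learned template with the three categories."""
    return [LessonArea(c) for c in ("people", "processes", "technology/tools")]


if __name__ == "__main__":
    template = new_template()
    people = template[0]
    people.weaknesses.append("Two writers needed heavy rewrites from the volume lead.")
    people.improvement_plan.append("Schedule proposal-writing training before the next bid.")
    people.strengths.append("Graphics lead turned action captions around quickly.")
    for area in template:
        print(f"== {area.category} ==")
        print("  Strengths:       ", area.strengths)
        print("  Weaknesses:      ", area.weaknesses)
        print("  Improvement plan:", area.improvement_plan)
```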


Q:  What are the best methods to obtain accurate competitive analysis?

A:  Become a student of your marketplace and your competition; over time, you will build competitive analysis profiles.


Q:  How do you know what your competitors are bidding?

A:  You don’t. You can only guesstimate based on the best information available.


Q:  How do government reviewers score without reading?

A:  Government evaluators score rather than read because they have limited time and many proposals to review. Because they must complete score sheets, they tend to skim the proposal, looking primarily for the information that helps them fill in those sheets. That is why it is so important to make it easy for evaluators: provide crosswalks and cross-references to the evaluation factors, and maintain good visual appeal by leveraging tables, graphics, call-out boxes, action captions, and bullets while avoiding dense text.