How bad are RFPs? Plenty

An analysis of five solicitations for contracts worth a total of $7 billion shows just how poor the writing that plagues agency RFPs can be.

It is no secret that solicitations can be hard to read. The language is technical and often convoluted, which can lead to confusion and misunderstandings, not to mention delays and inefficiencies.

But just how bad is it? Well, maybe even worse than many of us thought, according to a report that was released this summer.

VisibleThread, a company with software products that use algorithms to analyze the language of documents and websites for changes and clarity, turned one of its tools loose on five solicitations for contracts worth a total of $7 billion.

The company’s primary business is helping contractors track changes in solicitations. Its algorithms don’t just identify changes in wording; they also highlight changes in context and substance.

A second product crawls websites and analyzes their language for clarity.

It was this website tool that VisibleThread used to analyze the solicitations for GSA’s Human Capital and Training Solutions (HCaTS), HHS Unified Program Integrity Contractor (UPIC), HHS Research, Measurement, Assessment, Design and Analysis (RMADA), the Air Force Joint Range Technical Services II (J-Tech II), and Navy Fielded Training Systems Support IV (FTSS-IV).

Company CEO Fergal McGovern said that the RFPs were evaluated for readability, passive language, long sentences and word complexity density.

The scores were then compared using the Flesch-Kincaid tests, which the Navy developed in the 1970s to improve training manuals and other technical documents.
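The report doesn’t spell out the formula, but the standard Flesch Reading Ease calculation is simple enough to sketch. Below is a minimal Python illustration; the syllable counter is a rough heuristic, the sample sentence is invented rather than drawn from any of the five solicitations, and none of this reflects how VisibleThread’s own tool works.

```python
import re

def count_syllables(word):
    # Rough heuristic: count vowel groups; production tools use dictionaries.
    groups = re.findall(r"[aeiouy]+", word.lower())
    count = len(groups)
    if word.lower().endswith("e") and count > 1:
        count -= 1  # treat a trailing 'e' as silent
    return max(count, 1)

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch Reading Ease formula; higher scores are easier to read.
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Invented example of solicitation-style prose, for illustration only.
sample = ("The contractor shall be responsible for the provision of all services "
          "necessary to ensure the accomplishment of the objectives described herein.")
print(round(flesch_reading_ease(sample), 1))
```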

Each of the four parameters had a target for what would be considered good (a rough sketch of how such checks might be run appears after the list):

  • Readability – a score of 50, which is about the 8th grade reading level.
  • Passive language – 4 percent or less of the sentences have passive construction.
  • Long sentences – 5 percent or less have 25 or more words.
  • Word complexity density – This scan looks for complex words and phrases based on the plain language guidelines the federal government has established. A score of 100 is the ideal.
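To make those targets concrete, here is a rough Python sketch of how the passive-voice and long-sentence checks might be run against a single section. The regular expression is a crude approximation of passive construction, the file name is hypothetical, and the readability and word-complexity checks are omitted because the report doesn’t define them in reproducible detail.

```python
import re

# Targets from the list above: passive voice in no more than 4 percent of
# sentences, and no more than 5 percent of sentences running 25 words or longer.
# This passive-voice pattern is a crude stand-in, not VisibleThread's method.
PASSIVE_HINT = re.compile(r"\b(is|are|was|were|be|been|being)\s+\w+(ed|en)\b", re.I)

def sentence_stats(text):
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    n = len(sentences)
    passive = sum(1 for s in sentences if PASSIVE_HINT.search(s))
    long_ones = sum(1 for s in sentences if len(s.split()) >= 25)
    return {"passive_pct": 100.0 * passive / n, "long_pct": 100.0 * long_ones / n}

# "section_l.txt" is a hypothetical file holding one section of a solicitation.
stats = sentence_stats(open("section_l.txt", encoding="utf-8").read())
print("Passive-voice target met:", stats["passive_pct"] <= 4.0)
print("Long-sentence target met:", stats["long_pct"] <= 5.0)
```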

VisibleThread didn’t look at entire solicitations but focused its analysis on the statement of work, Section L (the instructions) and Section M (the evaluation criteria).

These areas are often the ones that cause the most confusion for contractors and result in plenty of back and forth between agency and bidders, McGovern said. After award, these sections are also often cited in bid protests and lead to delivery issues.

The result of the analysis shows plenty of room for improvement:

  • Readability scored 32.9, four grade levels higher than recommended for clear writing.
  • Passive voice was present in 14 percent of the sentences, more than three times higher than recommended.
  • Twenty percent of sentences exceeded recommended length.
  • The average complexity score was 3.67, suggesting opportunities to simplify word choices across the documents, according to the report.

Of the three parts of the RFPs that VisibleThread evaluated, Section M (the evaluation criteria) scored the worst. The statements of work were poor performers as well.

One of the key takeaways, according to VisibleThread’s report, is that the quality of solicitations can vary widely from one section to the next, which McGovern said points to different people writing the different sections with no clear authority to make them more consistent.

The analysis wasn’t just an intellectual exercise for McGovern and VisibleThread. He said he’s trying to call attention to poorly written solicitations by giving concrete data on where improvement is needed.

If a solicitation is difficult to understand, it increases the cost for bidders and the government during the procurement process, McGovern said.

Confusion during the bidding process causes delays, which drive up costs. Then there is the contractor that bids, wins and only later realizes what the government needs is different from what was in the solicitation. Again, there are more delays and cost overruns, McGovern said.

Improved writing will pay dividends for the government, he said. “The better the RFP, the likelihood of success gets better down the road.”

A lot of the findings in the report point to good basic writing and communications practices:

  • Active voice is better because it is clearer and more direct.
  • Shorter sentences are clearer and easier to understand.
  • Word choice can improve readability and clarity.

The government has tried to address these issues as recently as 2010, when the Plain Writing Act was passed, but the law has no real teeth, McGovern said.

He compared the current practice of developing RFPs to writing and releasing code without testing it.

“It should be a very simple step,” he said.

Just think of the pain, heartache and wasted resources that could be avoided.