How the winners won SEWP V
If you’ve been following NASA’s $20 billion Solutions for Enterprise-Wide Procurement (SEWP V), you might be interested in knowing how the bidders won their awards. Like many government-wide contracts, this one started out as a highly competitive procurement, drawing more than 200 proposals from companies competing to win a coveted award on the 10-year (5-year base plus one 5-year option) IDIQ contract.
When awards were first announced, disgruntled bidders filed protests with the Government Accountability Office (GAO), and NASA voluntarily agreed to reevaluate the proposals. After reevaluation, the agency made 202 contract awards, and every bidder who submitted an acceptable proposal was given a contract award. With no bidders left to protest, SEWP V moved forward as the contract of choice for many government organizations.
Here are the details of what happened, and you can draw your own conclusions about the surprise ending to the evaluation process.
SEWP procurement description
SEWP is one of the main contracts used in the government market to buy IT products, services, and solutions. The procurement is in its fifth iteration and has become hugely popular across the federal sector.
The SEWP V RFP provided for five competition groups. Two groups, Computer Systems/Servers (Category A, Group A) and Networking/Security/Video and Conferencing tools (Category B, Group D), were competed in full-and-open competitions. Mass Storage Devices (Category B, Group B) was competed as two separate set-asides—one for Service Disabled Veteran-Owned Small Businesses (SDVOSB) and the other for Historically Underutilized Business Zone (HUBZone) Small Businesses. Server Support Devices/Multifunctional Devices (Category B, Group C) was competed as a small business set-aside.
Computer Systems/Servers is probably the largest dollar-value category, so I’ve chosen to focus on that competition for this article, but the story and the outcomes are essentially the same for the awards in all five groups.
Evaluation process
Proposals were evaluated using three evaluation factors:
- Management/Technical Approach
- Price
- Past Performance
Management/Technical Approach and Price were weighted approximately equally, and Past Performance was evaluated on an acceptable/unacceptable basis.
Evaluation of the Management/Technical Approach was based on three subfactors:
- Excellence of Proposed Systems (Subfactor A)
- Offeror’s Support and Commitment (Subfactor B)
- Management Plan (Subfactor C)
Each subfactor was evaluated and assigned an adjectival rating using Excellent, Very Good, Good, Fair, and Poor. Within each subfactor, evaluators identified Significant Strengths, Strengths, Significant Weaknesses, Weaknesses, and Deficiencies.
Past Performance acceptability was based on an offeror’s relevant, 3-year past performance history. An offeror with relevant and acceptable past performance would be given an Acceptable score. An offeror with no relevant past performance would be viewed as having an indeterminate past performance record and would also receive an Acceptable score. Only if an offeror had a significant adverse performance history within the past 3 years would it be scored as Unacceptable.
Price was evaluated by comparing each offeror’s prices with the prices of other offerors to ensure that the offeror’s prices were fair and reasonable.
Proposal evaluation was conducted on a best-value basis by trading off the relative merits of each offeror’s Management/Technical Approach score against its price.
Evaluation results
NASA received 36 acceptable proposals for Computer Systems/Servers (Category A, Group A). The Source Evaluation Board (SEB) comprised members with expertise in the disciplines relevant to the procurement. No proposals in this group were rated Unacceptable.
The board reviewed each proposal and reached consensus on the findings, rated and scored each subfactor, and applied established numerical weightings to determine an overall Management/Technical Approach score for each proposal.
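To make the scoring mechanics concrete, here is a minimal sketch of how adjectival subfactor ratings could be rolled up into an overall Management/Technical Approach score. The rating-to-point mapping and the subfactor weights shown are hypothetical assumptions for illustration only; NASA did not publish its actual numerical weightings.

```python
# Hypothetical illustration of rolling adjectival subfactor ratings up
# into an overall Management/Technical Approach score. The point values
# and weights below are assumptions for illustration only -- NASA did
# not disclose its actual numerical weightings.

RATING_POINTS = {"Excellent": 5, "Very Good": 4, "Good": 3, "Fair": 2, "Poor": 1}

# Assumed subfactor weights (sum to 1.0); the real weights were not published.
SUBFACTOR_WEIGHTS = {
    "Excellence of Proposed Systems": 0.40,    # Subfactor A
    "Offeror's Support and Commitment": 0.35,  # Subfactor B
    "Management Plan": 0.25,                   # Subfactor C
}

def overall_score(ratings: dict) -> float:
    """Weighted roll-up of one offeror's adjectival subfactor ratings."""
    return sum(
        SUBFACTOR_WEIGHTS[subfactor] * RATING_POINTS[rating]
        for subfactor, rating in ratings.items()
    )

# Example: an offeror rated Very Good / Good / Good scores 3.4 on a 1-5 scale.
print(overall_score({
    "Excellence of Proposed Systems": "Very Good",
    "Offeror's Support and Commitment": "Good",
    "Management Plan": "Good",
}))
```

A roll-up of this kind also shows how a single Fair rating in one subfactor can be offset by a Very Good rating in another, as happened for two bidders in the highly rated tier described below.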
The top 10 scoring proposals, in order of evaluation score, were Unisys, General Dynamics, Sterling Computer Corp., DLT Solutions, Dynamic Systems, IBM, Dell, HP, CDWG, and Merlin International, with each of these offerors receiving no less than a Good rating for every subfactor.
Price evaluation compared each bidder’s prices against those of the other bidders, and bidders were grouped into six pricing brackets based on their proposed prices: one proposal was priced substantially higher than the others, five were high, five were moderately high, 11 were moderate, five were moderately low, and six were low.
All offerors were found to have Acceptable Past Performance.
Selecting the winners
For evaluation purposes, offerors were grouped into three tiers based on their Management/Technical Approach scores.
The highly rated tier included 24 bidders, nearly all of which received a Good or better adjectival rating across all three subfactors and had no Significant Weaknesses. Two bidders received a Fair rating for one of the evaluation subfactors, but this lower rating was offset by a Very Good rating in another subfactor.
The next lower rated tier had three offerors. Ratings for the three evaluation subfactors in this tier ranged from Fair to Good, and each offeror had a Significant Weakness or several Weaknesses that reflected appreciably increased performance risk to the government.
The remaining offerors were put into the lowest rated tier with Fair or lower evaluation scores for two or more subfactors.
When the prices of the offerors in the highest rated tier were compared, one offeror’s price was very high relative to the other bidders’, and its Management/Technical score was relatively low within the tier. Because this very high price was not offset by the performance benefits offered in its proposal, the offeror was removed from further consideration for award. The remaining proposals in the highly rated group offered competitive prices, and after trading off their prices against their Management/Technical scores, NASA determined that these 23 companies represented best value and selected them for contract award.
The three firms in the second highest tier were eliminated from further consideration because their lower Management/Technical scores, when weighed against their prices, did not represent best value.
In the lowest tier, none of the companies offered a price low enough to offset its weaker Management/Technical score and constitute best value, and these companies were eliminated from further consideration.
Based on the evaluation results and best-value trade-offs, awards were made to 23 of the 36 bidders.
Protest and reevaluation
Multiple losing bidders filed protests with GAO, alleging that their proposals had been unfairly evaluated. In response, NASA voluntarily withdrew its initial awards; reevaluated the Management/Technical and Price portions of the proposals from all unsuccessful offerors; updated its evaluations, ratings, and scores; and revised its price analysis.
As a result, offerors received fewer negative findings and, in turn, improved ratings and scores. For example, the three companies originally placed in the second tier saw their scores revised upward: Significant Weaknesses were eliminated, and some subfactor ratings rose from Fair to Good.
Final awards
Before making the final awards, NASA noted that the number of contractors under the SEWP IV procurement was no longer sufficient to provide ample competition and responsiveness to the current volume of orders.
Although a greater number of contractors would increase the administrative burden, the benefits of increased competition outweighed the program management detriment of added contract oversight.
Based on this, NASA decided to increase the number of SEWP V contracts to accommodate the expected increases in the volume, size, and complexity of orders.
Additionally, NASA noted that the Significant Weaknesses assessed against the lower rated offerors stemmed from inadequately detailed information in their proposals. In each instance, it concluded that an offeror’s failure to provide sufficient detail about its approach, while a concern, posed less performance risk than an unsuitable or otherwise problematic Management/Technical approach. While the Significant Weaknesses had been properly assessed, these offerors were judged viable as prospective SEWP V contract holders. Even though their proposals lacked adequate detail in certain areas, including these offerors in the SEWP V contract would increase the overall benefit to the government by expanding the field of competition in the resulting task order process.
All offerors’ prices were determined to be fair and reasonable, and the differences among prices were found to be primarily due to different product offerings proposed to meet the solicitation requirements.
In the final analysis, the importance of the Significant Weaknesses in each offeror’s proposal was minimized, and subfactor ratings were revised upward. Each offeror received a Fair or higher rating across all Management/Technical subfactors.
Every offeror also carried an Acceptable Past Performance rating.
As a result, awards were made to all 36 offerors.
Final thoughts
It is tempting to editorialize about what occurred during the reevaluation of each previously unsuccessful offeror’s proposal.
Did scores creep up because the evaluators had a change of heart and were less critical of past inadequacies, or did the concern over future protests and the probability of continued delays in awarding the SEWP V contracts cause the best-value trade-offs to skew in favor of more awards?
I’ll leave that to you to decide, and as always, I welcome your comments and insights.
Email your comments to me at RLohfeld@LohfeldConsulting.com.
by Bob Lohfeld
This article was originally published May 5, 2015 in WashingtonTechnology.com.