FRONTEC CORPORATION

Determinations

Ottawa, Wednesday, May 6, 1998

File No.: PR-97-035

IN THE MATTER OF a complaint filed by Frontec Corporation under subsection 30.11(1) of the Canadian International Trade Tribunal Act, R.S.C. 1985, c. 47 (4th Supp.), as amended;

AND IN THE MATTER OF a decision to conduct an inquiry into the complaint under subsection 30.13(1) of the Canadian International Trade Tribunal Act.

DETERMINATION OF THE TRIBUNAL

Pursuant to section 30.14 of the Canadian International Trade Tribunal Act, the Canadian International Trade Tribunal determines that the complaint is not valid.

Arthur B. Trudeau
_________________________
Arthur B. Trudeau
Member


Michel P. Granger
_________________________
Michel P. Granger
Secretary






Date of Determination: May 6, 1998

Tribunal Member: Arthur B. Trudeau

Investigation Manager: Randolph W. Heggart

Counsel for the Tribunal: Heather A. Grant

Complainant: Frontec Corporation

Intervener: Serco Facilities Management Inc.

Counsel for the Intervener: Ronald D. Lunau

Government Institution: Department of Public Works and Government Services

FINDINGS OF THE TRIBUNAL

INTRODUCTION

On December 22, 1997, Frontec Corporation (Frontec) filed a complaint with the Canadian International Trade Tribunal (the Tribunal) under subsection 30.11(1) of the Canadian International Trade Tribunal Act [1] (the CITT Act) concerning the procurement (Solicitation No. W0117-6-M135/E) by the Department of Public Works and Government Services (the Department), on behalf of the Department of National Defence (DND), of operation and maintenance services for the 5-Wing Goose Bay military airfield, Newfoundland.

Frontec alleged that, contrary to the provisions of the Agreement on Internal Trade [2] (AIT), its proposal was unfairly and improperly excluded from the subject solicitation for various possible reasons: discrimination or an unfair, improper or inconsistent evaluation by the Evaluation Team, which consisted of the Technical Evaluation Team and the Financial Evaluation Team. [3] Frontec contended that the evaluation of its proposal was flawed as a result of unclear evaluation factors, the inconsistent application of evaluation criteria or the negligent or deliberate misunderstanding of its proposal by the Evaluation Team. Specifically, it alleged that its proposal should have been evaluated “best overall,” based on its past experience, that it would have been the least expensive, that the scores for its technical proposal, after the reassessment of the point-rated criteria, were mathematically impossible, that the evaluation methodology described in the Request for Proposal (RFP) was not followed or was flawed in design and, lastly, that its financial proposal was opened prematurely, thereby tainting the evaluation process.

Considering that a contract had already been awarded to Serco Facilities Management Inc. (Serco), Frontec requested, as a remedy, the payment of its costs for preparing its proposal, which were estimated at $412,000, for preparing and proceeding with this complaint as well as a sum equivalent to its lost profits for the contract estimated at $9,385,541, exclusive of performance incentive fees.

On December 29, 1997, the Tribunal determined that the conditions for inquiry set out in section 7 of the Canadian International Trade Tribunal Procurement Inquiry Regulations [4] (the Regulations) had been met in respect of the complaint and, pursuant to section 30.13 of the CITT Act, decided to conduct an inquiry into the complaint. On January 27, 1998, the Tribunal granted Serco leave to intervene in this matter. On February 24, 1998, the Department filed a Government Institution Report (GIR) with the Tribunal in accordance with rule 103 of the Canadian International Trade Tribunal Rules. [5] On March 19, 1998, Frontec filed its comments on the GIR with the Tribunal. On March 31, 1998, the Tribunal requested the Department, in writing, to respond to Frontec’s comments on the GIR and to answer specific questions formulated by the Tribunal. On April 9, 1998, the Department responded to the Tribunal’s request, and on April 23, 1998, Frontec submitted comments in reply.

Given that there was sufficient information on the record to determine the validity of the complaint, the Tribunal decided that a hearing was not required and disposed of the complaint on the basis of the information on the record.

PROCUREMENT PROCESS

On April 26, 1997, a competitive RFP was advertised on the Government Electronic Tendering Service (MERX) and in Government Business Opportunities for the management and delivery of non-core services in support of the Allied low-level flying training program, civil aviation and other third party users at 5-Wing Goose Bay military airfield, for a period of five years, with an option to extend the contract for two additional one-year periods.

The RFP identifies that an in-house proposal [6] will be presented by government personnel and indicates, at article 2.0 of section II, that an evaluation methodology called Tabular Format will be used to evaluate all proposals.

The RFP reads, in part, as follows:

SECTION II: PROPOSAL PREPARATION INSTRUCTIONS

2.0 TABULAR FORMAT

This procurement will be conducted using a tabular format methodology. Instructions regarding the use of tabular format spreadsheets entitled Specific Resource Allocation List (SRAL) and General Resource Allocation List (GRAL), specifically designed for this project, will be provided at the bidders’ conference. Additional details are available in Annex C.

ANNEX C - PRESENTATION OF PROPOSALS

D. Section 3 (and subsequent Sections) of the Statement of Work [SOW]

Based on paragraphs C through G of the SOW for each Section, the bidder should:

b) Complete the Specific Resource Allocation List (SRAL) for each requirement C to G. For each requirement (i.e. SOW line item number), the proposed number of annual direct labour hours[ [7] ] to be used should be stated. These values will be compared to the bidder’s written description of each requirement to determine the technical feasibility of the proposal. The proposed resources for each line item of the SOW do not have to equal the financial amount stated in the price schedule (since the schedule can include overhead, profit and all other administrative costs). [Emphasis added]

A mandatory bidders’ conference was held from May 12 to 16, 1997, at the 5-Wing Goose Bay military airfield. According to the Department, 14 interested parties, as well as an in-house team, attended the conference. Instructions on the Tabular Format evaluation methodology were provided during the conference, and firms received software packages containing the SRAL/GRAL spreadsheets to be used in submitting their proposals. In addition, detailed minutes of the bidders’ conference were produced and distributed to all participants. Because the procurement process involved an in-house proposal, two independent observers, a senior policy analyst and an auditor, both from the City of Ottawa, were asked to assess the proposal evaluation process. The observers agreed to provide their services at no cost to the Crown or the bidders. The role of the observers was to verify that the process used in the evaluation of proposals was consistent with the process outlined in the RFP and fairly considered proposals from the private and public sectors.

Six proposals were submitted on July 31, 1997, including those from Frontec, Serco and the in-house team.

The submission evaluation process comprised a technical evaluation phase and a financial evaluation phase. Bidders were required to qualify technically before presenting a complete financial proposal. Therefore, the first submission was to consist of a technical proposal and related costing information, but not a price.

Clause 2.1 of Annex C to the RFP provided that the technical evaluation of proposals would be conducted in five stages, as follows:

2.1.1 evaluation against mandatory criteria

2.1.2 evaluation against point-rated criteria

2.1.3 clarification of questionable and unacceptable elements

2.1.4 reassessment against point-rated criteria

2.1.5 final clarification

According to the RFP, the first stage was to involve a “pass-fail” assessment of proposals against the mandatory requirements of the RFP. No individual scoring was to be involved.

The second stage, evaluation against the point-rated criteria, was to involve two aspects. First, as per clause 2.1.2 of Annex C to the RFP, each proposal had to obtain at least 70 percent of the total points available for section 2 of the SOW, “Management and Administration.” Second, each line item of sections 2 to 19 of the SOW was to be assigned points, [8] totalling 15,725. The weighting of the individual line items was established by the Evaluation Team prior to the review of the proposals, in accordance with the principles set out in Article 4.1.2.B of Annex C to the RFP. For each section of the SOW, four general evaluation categories could be applied: Experience; Personnel & Resources; Management; Method of Operations. Because each line item belongs to one of the four categories, a score under each category could be tallied on the basis of the aggregate scores for all line items belonging to each category. [9]

Each bidder’s response to all the line items of the 18 sections of the SOW was then to be rated as a) exceptional, b) acceptable, c) questionable or d) unacceptable. Exceptional solutions (fully satisfactory solutions with potential for financial savings to the Crown) and acceptable solutions (fully satisfactory solutions) would receive the maximum number of points assigned to the line item. Questionable solutions would receive only half the maximum points for the line item and no points would be given for unacceptable solutions.

Following the scoring of a bidder’s response to all line items in the various sections of the SOW, the points earned in each section would be totalled and compared to the maximum number of points allowed for the said section to determine the bidder’s percentage score. Scores for each general evaluation category would also be computed and converted into percentage scores.
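The rating-to-points rule and the percentage computation described above reduce to simple arithmetic, which can be sketched as follows (an illustration only: the rating fractions come from the RFP description above, while the function name and sample figures are hypothetical and not drawn from the actual evaluation):

```python
# Illustrative sketch of the line-item scoring described in the RFP:
# exceptional and acceptable solutions earn the full points assigned to a
# line item, questionable solutions earn half, unacceptable solutions none.
RATING_FRACTION = {
    "exceptional": 1.0,
    "acceptable": 1.0,
    "questionable": 0.5,
    "unacceptable": 0.0,
}

def section_percentage(line_items):
    """line_items: list of (max_points, rating) tuples for one SOW section.
    Returns the bidder's percentage score for that section."""
    earned = sum(pts * RATING_FRACTION[rating] for pts, rating in line_items)
    available = sum(pts for pts, _ in line_items)
    return 100.0 * earned / available

# Hypothetical section with three line items (figures are illustrative):
items = [(100, "acceptable"), (50, "questionable"), (50, "unacceptable")]
print(section_percentage(items))  # 62.5: (100 + 25 + 0) out of 200
```

The same aggregation, applied across the line items belonging to each of the four general evaluation categories rather than to each section, would yield the category percentage scores.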

The third stage (also referred to as Round 1 of the evaluation) was to involve a comparison of the quality of proposals to one another in order to identify those proposals offering the best technical solution and those obtaining a low technical score. [10] Proposals that obtained a low technical score as frequently as or more frequently than any other proposal, and that also obtained a low technical score in at least three of the four general evaluation categories, were to be eliminated from further consideration and rated “unsuccessful.” According to the Department, this was the case for one bidder, not Frontec.

In accordance with clause 2.1.3 of Annex C to the RFP, the Technical Evaluation Team was to prepare a list of questions and observations for each element of a bidder’s proposal rated as “questionable” or “unacceptable,” for subsequent submission to the remaining bidders. Questions to bidders were also to be prepared by the Financial Evaluation Team. As provided by the same clause, every remaining bidder was then to be offered, upon request, the opportunity to meet with the Evaluation Team and officials from the Department in an individual three-hour face-to-face meeting to clarify these questions prior to preparing written responses. The bidders would subsequently submit written answers. In addition, the Evaluation Team reserved the right to ask any general questions regarding the overall proposal, whether or not these questions might affect the technical scores.

The fourth stage (also referred to as Round 2 of the evaluation) involved a reassessment of the proposals against the point-rated criteria based on the clarifications obtained from the bidders in their written responses. Clause 2.1.4 of Annex C to the RFP specified that the evaluation rule for the reassessments was to be based on a more stringent definition of a low technical score, namely, 5 percentage points lower than the highest technical score, as opposed to the 10 percentage point rule in stage three of the evaluation process.

Frontec’s proposal was rated as unacceptable overall for obtaining a low technical score in ten sections [11] and in three general evaluation categories after the reassessment against point-rated criteria. The three general evaluation categories were (1) Experience, (2) Management and (3) Method of Operations. According to the Department, after the completion of Round 2 of the evaluation, the four remaining proposals were assessed as acceptable.

According to the Department, the elimination of Frontec’s proposal during Round 2 of the evaluation was based on the responses provided by Frontec to the questions raised by the Evaluation Team. These responses revealed that Frontec expressed certain work force adjustment (WFA) cost savings in its original proposal by inserting negative figures into its spreadsheet. Given that the spreadsheet package was not designed to record negative figures, this resulted in a significant overstatement of the number of resources apparently proposed by Frontec to perform the work. According to the Department, this overstatement was not detected by the Technical Evaluation Team as Frontec’s overall employee number, quoted in error at 328, was in line with that of other bidders. Given that Frontec relied in its proposal on significant multi-tasking of staff, a “management pool” of some 70 employees not specifically allocated to any enumerated activity did not seem inappropriate to the Department and DND. Therefore, the initial technical evaluation of Frontec’s proposal, at Round 1 of the evaluation, had been done on the basis of an inflated staff number.

At the face-to-face meeting between Frontec and the Evaluation Team, it became apparent that the negative figures in Frontec’s proposal reflected the inclusion of job offers to affected employees. However, these job offers were in respect of work that was outside the scope of the proposal. Department officials informed Frontec that employment offers for work unrelated to the contract would not qualify under the Treasury Board Secretariat’s WFA policy as these would not generate cost savings to DND.

For Round 2 of the evaluation, Frontec’s proposal, with the clarifications requested, still included its WFA solution, but removed the negative numbers from the spreadsheets. Therefore, on the basis of the clarifications received from Frontec, the reassessment against point-rated criteria was based on Frontec’s revised actual resource allocation of 278, compared with 328 contained in its original proposal. This reassessment resulted in lower scores for the “Management” general evaluation category and in inadequate ratings for the Aviation Weather Services, Food Services, Airfields, and Roads and Grounds Maintenance Services sections of the SOW.

On September 24, 1997, the Department advised Frontec that its proposal was considered “unsuccessful” for having obtained a low technical score in ten sections and three general evaluation categories of the SOW.

Under cover of a letter dated September 26, 1997, Frontec submitted, in an unsolicited fashion, a one-page “Total Cost to Government” proposal to a DND senior official.

On September 30, 1997, Frontec requested an extension of the closing date for receipt of proposals until such time as an independent review of the Tabular Format evaluation methodology could be completed. On October 1, 1997, the Department advised Frontec that an extension of the closing date was not possible and that action had been initiated to have an independent third party review the evaluation process. The same day, Frontec submitted a letter containing its price proposal to the Department’s Bid Receiving Unit. When the contracting officer picked up the proposals following the 2 p.m. closing time, the letter from Frontec, dated September 26, 1997, referred to above, was among the proposals received. It was returned to Frontec on October 2, 1997.

On October 2, 1997, having received final technical proposals and complete financial proposals with all costs from the four remaining bidders, the Technical Evaluation Team completed the assessment of the proposals. According to the Department, all four bidders were assessed as technically successful.

An independent review of the evaluation process, along terms agreed to by the Department and Frontec, was conducted by the auditing firm of Ernst & Young. This firm was chosen because of its knowledge of and experience in Tabular Format evaluation methodology within the DND environment.

Ernst & Young submitted its final report to the Department on November 12, 1997. The report concludes that the procurement process was conducted fairly and in accordance with the terms of the RFP.

On December 15, 1997, the unsuccessful bidders were advised that a contract in the amount of $135,905,361 (15 percent GST included) had been awarded to Serco.

VALIDITY OF THE COMPLAINT

Frontec’s Position

In its comments, Frontec submits that the GIR asserts contradictory positions in respect of the evaluation methodology, which cannot both be true. In addition, one of these positions violated the Tabular Format evaluation methodology and the other was impossible to implement. Frontec submits that the Department attempts to justify the process which it followed by citing “independent observers” and a third-party report. However, Frontec submits, one of the observers was a former Lieutenant-Colonel knowledgeable in the Alternative Service Delivery process, who returned to the Canadian Forces immediately after the procurement, and the consulting report accepted at face value the contradictory assertions of the Department and DND about the evaluation methodology.

Frontec also submits that the Department attempted to justify the results of the evaluation by affirming that (1) Frontec’s solution fell below staffing requirements which, Frontec submits, indicates a bias arising from a pre-conceived staffing solution in the minds of the evaluators; (2) its innovative WFA solution was invalid, yet Frontec has written evidence that similar proposals met Government policy; and (3) Frontec’s costs were not the lowest, yet the Department awarded the contract to Serco, apparently without considering or valuing the technical risk of its WFA solution.

Frontec further submits that it is apparent from the GIR that its proposal was eliminated because of a wholesale rescoring of its proposal following the submission by Frontec of responses during Round 1 of the evaluation. This wholesale rescoring, Frontec understands, was justified in the mind of the Department because, in Round 1 of the evaluation, its proposal had been given the “benefit of the doubt” regarding the number of personnel that it was proposing for the operation and maintenance of 5-Wing Goose Bay military airfield.

Frontec submits that introducing such a subjective consideration into the Tabular Format evaluation methodology is in direct contradiction with the said methodology, which is represented as an objective evaluation method immune from subjective interference. Frontec submits that this explanation simply is not credible. Indeed, how could the Department equate the 35 additional employees proposed by Frontec to a pool of 70 management employees not specifically assigned to enumerated activity, but which could be relied upon, if required, to perform enumerated tasks, when each of those 35 additional employees was identified by job classification, none of them being management and most of them being unqualified to perform crash fire rescue and traffic control type activities? No training had been identified in Frontec’s proposal to change these qualifications. In addition, Frontec submits that there was no way for any of the technical evaluators to know if any of these so-called “management pool” employees had been previously allocated or relied upon by other technical evaluators in the course of their evaluation.

On the question of the acceptability of Frontec’s WFA solution, Frontec submits that the Department’s position in the GIR is only supported by an internal DND memorandum which references hearsay evidence of an unnamed consultant working for the Treasury Board Secretariat. This memorandum, dated October 8, 1997, and therefore prepared days after Frontec had been excluded from the procurement on September 24, 1997, was never shown to Frontec and, according to Frontec, constitutes at best an ex post facto papering of the file. Moreover, Frontec submits that it had, at the time of submitting its responses to questions on September 15, 1997, received the advice of senior government officials that its proposed innovative hiring solution was a valid strategy for reducing the Government’s liability under the WFA policy.

Frontec submits that the GIR can be characterized as follows: The Department and DND evaluators got confused during Round 1 of the evaluation because of Frontec’s innovative WFA solution. This confusion was resolved initially by giving the “benefit of the doubt” to Frontec’s proposal during its evaluation which, later on and in turn, led to a wholesale rescoring of its proposal. Essential to the Department’s position, Frontec submits, is the further contention that, with the removal of the confusion about the role of the additional employees, Frontec had inadequate staff to do the job.

Concluding its general observations, Frontec submits that it is not asking the Tribunal to substitute its judgement for that of the evaluators. It asks the Tribunal to draw the obvious conclusion that its proposal, despite showing directly related experience, a lower risk staffing plan and an innovative solution for WFA, was eliminated as “technically unqualified.” Finally, Frontec asks the Tribunal to see through the Department’s attempt to portray the Tabular Format evaluation methodology as both an objective evaluation methodology and one which, in this case, through the application of the Department’s own subjective discretion, produced disparate results in two different scorings of the same proposal.

Concerning the rescoring of its proposal, Frontec submits that the Department asserts contradictory positions and that neither position is credible. Indeed, if the line-by-line evaluation was affected by staffing levels, as indicated by the Department in the GIR at paragraph 27c) and as conveyed to Frontec by the Department during its face-to-face meeting, this violates the established and documented Tabular Format evaluation methodology, since staffing levels are irrelevant at the line item evaluation level. Alternatively, if one assumes that the line-by-line item evaluation was not affected by staff levels, as the Department now asserts, then all the concerns that the Department and DND had about Frontec’s ability to perform satisfactorily must have been contained in the line-by-line questions received in Round 1 of the evaluation. Frontec’s answers to these questions were largely acceptable to the Department and, on this basis, Frontec submits, a rescoring based only on those questions would have resulted in Frontec progressing to the final phase of the evaluation.

Moreover, Frontec submits that the “management pool” is a construct of the evaluators. Section 1, “Executive Overview,” of its proposal states that the offer of Type 2 employment to 35 employees, in addition to those proposed for the efficient steady-state operation of the 5-Wing Goose Bay military airfield, was to be achieved through its other growing operations in the local area. Accordingly, there would have been no need to develop this construct and to attribute the resources of the “management pool” to line items, on some basis. Further, there was no basis for such an assignment, no specific process established for individual evaluators to factor these “pool” resources into the evaluation of the line items and no methodology for controlling the process.

Frontec also submits that under the Tabular Format evaluation methodology, there is no room for staffing levels. What counts is “labour hours” (not to be equated with “human resources” or “staff levels”) which are converted on a yearly basis to full-time equivalents (FTEs) by dividing total labour hours by productive hours per employee. The total number of FTEs per section and overall represents the minimum number of employees required to accomplish the work as proposed. Frontec asserts that its FTE total in Round 1 of the evaluation was 229.4. According to Frontec, the primary purpose of the FTE total is to serve as a measuring stick against which to assess a bidder’s actual staff proposal. The actual staff proposed cannot be less than the calculated FTEs required to do the work. In this context, Frontec submits that its original proposal showed 327.8 employees versus 229.4 FTEs and that even without the 70 “management pool” its offer of 258 employees was still 29 employees above the FTE requirement.
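The FTE comparison that Frontec describes can be sketched as simple arithmetic (the FTE and staffing figures are those cited in Frontec's submission above; the labour-hour and productive-hour figures used to illustrate the conversion are hypothetical, since the RFP values are not reproduced here):

```python
# Sketch of the FTE calculation Frontec describes: total proposed labour
# hours divided by productive hours per employee yields full-time
# equivalents, the minimum staff needed to accomplish the work as proposed.
def ftes(total_labour_hours, productive_hours_per_employee):
    return total_labour_hours / productive_hours_per_employee

def staffing_margin(staff_proposed, fte_requirement):
    """Positive margin means the actual staff proposed exceeds the
    calculated FTE requirement, as the methodology demands."""
    return staff_proposed - fte_requirement

# Figures from Frontec's submission: 229.4 FTEs in Round 1; 327.8 employees
# in its original proposal; 258 employees without the 70 "management pool".
fte_requirement = 229.4
print(round(staffing_margin(327.8, fte_requirement), 1))  # 98.4
print(round(staffing_margin(258, fte_requirement)))       # 29, as Frontec asserts
```

On these figures, both staffing levels sit above the calculated FTE requirement, which is the point Frontec draws from the comparison.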

On the question of the report conducted by Ernst & Young, Frontec submits that the report did not investigate evidence indicating that there was a preconceived staffing level, that it accepted the Department’s and DND’s statements at face value and that it did not look into the issues that it identified about perception of bias in the evaluation process.

Turning to procedural issues in the evaluation of its offer, Frontec submits, in part, that the re-evaluation of its proposal was made substantially by the four members of the Financial Evaluation Team without the real assistance of the technical experts. This is confirmed by the Ernst & Young report. Further, Frontec submits that, for example, section 6, Transient Servicing and AMU, of its proposal was proposed to be undertaken as a subcontract and therefore could not have been affected by any clarification on staffing levels elsewhere in the proposal. Yet, section 6 was re-evaluated, new questions were asked and its score dropped from 100 percent to 88 percent, resulting in a low technical score for this section. Frontec also adds that, since it made available, in its clarification responses to Round 1 of the evaluation, 35 extra employees beyond the project steady-state requirements to provide support through transition and to provide “performance assurance” during the life of the contract, this effectively raised its staffing level from 277 (priced in its proposal) to 312. In this context, Frontec submits that its effective staff level of 312 cannot be “so much below requirements/expectations” as to raise significant doubt about increased risk or disruption. Moreover, if the Department expected a particular staffing level, this is a bias in the process. Indeed, if a minimum staff requirement existed, this information should have been made available to all bidders.

Concerning the independent observers, Frontec submits that they were not independent, since one of the two played a critical role in the development and application of the Alternative Service Delivery process in DND. Frontec also assumes that his associate would have been exposed to his views. As well, it would appear that these observers were not present during Round 2 of the evaluation.

Finally, Frontec submits that the Department’s assertion that section 2 was evaluated only to the level of the site manager is not credible. Indeed, Annex C requirements for information in section 2 go well beyond the site manager level and, if one received a very high score in the evaluation of section 2, as Frontec did, then it is not possible to fail in 10 functional sections where the management structure and staffing were exactly the same.

In its final comments, Frontec submits that the Department’s response to the Tribunal’s question concerning the manner in which the Department organized itself to give the “benefit of the doubt” to Frontec’s proposal during its evaluation supports its contention that its proposal was evaluated from a staffing level perspective rather than the “labour hour” perspective used in the Tabular Format evaluation methodology. In addition, according to Frontec, the Department’s response confirms that there was no basis to assign the “management pool” staff to individual line items, that no specific process or method existed to assist the evaluators in factoring these “pool” resources into the evaluation of the line items and that no methodology existed to control the process. In addition, Frontec asserts that the Department’s comments contain contradictory evidence as to whether the “benefit of the doubt” was given to all bidders or, in fact, was given at all, including to Frontec. Indeed, though the Department’s submissions suggest that only Frontec was given the “benefit of the doubt” because of the alleged confusion over the staff levels in its proposal, the report of one of the two independent observers states that there was no evidence of any preferential treatment given to any one proposal.

Concerning the justification sheets produced by the evaluators at the time of the line item evaluation and particularly those relating to the provision of vehicles, Frontec submits that these sheets are incomplete, inconsistent and inaccurate in their contents. It further submits that to the extent that those sheets identified alleged deficiencies in its proposal, the clarification questions that the Department derived therefrom were so general as to hide the Department’s specific concerns, thereby preventing Frontec from truly addressing the issues. Furthermore, Frontec submits that, contrary to the Department’s assertion, the cost of the vehicles was included in its initial response of July 31, 1997. Taken together and considering that the Department failed to produce during the inquiry all the relevant justification sheets, according to Frontec, the above anomalies suggest that the asserted basis for the elimination of Frontec’s proposal is not supportable from the evaluators’ own justification sheets.

Frontec further submits that, without disputing Serco’s experience, which it can only assess from a public record perspective, it is, nevertheless, not credible that Serco’s experience would have been rated higher than Frontec’s in respect of sections 3 through 18 of the SOW.

Concerning the question of the establishment of the lowest overall cost to the Crown, Frontec submits that the proposals should have been evaluated from a risk perspective, not only at the financial evaluation stage, but also at the technical evaluation stage. In this connection, Frontec recognizes that there was no requirement in the RFP that bidders offer in their WFA solutions Type 2 jobs. However, it submits that, because the offering of less attractive Type 3 jobs constituted in itself a greater risk to the Crown, the Department should have taken this risk into consideration in the technical evaluation of Serco’s proposal. In this context, Frontec submits that the transitional measures put into place by the Government since the contract was awarded have had the effect of subsidizing the compensation package offered by Serco, making it more feasible. Concluding on this point, Frontec contends that its latest WFA proposal to assign an additional 35 affected Government employees to conduct contract performance assurance work, clearly related to and co-located with the activities at 5-Wing Goose Bay military airfield, should not have been summarily dismissed by the evaluators.

Frontec finally submits that the GIR and subsequent observations made by the Department support its view that the Department was expecting a resource level from bidders. This, Frontec submits, is inconsistent with Tabular Format evaluation methodology and, in any event, should have been clearly stated in the RFP.

Department’s Position

The Department submits that the evaluation procedure and applicable criteria for this procurement were clearly presented in the RFP and that the evaluation process was further explained in detail at the May 1997 bidders’ conference. In addition, bidders had considerable opportunity to obtain further clarification during the bidding process. However, only three questions were raised relating to the point-rated criteria that are the subject of the complaint. Noting that the evaluation process was scrutinized by two independent observers and was positively reviewed by Ernst & Young, the Department submits that the role of the Tribunal, in this instance, is to determine whether Frontec’s proposal was evaluated in accordance with the criteria set out in the RFP and that the Tribunal must defer to the judgement of the Evaluation Team on specific scores (see Mirtech International Security Inc. [12] ).

The Department disputes Frontec’s assertions with respect to its experience and its bid price. In respect of experience, the Department submits that this was a mandatory “pass-fail” criterion requiring and authorizing no qualitative assessments. Further, there was no requirement in the RFP that the bidder’s relevant experience had to be related to a Canadian military airfield nor that such experience had to be gained on identical projects. Projects of a similar nature were acceptable. Concerning price, the Department submits that the price proposal submitted by Frontec was unsolicited, was submitted after Frontec was declared unsuccessful by the Department and, therefore, was not considered.

Concerning Frontec’s allegation that the results of the reassessment of the rated criteria during Round 2 of the evaluation are mathematically impossible, the Department submits that the changes in scoring resulted from the responses provided by Frontec to the clarification questions. These responses substantially reduced Frontec’s resource allocation for the project, thereby affecting its scoring of several line items and related general evaluation categories. In addition, the Department submits that neither the independent observers nor the review conducted by Ernst & Young found anything objectionable about the reassessment of Frontec’s offer, as conducted at stage four of the evaluation process.

Frontec alleged that its low scores in the general evaluation categories of Experience, Management and Method of Operations demonstrate that the evaluation methodology was flawed in its design or was deliberately, negligently or mistakenly misapplied. In this respect, the Department submits that the past experience cited by Frontec relates to the “pass-fail” assessment of mandatory requirements and not to the evaluation of line items in sections 2 through 19 of the SOW. The “pass-fail” rule to assess experience was known to Frontec on or about the time the RFP was issued and the bidders’ conference was held. It is, therefore, too late to raise this issue now. Further, the Department notes that Frontec met the mandatory requirement for past experience (96 percent of all points available) and was, therefore, not prejudiced by its scoring on this count.

Concerning Frontec’s score for the Management general evaluation category, including its solution for Aboriginal involvement, the Department submits again that Frontec is seeking to have the Tribunal substitute its judgement for that of the Technical Evaluation Team to obtain a different score. In addition, the Department submits that Frontec received the full 60 points for its proposal concerning the involvement of Aboriginals and, contrary to its assertions, did not receive a low technical score for section 2 of the SOW.

Concerning the Method of Operations general evaluation category, the Department submits that the onus is on bidders to present clear proposals. In this instance, and after having benefited from a face-to-face debriefing and pointed clarification questions, the Department submits that Frontec’s submission remained obscure and vague in several areas concerning its method of operations, and was scored accordingly. Responses in respect of the resources and procedures allocated to the aviation weather service, the allocation of equipment and resources for snow and ice removal, and the methodology to perform such tasks are examples cited by the Department. Moreover, the Department submits that Frontec is incorrect in stating that only team leaders participated in the reassessment of the point-rated criteria. In fact, the questions put to all bidders after stage three of the evaluation process were prepared based on the observations of the functional experts recruited from across Canada and these questions and observations were discussed between the team leaders and the technical experts before they were put to bidders. The Department submits that, generally, the review of the responses to the questions did not require the participation of the functional experts unless the responses provided did not address the concerns identified. In these cases, the functional experts were consulted prior to finalizing the evaluation results.

Concerning Frontec’s allegation that the evaluation methodology was flawed in design, as demonstrated by the fact that Frontec scored 96 percent on section 2, Management and Administration, yet obtained a low technical score in the Management general evaluation category, the Department submits that Frontec made an erroneous connection between those two scores. The first deals with a specific section of the SOW while the second concerns a general evaluation category which aggregates marks obtained for line items from various sections of the SOW.
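The distinction the Department draws can be illustrated with a minimal sketch. All section names, category assignments, weights and scores below are hypothetical and invented for illustration; the actual Tabular Format weights and line-item assignments are not on the record here. The point is only that each line item counts toward both the SOW section in which it appears and the general evaluation category to which it relates, so a high section score and a low category score can coexist.

```python
# Hypothetical illustration: each line item carries points toward both the SOW
# section it belongs to and the general evaluation category it relates to.
# A bidder can therefore score high on one SOW section yet low on a category
# that aggregates items drawn from many sections.
line_items = [
    # (SOW section, general evaluation category, points available, points earned)
    ("2. Management and Administration", "Management", 100, 96),
    ("12. Food Services",                "Management",  80, 20),
    ("4. Aviation Weather Services",     "Management", 120, 30),
]

def _percent(pairs):
    """Percentage of available points earned over a set of (available, earned) pairs."""
    available = sum(a for a, _ in pairs)
    earned = sum(e for _, e in pairs)
    return round(100 * earned / available, 1)

def section_score(section):
    """Score for one SOW section: only line items in that section."""
    return _percent([(a, e) for s, _, a, e in line_items if s == section])

def category_score(category):
    """Score for a general evaluation category: line items from all sections."""
    return _percent([(a, e) for _, c, a, e in line_items if c == category])

print(section_score("2. Management and Administration"))  # 96.0
print(category_score("Management"))                       # 48.7
```

Under these invented numbers the bidder scores 96 percent on section 2 yet under 50 percent on the Management category, because the category also aggregates weaker line items from other sections.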

The Department further submits that Frontec’s proposal was evaluated in detail and was found to be lacking in resources to adequately perform certain functions. This lack of resources, the Department submits, raised doubts about Frontec’s ability to ensure the realization of client activities, particularly during peak periods or unexpected operations and emergencies. Moreover, the Department denies Frontec’s allegation that its proposal was not evaluated line item by line item. On the contrary, the Department submits that, when Frontec re-submitted its proposal to show 245 FTEs instead of 229 and an actual labour strength of 277 staff members instead of 328, the acceptability of its proposal dropped in a number of line items, thereby affecting its section rating as well as the general evaluation criteria rating.

Concerning the opening of the price proposal submitted by Frontec, the Department submits that the price proposal submitted on October 1, 1997, at the Bid Receiving Unit was returned to Frontec on October 2, 1997, without copies of it being retained by the Department. In addition, the Department submits that, if Frontec had any concern regarding the disclosure of its proposed price, it should not have submitted it to DND five days prior to the bid closing date through a non-secure and uncontrolled channel.

In its final comments, the Department submits that, contrary to Frontec’s allegation, the Tabular Format evaluation methodology is designed to consider all resources (labour, material, equipment, money, etc.) and assesses their use at many levels. Evaluators were trained to use the SRAL/GRAL spreadsheets in order to consider all this information in conducting their evaluation. The Department also submits that no preconceived staffing level was set and that each line item was assessed on its own merit to determine whether the actual labour hours proposed by the bidders were sufficient to do the work, according to the means and methodology proposed by bidders. The Department submits that, in cases where bidders indicated that they would rely on multi-tasking or the cross-utilization of staff to perform certain functions, the technical evaluators considered the bidders’ total approach. For this reason, the Department submits that, as appropriate, evaluators considered all resources, including staffing resources available in other sections when evaluating the acceptability of line item responses. The Department submits that the use of the term “benefit of the doubt” in the GIR was intended to convey that evaluators, though expected to use their best judgement to assess individual line items, were allowed to score line items as “acceptable” or “questionable” rather than “questionable” or “unacceptable” where a bidder failed to clearly “road map” the resources in support of a specific item.

Concerning the risk to the Crown of the various WFA solutions, the Department submits that this risk was evaluated in the course of the financial evaluation and through the requirement for a bidder to have 70 percent of its proposed workforce made up of affected government employees. In this context, the Department asserted that the Crown’s liability in respect of affected employees under the Treasury Board Secretariat’s WFA policy was lower for Type 1 and Type 2 job offers than for Type 3 job offers. In any event, the Department submits that the full amount of the Crown’s financial liability arising from the various WFA solutions offered by bidders had to be factored into their proposals and these amounts were fully considered at the time of financial evaluation.

Concerning Frontec’s allegation that certain government transitional measures constituted financial support for Serco’s proposal, the Department submits that the isolation post allowance and the housing transition measures do not support or otherwise financially benefit Serco. These measures were not anticipated at the time the RFP was issued nor contemplated during bid evaluation and contract award.

In respect of the WFA issue, the Department submits that there are two separate aspects to this issue, i.e. a resource aspect and a Crown liability aspect. Regarding the resource aspect, the Department submits that, in its proposal, Frontec stated under “local hiring” the following: “Going beyond, the company has committed to offering Type 2 employment to 35 employees in addition to those proposed for the efficient steady-state operation of the Base, and exclude the cost of these employees from its bid.” The Department submits that this statement clearly implies that these 35 employees were additional to the staffing level indicated in Frontec’s spreadsheets. The Department further submits that the second aspect deals strictly with the Crown’s liability and would only have affected Frontec’s financial proposal had its technical proposal been successful. In any event, the Department submits that, though innovative, Frontec’s WFA solution plainly did not comply with the Treasury Board Secretariat’s WFA policy and, therefore, could not reduce the Crown’s liability under that policy. This ruling, the Department submits, was obtained prior to the selection of a contractor, was consistent with the ruling verbally given to Frontec at the face-to-face August 1997 meeting and had no bearing on the assessment of Frontec’s technical proposal.

The Department also submits that Frontec’s assertion that, within the Tabular Format evaluation methodology, the term “resources” means “labour hours” or “equipment,” is quite simply wrong. A proper evaluation, the Department submits, cannot consider one aspect of the resources without seeing how they all fit together. For this reason, labour hours that are acceptable at the line item level can roll up to an unacceptable total FTE level. In addition, the Department submits that, where a bidder failed to clearly “road map” its resources in its proposal, evaluators were instructed to request clarification through questions. Where responses to such questions were received, the Tabular Format evaluation methodology permitted a re-evaluation of any and all line items impacted by a clarification and this at any stage of the evaluation process. In this context, the Department admits that line items scored as “acceptable” in Round 1 of the evaluation were scored as “questionable” or “unacceptable” in Round 2 of the evaluation.

The Department further submits that the evaluators were clear on the classification of the 35 employees included in Frontec’s initial WFA package. The only issue relative to these individuals was whether the Department could consider the job offers to these individuals as “reasonable job offers” under the Treasury Board WFA policy and, by way of consequence, reduce the Crown liability.

The question of the discrepancy between actual staff level versus the FTE level in Frontec’s proposal raises a different issue. The Department submits that, for the sake of convenience, evaluators in Round 1 of the evaluation, i.e. prior to receiving any clarifications from Frontec, labelled these resources “management pool.” Whether Frontec intended to have a pool of skilled employees that management could draw upon, when needed, to satisfy different work requirements or whether it intended to have a pool of skilled management members, which could be assigned to a multiplicity of functions, was for Frontec to explain and clarify. The Department submits that, in this instance, the Tabular Format evaluation methodology worked exactly as expected, by targeting an issue that raised questions in the minds of evaluators. Rather than assigning “unacceptable” scores in Round 1 of the evaluation, evaluators instead assigned “questionable” scores, raised questions and, on the basis of Frontec’s responses, re-evaluated Frontec’s offer.

Concerning the extent of the rescoring evaluation in Round 2 of the evaluation, the Department submits that there was no “wholesale” rescoring of Frontec’s proposal. Indeed, only line items impacted by questions and answers were rescored, as well as those line items which Frontec modified on its own initiative. Finally, concerning the re-evaluation of portions of section 6 of Frontec’s proposal, the Department submits that the re-evaluation was required because Frontec changed them between Round 1 and Round 2 of the evaluation.

For the above reasons and taking into consideration the fact that, even if Frontec’s proposal had been technically successful, the contract would still have been awarded to Serco, the Department submits that the complaint should be dismissed and requests its costs of defending this complaint.

TRIBUNAL’S DECISION

Section 30.14 of the CITT Act requires that, in conducting an inquiry, the Tribunal limit its considerations to the subject matter of the complaint. Furthermore, at the conclusion of the inquiry, the Tribunal must determine whether the complaint is valid on the basis of whether the procedures and other requirements prescribed in respect of the designated contract have been observed. Section 11 of the Regulations further provides, in part, that the Tribunal is required to determine whether the procurement was conducted in accordance with the requirements set out in the AIT.

The Tribunal notes, at the outset, that certain of Frontec’s allegations pertain to the evaluation methodology set out in the RFP. Pursuant to Section 6 of the Regulations, a complaint must be filed with the Tribunal no later than ten working days after the day on which the basis of the complaint became known or reasonably should have become known to Frontec. Without considering the merits of Frontec’s allegations that the evaluation methodology set out in the RFP was flawed in design and that the evaluation criteria were unclear, the Tribunal is of the view that Frontec had ample time and opportunities to raise these matters with the Department or the Tribunal before the bid closing date. The Tribunal notes that not only were the evaluation methodology and criteria described at length in the RFP, but these were reviewed in depth during the May 12 to 16, 1997, bidders’ conference which Frontec attended. The Tribunal is satisfied that, by bid closing time, Frontec had or should have had a reasonable understanding of the evaluation methodology and criteria and, on this basis, should have filed any complaint that it might have had in these respects. This was not done. In addition, Frontec’s allegation that one of the observers was biased is also not timely. Frontec was given an opportunity early on in the procurement process to voice any objection that it might have in respect of the observers proposed by the Department. Frontec raised no such objection at the time and, therefore, cannot do so now.

With respect to whether Frontec’s proposal was evaluated in accordance with the evaluation methodology set out in the RFP, the Tribunal notes that, though bidders were required to indicate for each requirement of the SOW (line item) the number of direct labour hours required annually to perform these requirements, nothing in the RFP prevented evaluators from considering all resources in their assessment of the line items. On the contrary, evaluators were supposed to consider all the resources offered by the bidders in support of any particular function. This was conveyed to bidders during the bidders’ conference at which time it was clearly stated that the technical proposal would be evaluated taking into consideration the resource allocation in the SRAL and GRAL spreadsheets. [13] The same clarification is included in Exhibit 3 of the GIR, entitled Tabular Format Procurement System, wherein it is stated at page 4, under Technical Evaluation that: “For each criteria [line item] assigned to an evaluator, the evaluator will review the bidders’ response, including the hours, materials, equipment and staffing input in the linked spreadsheets, if applicable, and scores the response as either exceptional, acceptable, questionable or unacceptable” [Emphasis added]. The Tribunal also notes that nothing in the Tabular Format evaluation methodology set out in the RFP prevented evaluators from reassessing any line item(s) affected by a supplier’s response to a Department’s clarification question in respect of a specific line item.

Frontec’s technical proposal, including its spreadsheets stripped of all financial information, was considered by the evaluators. At the time, it was noted that a significant discrepancy existed between the actual staff level proposed by Frontec, 327.8, and its proposed FTE level of 229.4. The bulk of the discrepancy was reported in Frontec’s spreadsheets under section 2, Management and Administration. In attempting to find an explanation for this difference, it was concluded that, since Frontec relied extensively in its proposal on multi-tasking (the same staff assigned to many line items), it was reasonable to think that Frontec had built into its proposal a management reserve. At the time, this interpretation appeared reasonable to the Department and DND, since the overall staff number proposed by Frontec was of the same order of magnitude as that proposed by other bidders. In practice, this translated into higher scores being assigned to a number of line items in Frontec’s proposal during Round 1 of the evaluation.

Frontec objects to this approach by the Department and DND, arguing that they should have known from the executive summary in its proposal that this so-called “management pool” was, in fact, comprised of some 35 unskilled resources, which were part of Frontec’s WFA solution and were to be assigned to Frontec’s operations in the local area, not to the 5-Wing Goose Bay military airfield. In addition, Frontec submits that there was no method developed and used by the Department to apportion the “management pool” resources among line items in an orderly, fair and controlled manner, thereby introducing a significant element of subjectivity into a methodology purported to be highly objective.

The Tribunal observes that the Department and DND acted reasonably when they concluded that the actual staffing level originally proposed by Frontec was 327.8 and that Frontec’s WFA proposal was additional to that staffing level. Indeed, Frontec clearly stated in its proposal that its offer to use 35 employees in “its other growing operations in the local area” was “in addition to those proposed for the efficient steady-state operation of the Base.”

On the question of the Department and DND’s apportionment of Frontec’s “management pool” to various line items for evaluation purposes, the “benefit of the doubt” issue, the Tribunal is of the opinion that this introduced a measure of subjectivity into the process. The Tribunal, however, is satisfied that Frontec was not prejudiced by this approach. In fact, it likely benefited from the approach taken. In its evaluation report, at page 28, Ernst & Young, commenting on Frontec’s line item scoring during Round 1 of the evaluation, states that “the evaluators rated each line item by considering the resources available in the management pool. Had they not done so, the proposal would have likely been evaluated as unsuccessful.”

The Tribunal observes that the Department and DND were faced with a conundrum, which was to evaluate Frontec’s line items strictly, thereby scoring Frontec’s proposal as unsuccessful, or to provide Frontec with an opportunity to clarify its proposal. In the opinion of the Tribunal, the Department and DND did not act unreasonably in following the clarification route. However, the method to do so should have been more transparent and controlled, and a detailed account of all decisions made in this respect should have been kept.

With respect to Frontec’s allegation that the Department and DND improperly conducted a “wholesale” rescoring of its proposal by re-assessing line items previously declared acceptable during Round 1 of the evaluation, the Tribunal determines that this is not supported by the record of the evaluation process by the Department and DND. After the completion of Round 1 of the evaluation, Frontec met face to face with the Department and DND, as was provided for in the RFP, to go over the clarification questions before Frontec submitted its written response to these questions. The Tribunal is satisfied that Frontec was informed at the face-to-face meeting that, if it revised its resource allocation, its proposal would have to be reviewed to reflect the new situation and scored accordingly.

In its clarification responses, Frontec revised its proposal to show a complement of 245 FTEs and an actual labour strength of 277. The Tribunal is satisfied that the Department and DND were entitled to reassess any and all line items impacted directly or indirectly by such clarifications or by any other changes introduced by Frontec in its proposal. In this context, the Tribunal is satisfied that changes made by the Department and DND in the scores of certain line items in section 6 of the SOW of Frontec’s proposal are supported by changes that Frontec initiated on its own in respect of the said line items. It should be noted that the Tribunal is not agreeing or disagreeing, for that matter, with the specific ratings assigned by the Department and DND. This is a matter of judgement by technical experts. Nevertheless, the Tribunal is satisfied that the RFP permitted such a re-evaluation.

Regarding Frontec’s allegation that it offered in its proposal an innovative WFA solution, which the Department and DND should have accepted, the Tribunal notes first that the risk involved in the various WFA solutions submitted by bidders was not to be assessed as part of the technical evaluation, but rather during the financial evaluation. Accordingly, this issue is irrelevant in determining whether Frontec’s proposal was properly determined to be technically unsuccessful.

Concerning Frontec’s allegation that the Department and DND discriminated against it during the evaluation process, particularly in the alleged use and application of a preset resource level not contained in the RFP, the Tribunal is of the view that no evidence exists to support this contention. The nature of the Tabular Format evaluation methodology, the structure of the evaluation process, the very number of people involved in the evaluation process, the method of scoring proposals individually and of assessing their relative merit in the aggregate, as well as the systematic disconnect that existed between the technical evaluation and the financial evaluation, are all factors which, in the opinion of the Tribunal, make it unlikely that the Department and DND could have systematically discriminated against any bidder.

In the opinion of the Tribunal, it would be difficult to invoke any such bias without clear evidence of its existence. The independent observers did not report evidence of such bias nor did Ernst & Young in its report. For its part, the Tribunal has found no evidence that the Department and/or DND were unfavourably disposed towards Frontec.

Concerning Frontec’s allegations that Serco could not possibly rate higher than itself on experience, that its experience in military airfield management and administration is unique in Canada, that the evaluators’ own justification sheets failed to support the rejection of Frontec’s proposal and that, since section 2 of its proposal was rated above 95 percent in both Rounds 1 and 2 of the evaluation process, it is not credible that Frontec obtained a low technical score on the Management general evaluation category, the Tribunal is of the opinion that these allegations are unsubstantiated. It is possible that section 2 of the SOW, Management and Administration, and the Management general evaluation category can be scored significantly differently for the same bidder. As well, the Department indicated that the experience required by bidders to qualify for the project need not have been acquired in Canada or in the administration of military airfields. Experience acquired in similar projects was acceptable. In the opinion of the Tribunal, the evaluators’ justification sheets were not intended to be exhaustive in all respects. Their purpose was to document, at the line-item level, areas of concern which would be raised in the clarification questions put to bidders. In the opinion of the Tribunal, the justification sheets which it reviewed fulfilled this purpose.

Finally, given that Frontec’s financial proposal was not evaluated by the Department and that financial considerations played no role whatsoever in determining that Frontec’s proposal was technically unsuccessful, the Tribunal is of the view that it need not address the costs and financial issues raised by Frontec in its complaint and related submissions.

For the reasons stated above, the Tribunal finds that the Department and DND followed the evaluation methodology and criteria that were clearly set out in the solicitation documents in declaring Frontec’s proposal unsuccessful in Round 2 of the evaluation process.

DETERMINATION OF THE TRIBUNAL

In light of the foregoing, the Tribunal determines, in consideration of the subject matter of the complaint, that the procurement was conducted in accordance with the provisions of the AIT and, therefore, that the complaint is not valid.

On the issue of costs of defending this complaint, the Tribunal is not prepared to award payments to the Department and DND. Although the complaint was held to be invalid, in the Tribunal’s view, there was a reasonable basis for Frontec to bring the complaint in the first instance.


1. R.S.C. 1985, c. 47 (4th Supp.).

2. As signed at Ottawa, Ontario, on July 18, 1994.

3. The Technical Evaluation Team totalled 38 individuals, comprised of the DND team leader and 3 members of the DND Business Review Team who were involved in all stages of the evaluation and 34 technical evaluators with specialized functional expertise. The Financial Evaluation Team consisted of the Department’s contracting officer and two senior financial evaluators from the Department, as well as of a special DND advisor on specific alternate service delivery costing guideline issues.

4. SOR/93-602, December 15, 1993, Canada Gazette Part II, Vol. 127, No. 26 at 4547, as amended.

5. SOR/91-499, August 14, 1991, Canada Gazette Part II, Vol. 125, No. 18 at 2912, as amended.

6. The operation and maintenance of the 5-Wing Goose Bay military airfield was one of the first DND Alternative Service Delivery projects to include the possibility of an in-house proposal. For this reason, a procurement methodology entitled Tabular Format was used in this procurement. Tabular Format was developed by ASC Group Inc., California, United States and, according to the Department, has been used successfully for large multi-service activity contracts for the Defence departments of the United Kingdom, Australia and the United States, where in-house proposals were also considered in direct competition with the private sector.

7. The expression “Direct labour Hours” is defined in section 2 of the SOW, under section 2.A.2.g, as follows: “[h]ours of labour used in actual hands-on work to provide required services excluding supply support, management and administrative support, supervision and other indirect costs.”

8. For example, section 12 of the SOW set out the many line item requirements for operating the Food Services component of the contract. Some line items were weighted as more important than others and were assigned more points. The total number of points available for section 12, Food Services, was 625.

9. For example, some line items under section 4, Aviation Weather Services, are functional requirements relating to the “Method of Operations” general evaluation category. Similarly, there are line items in section 12, Food Services, that represent requirements relating also to the same category.

10. A score more than 10 percentage points lower than the highest technical score achieved by any bidder within a section or a general evaluation category.

11. Sections: 4. Aviation Weather Services; 5. Air Traffic Control; 6. Transient Aircraft Servicing and AMU; 7. Telecommunication Support; 8. Navaids, Radar, Airfield Communications Maintenance; 9. Crash Fire Rescue - Domestic Firefighting; 10. Transport/Maintenance; 12. Food Services; 14. Cleaning/Janitorial; 17. Airfield, Roads and Grounds Maintenance.

12. Canadian International Trade Tribunal, File No. PR-96-036, June 3, 1997.

13. Exhibit 4 to the GIR, Bidders' Conference Minutes A.4 at 16.



Initial publication: May 6, 1998