
Appendix A.  Comments to Draft Report

Office of the Chair

April 17, 2006

TO: Aletha L. Brown
Inspector General

FROM: Cari M. Dominguez

SUBJECT: National Contact Center Evaluation, Draft Report, April 7, 2006

In response to your memorandum of Friday, April 7, 2006, circulating for comment the draft report entitled "The EEOC's National Contact Center: An Evaluation of Its Impacts,"  I have asked for a complete, detailed response to be provided to you on all of the draft report's statements and recommendations.  You will receive comments next week from the Office of Field Programs, which will provide all appropriate data and updates on issues that have been resolved since the earlier time period covered by the draft report.

Initially, though, I have a number of observations to pass along.  I would hope that the final evaluation focuses more carefully on what triggered the establishment of the Contact Center:  the urgent need to improve the EEOC's service to its customers.  That should be the touchstone for evaluating the impact of the Contact Center.  I am pleased that the report indicates that field employees believe that the major purpose of this initiative, customer service, is being advanced by the Contact Center, and that the Contact Center has been picking up calls that had previously been dropped or lost due to inadequate telephone technology.  As noted on page 40 of the draft report, the customer satisfaction index for the Contact Center is above the national average and above the average for the federal government.  The report also notes the substantial increase in e-mail communications with the public and the significant expansion of access for Spanish-speaking and other callers as a result of the Contact Center.  These services not only fulfill government-wide mandates, but, most importantly, enhance our customer service orientation in ways that were not possible before the Contact Center. 

To give just one example, on April 7th I was in Miami appearing on Univision television in a nationally broadcast interview.  That day, the Contact Center successfully handled more than 300 calls from Spanish-speaking individuals seeking EEOC assistance and information -- a significant upsurge from the normal call volume of an average of 60 Spanish language calls on Fridays.  The centralized resources, technological efficiencies, and expanded hours of the Contact Center are what make possible this heightened level of responsiveness to public concerns. 

Unfortunately, the draft report, as currently written, fails to present a balanced picture of how the Contact Center is providing the public with critically needed improvements in customer service that could not feasibly be attained through other means.  The draft report's emphasis on expectations beyond improved customer service, and inclusion of subject areas not intended to be addressed by the Contact Center, not only incorrectly define the purpose of the pilot but demonstrate a misunderstanding of the Contact Center's principal underlying concepts. Even in non-customer service areas where there have been positive developments - such as 30% of offices reporting the ability to redirect resources as a result of the Contact Center - that result is presented as a negative. 

The draft report must be viewed in the context of the Contact Center's development and would be more useful if it would note that the observations made by the evaluators took place early in the Center's operation.  There should be some recognition that when the review began in September 2005, the Contact Center was in its sixth month of national operation.  It was and is a pilot program. The Contact Center has logged hundreds of thousands of contacts, and, it must be remembered, was established on a pilot basis:

  • to develop accurate baseline data on the volume and nature of calls,
  • to develop and refine scripts based upon actual requests for information and their frequency,
  • to develop and refine standard operating procedures and business rules, and
  • to develop the most effective working relationship between the Center and field offices.

Much has been learned thus far in the pilot.  New scripts have been developed and standard operating procedures and business rules have been revised.  Many changes have been made in response to suggestions from the field offices.  The Center in April 2006 is different from that reviewed in September 2005 and will continue to evolve and change throughout the pilot.  While the draft report makes a number of recommendations for improving the implementation of the Contact Center, because the draft report is a snapshot in time, it does not acknowledge that many of the steps recommended have already been taken.  Process improvements have been put into place and other recommendations, such as marketing to increase usage of the Contact Center, were already planned and are scheduled to occur in the near future. 

Not only does the draft report treat the Contact Center as a finished product, but it relies heavily on opinion and anecdotal observation rather than hard data and evaluation of that data.  I am sure that you will find the forthcoming full response to the draft report to be illuminating and useful to the final report.

cc: Vice Chair Naomi C. Earp
Commissioner Leslie E. Silverman
Commissioner Stuart J. Ishimaru
Commissioner Christine M. Griffin
Leonora L. Guarraia, Chief Operating Officer
Headquarters Office Directors
District Directors
Ed Elkins, Project Manager, EEOC National Contact Center
Cynthia Pierre, Director, Field Management Program
Gabrielle Martin, President, National Council of EEOC Locals No. 216

Office of Field Programs (OFP)

MEMORANDUM April 21, 2006

TO: Aletha L. Brown
Inspector General

FROM: Nicholas M. Inzeo, Director
Office of Field Programs

SUBJECT: National Contact Center Evaluation, Draft Report, April 7, 2006

In response to your memorandum of Friday, April 7, 2006, circulating for comment the draft report entitled "The EEOC's National Contact Center: An Evaluation of Its Impacts," the Chair asked that I prepare a complete, detailed response to be provided to you on all of the draft report's statements and recommendations. 

Our first observation is that the report minimizes the significant contribution that the National Contact Center (NCC) is making to improved customer service. When the NCC opened to accept calls nationwide on March 21, 2005, the Commission became dramatically more accessible to the public. Through the NCC, constituents can communicate with the agency in more than 150 languages by telephone, TTY, fax, written correspondence, e-mail and web inquiries and obtain quick, accurate information. The Center is open 7:00 a.m. to 7:00 p.m. Central Time, Monday through Friday, and callers can speak with knowledgeable, EEOC-trained Customer Service Representatives (CSRs) usually within 30 seconds. Frequently Asked Questions (FAQs) posted on the Center's web page and an Interactive Voice Response (IVR) telephone system provide information to the public 24 hours a day.

The NCC's positive impact on the agency's customer service is also demonstrated through the results of an independent customer satisfaction survey conducted in February. The EEOC's NCC received an overall American Customer Satisfaction Index score of 77, which is 6 points higher than the average Federal government score. Additionally, callers to the NCC gave its Customer Service Representatives a CSI rating of 84, an unusually high rating, particularly for an operation still in a developmental stage. Yet the report downplays this success, claiming that there is no pre-NCC baseline, even though the ACSI provides comparative baselines, since it maintains performance data for similar federal agencies.

This impact is further demonstrated when comparing pre-NCC call volume with current NCC activity. Based upon a one-month survey conducted in March 2003, it was estimated that 61% of calls to field offices were for general information or potential charge inquiries that did not require the expertise of EEOC investigators or attorneys, even though many such staff had to handle the calls. Comparatively, between March 21, 2005 and March 31, 2006, the NCC handled 402,383 customer contacts. Of these 402,383 contacts, 302,622 transactions were handled by the NCC staff, and only 89,813, or 29.7% of the contacts handled by NCC staff, were sent to field offices for resolution. In addition, there were 118,332 hits on the NCC-hosted FAQs.

Our second key observation is that the evaluation is fundamentally flawed as it is based on misunderstandings and incorrect assumptions of the intended roles of both the NCC and the EEOC Assessment System (EAS). The evaluators have measured both the NCC and the EAS against standards they were not designed to achieve--namely, neither was intended to supplant the role of the investigator in the intake of charges. Charges are drafted by Investigators or Investigative Support Assistants after in-depth interviews. That the NCC staff is screening out more than 70% of all contacts without involving the field, leaving approximately 30% for field staff to handle, demonstrates that the NCC is achieving its objective.

The NCC evaluation draft report was prepared by a contractor (JPS), which subcontracted with Convergys. We note that Convergys took the lead in evaluating the operation of the NCC. In 2004, Convergys bid on the contract for operating the EEOC National Contact Center and is an industry competitor of Pearson. We believe having Convergys evaluate the operation of the NCC poses a serious conflict of interest. We note that Commissioner Stuart Ishimaru said at a September 17, 2004 Commission meeting:

Finally, Commissioner Miller and I had many talks on this before he left the agency, and he was very proud that he was able to put in the proposal that was voted on by my colleagues last year, a provision that we will have outside review of the call center pilot. And I would hope that we pick the reviewer very carefully, and I would hope that we get a true outside contractor, and not someone who has any ties to call centers, or to any of the contractors who deal with call centers.

The conflict of interest compounds our concerns about the substance of the report, outlined in the attachment, as it likely undermines the independence and validity of the report overall. Furthermore, this report was prepared by the subcontractor at the same time that it was preparing for trial in an ADA case in which the EEOC had found cause and had filed suit. That fact should raise serious concerns not only about the motivation for the report's findings and conclusions but also about the prudence of selecting a vendor, even as a subcontractor, for a project of this magnitude without either ascertaining whether there was any outstanding litigation involving the vendor or disclosing this conflict in the report.

Beyond the serious concerns cited above, the draft report contains many misunderstandings and factual errors that seriously undermine the credibility of its conclusions and findings. Our attached response discusses these in detail, but I will cite several examples here.

Rather than duplicate the 2003 call volume survey, the contractor chose to survey employees about their impressions and opinions. This survey produced little data. Significant in the survey of employee opinions is the clear assumption by EEOC staff that the NCC would conduct intake. The Chair and the Commission did not authorize the NCC to conduct intake. Yet the contractor fails to recognize the disconnect between employee opinions and the intended role of the NCC and draws many conclusions from this erroneous premise. 

Throughout the report, there are references to a 50 percent resolution rate by NCC staff, but the monthly data from March 2005 through March 2006 shows a significantly higher resolution rate: 70.3 percent for calls handled by NCC staff. 

The report contains the note on page 41 that the number of CSRs is constant throughout the day. In fact, the NCC adjusts the number of CSRs throughout the day according to call volume; staffing is tracked for each 30-minute period and is available in reports maintained by the NCC.

On page 49, the report concludes that the NCC should adjust its internal processes to achieve better efficiency, consistency, and quality of service, and claims that the NCC expects the CSRs to be "proficient users of 800 scripts in communicating with the public." Yet this figure is highly inflated and reflects an inaccurate number of scripts. The NCC uses 119 internal scripts (in both English and Spanish), 163 FAQs, and 121 reference database IDs in the Right Now tool. The reference database has contact and coverage information for EEOC field offices and Fair Employment Practices Agencies (FEPAs). In addition, the staff have internal web access to information about other federal agencies, their programs, and EEO counselor contact information. This is a very manageable volume of reference materials, given the electronic mechanisms in place to facilitate quick access and retrieval.

Because there is such a fundamental misunderstanding on the part of the contractor of both the purpose for establishing a National Contact Center and the actual operation of the center, it is clear that the present draft report should be withdrawn. We believe that any subsequent draft must address the many concerns contained in our comments. We are available to meet with you to discuss our concerns and provide any additional information that might be needed.



OFP Comments on NCC Evaluation dated 4/7/06


In its February 2003 report, Equal Employment Opportunity Commission: Organizing for the Future, the National Academy of Public Administration recommended the establishment of a nationwide, toll-free National Contact Center (NCC) be given the Commission's "highest priority" to enhance customer service by providing consistent and accurate information.  With the NCC handling general inquiries which do not require an investigator to answer, investigators would have more time for more substantive tasks. 

Based on the data contained in the draft report and in the quotes from the report noted below, it appears that the NCC has been successful in meeting both of the key objectives.

"CSRs provide customers consistent and accurate information regarding charges and other general inquiries."[36]

"NCC as a Customer Service Initiative--According to one Office Director, the NCC has institutionalized the delivery of customer service and now the customer service orientation happens every day; it is not just a one-time initiative.  The short turnaround requirements by the NCC have heightened timeliness in responding to customers by phone or mail as well as customer service expectations, delivery and follow through."[37]

"Customer Contact--EEOC employees report one benefit of the NCC is that customers have access to a live person on their initial call for more hours of the day, rather than leaving a voice mail and then not receiving a return call until the next day. In addition, the NCC provides timely access to customers since they can contact the EEOC through several channels (e.g. email and fax). Further, each PCP hears immediately from an office by phone or mail. Finally, the NCC provides people who can listen to callers who want to vent their frustrations."[38]

"The Office Directors who reported reduced call volume felt their employees were better able to answer and return phone calls on a timely basis. This has relieved pressure on people answering phones, and now fewer people must answer "cold calls" (initial calls from the public)."[39]

"After removing outliers (people who reported answering more than 1,000 calls in one week), results show relatively larger decreases in number of calls answered in Pay Grades 4 through 7.  There is also a decrease of calls in Pay Grade 12."[40]

Even beyond these findings, there is other information to support the NCC's positive contributions to improved customer service.  In an independent customer satisfaction survey conducted in February 2006, the EEOC NCC received an overall American Customer Satisfaction Index (ACSI) rating of 77, which is 6 points higher than the average for Federal government as a whole.  NCC staff answering the calls received a CSI rating of 84, a score that the draft report acknowledges is rare.

The operations and structure of the NCC make the Commission dramatically more accessible to the public.  Through the NCC, constituents are communicating with the agency by telephone in more than 150 languages, TTY, fax, written correspondence, e-mail and web inquiries.  The Center provides quick, accurate information from 7:00 a.m. to 7:00 p.m. Central Time, Monday through Friday.  Callers can speak with knowledgeable, EEOC-trained NCC staff usually within 30 seconds.  Frequently Asked Questions (FAQs) posted on the Center's web page and an Interactive Voice Response (IVR) telephone system provide information to the public 24 hours a day.

The NCC also assists the agency in meeting other requirements, including the provisions of Executive Order 13166, which requires federal agencies to provide persons with limited English proficiency (LEP) with meaningful access to their services.  Each agency is required to have a Language Assistance Plan for ensuring this access.  The EEOC's NCC increases the agency's accessibility to the LEP community by offering services in 150 languages, and the NCC serves as a cornerstone of the agency's Language Assistance Plan.  In another example, under the President's Management Agenda and H.R. 2458, the "E-Government Act of 2002," agencies are required to expand the use of the Internet and computer resources in order to deliver Government services.  This means that EEOC must be accessible to the public by e-mail, and the NCC fulfills that requirement by handling more than 1,500 e-mails a month and hosting a FAQs site which received 15,440 hits in March 2006.

This context of the NCC's effectiveness is a particularly critical springboard for the specific comments on the report that follow, for it appears that these positive attributes and key functions of the NCC have been lost or omitted from consideration in the report narrative and findings.  Our detailed comments track the outline of the report and cite the referenced text (in italics) from the report, followed by our response and assessment.


Page i:  There were no baseline data to use in evaluating the impact of the NCC on EEOC operations and staff.  Some baseline data were destroyed and other data either had major data entry errors or were confounded by factors such as CSR turnover rate (attrition) and changes in EEOC office intake procedures.  As a result, we were forced to rely on interviews, focus groups and survey data in conducting this assessment.

The NCC evaluation team (JPS) was provided a copy of the 2003 NCC Assessment Report which contained a detailed description of the methodology used in conducting a month-long telephone survey, copies of the original survey instruments, and the detailed instructions for preparation of the telephone logs used in conducting the survey.  The NCC evaluation team also was given detailed tables containing the survey results by office cluster and nationwide.  The baseline data results from the 2003 telephone survey exist and are contained in the detailed 2003 NCC Assessment Report.  The only missing documents were the completed survey forms and the data tape on which the raw data was saved.  The same statisticians who conducted the analyses are still employed at EEOC and were available for interview regarding any statistical methods or formulas used.  In sum, all the ingredients were present to allow the evaluation team to replicate the telephone volume study, if desired, for comparison with the results of the 2003 study.  The decision of JPS to rely only on interviews, focus groups and surveys which relied on respondent opinions, perceptions and recollections is not entirely explicable.

Additionally, JPS acknowledges they were advised that a comparison of metrics on merit factor attainment would be an objective measure of whether the time available to investigators and the quality of investigations improved since the opening of the NCC.  Improved quality of cases is the expected result of freeing up investigator time from answering general calls in order to devote more time to investigations.  As discussed more fully below, the data in the draft indicates that investigators on average receive 11 fewer calls each week.  Merit factor, a measure of investigator resolutions that benefit charging parties, is an objective measure that EEOC has reported to OMB and Congress for many years and has withstood scrutiny of many other outside evaluators.  Merit factor, in fact, increased from 22.2% at the end of the first quarter of FY 2005 to 23.9% at the end of the first quarter of FY 2006.  While an actual cause and effect cannot be determined, the correlation with the operation of the NCC exists and should not be casually dismissed. 

The result of these decisions, however, means that the report is heavily slanted toward opinions and perceptions rather than objective fact and statistical validity.  Perceptions, of course, are very important, but they should be used to explain data, rather than be used as substitutes for objective data gathering.

Page ii:  The NCC is not close to meeting estimates of call volume.  The NCC is handling just one-fifth the projected volume of calls.  The NCC was expected to save 43,224 field staff hours which is equivalent to 21 Full Time Equivalent Employees (FTEs).  We calculated that the NCC presently saves the EEOC approximately 12,504 hours or 6 FTEs.

The above statement contains a number of misunderstandings which require correction.  First, the projected call volume was significantly affected by the decision to allow public calls to continue to come into field offices directly while only calls to the EEOC toll-free numbers went to the NCC.  The projected call volume assumed that all calls would be directed to the Contact Center.  Because the NCC is a pilot project, it was recognized that it needed time to train and develop staff, standardize operations, and work out a relationship with both the public and the EEOC offices it serves.  It was prudent not to immediately cut off the local public lines and direct all calls to the NCC while it was in its developing stages. 

Second, based on an analysis of the calls logged in the March 2003 survey, the 2003 Assessment Report estimated that 61 percent of 1.2 million unsolicited calls were general in nature and could be handled by a contact center.  March 2006 was the busiest month since the Contact Center opened, with 34,792 callers served by the NCC.  In addition, there were 2,228 e-mail customers, 62 persons sending written correspondence, and 64 persons sending facsimiles, for a total of 37,146 customers served, or an annualized number of 444,000.  While this number of contacts is below 732,000 (61% of 1.2 million), it represents more than half the projected volume, and certainly more than the "one-fifth" of expected volume mentioned in the NCC evaluation report.  Also, the monthly contact volume is trending upward as more and more of our offices provide callers the option to call the NCC on their local public telephone greeting. 

In reaching the conclusion that call volume has been low, the report (table 10, page 19) indicates that the average monthly call volume has been around 21,000 calls.  We believe that, in using this number, the report conflates calls handled by the NCC with calls handled exclusively by an NCC employee and equates calls handled by NCC employees with total workload.  The NCC has an average monthly call volume over 30,000.  Moreover, the NCC is a contact center.  It accepts telephone calls, electronic mail, facsimiles and postal mail.  Telephone calls are answered both through an interactive voice response (IVR) network and by NCC employees.  Any person who calls or otherwise contacts the EEOC through the NCC is one less person calling or contacting the EEOC directly.  Thus, the average monthly call volume reported should include all calls, both those answered through the IVR network and those answered by NCC employees, and all contacts with the NCC. 

The focus for the NCC during the first year of operation was on recruitment, training and retention of qualified staff; development and refinement of scripts and FAQs; development of communication and marketing procedures; and development of reports to assist EEOC in decision-making in a variety of arenas.  The evaluation report should make it clear that the purpose of the pilot was to "allow for the collection of refined baseline data on performance metrics and costs during the first 12 months and vendor performance during the second twelve months."[41]  The refined baseline data would provide actual information on the number and types of contacts received at the Contact Center and form a basis for future estimates of call volume. 

Making the necessary changes to the report to accurately reflect the volume of work will also help the report remain internally consistent.  In the next section of the report on the impact on field staff, the report states that 15 offices had already been able to redirect resources as a result of the NCC.  Savings of employees' time in 15 offices, at such an early stage in the operation of the NCC and with all the limitations imposed on the NCC, indicates significant savings in field staff time.  Directors in 15 offices reporting such staff savings also calls into question the estimate reported on page 10 that only 6 FTE had been saved by the NCC.  The difference between 15 offices already reporting that staff has been assigned other functions and your calculation that 6 FTE have been saved illustrates the economies of scale that the NCC is able to achieve.  What in the NCC environment is calculated as a 6 FTE savings is in a field office environment 15 or more people who no longer are required to answer phones.  The savings to field offices will always be greater than the calculation of FTE in the central location. 

The evaluation team further suggests that the NCC was projected to save 43,224 staff hours.  We acknowledge that the decision to retain the local public telephone lines in all EEOC offices had a direct impact on the ability to achieve savings in staff hours.  The evaluation team's calculation of staff hours saved is nonetheless flawed, because its estimate is based only on the percentage of calls resolved under a contact center model.  To obtain a more accurate calculation of staff hours saved, the evaluators should incorporate the volume of calls resolved by the IVR, the number of e-mails resolved and the number of "hits" on the FAQs into their formula and determine how many FTEs would be devoted to those functions in the 51 field offices.  It is reasonable to expect that a substantial number of people get their queries answered using these alternative channels to voice contact.  The projected savings figure is also flawed, since EEOC field offices could not operate with the economies of scale of a contact center. 

Page ii:  Impact of the NCC on Headquarters Operations.

There appears to have been a decrease in controlled correspondence and an increase in other communication such as web hits. However, there are no data to indicate this is due to the NCC.

It was never anticipated that the NCC would have any significant, direct impact on Headquarters' operations.  Processing inquiries and other contacts from potential charging parties is primarily a function of the field offices, not Headquarters.  The report does accurately acknowledge the decrease in complaints from the public regarding their inability to reach an EEOC office by telephone. 

Although there is no direct data to indicate that the NCC played a part in the increase of web hits, it should be noted that the NCC IVR provides the EEOC web site address and the NCC staff who work correspondence and take phone calls also have educated the public on the vast, helpful information that one can receive by visiting this site.  It is, therefore, logical to conclude that the NCC played a part in these increases.

Following meetings with the Chair immediately after the new year in 2006, Commission staff developed an NCC Marketing Plan for the purposes of providing more information to the public and Commission staff about the NCC and increasing the call volume of the NCC. 

A working group developed a plan to market the services provided by the NCC.  The objectives of the marketing campaign are: 

  • To create awareness among target audiences of EEOC's readiness to help whenever employment discrimination is suspected
  • To position 1-800-669-4000 as the preferred method for accessing EEOC.
  • To increase volume of inbound calls to the NCC.
  • To increase confidence in the quality of service delivered by the NCC among EEOC staff members.

The plan includes the following actions:

  • Developing and distributing outreach materials, such as a brochure and single-page handout/flyer.
  • Designing and distributing promotional give-aways, such as magnets (for service providers) and palm cards (like a business card with general information).
  • Developing PSAs and distributing them to radio stations nationwide to be read by on-air talent.
  • Placing ads on Google.
  • Mailing informational letters to various advocacy group national headquarters for distribution to local chapters/affiliates.
  • Listing the 1-800 numbers in the Blue Pages under both "discrimination" and "EEOC."
  • Changing the EEOC website to allow easier access to FAQs.
  • Developing press releases and stories for field offices to send to local print media contacts, particularly ethnic and minority publications. 
  • Redirecting first-time callers to field office local numbers to the NCC.
  • Developing an internal EEOC/NCC newsletter on the staff and services provided at the NCC.

Page ii:  Impact of the NCC on Field Operations

It appears there has been only a minimum reduction in calls to most offices.  There are several reasons for this.  First, the NCC is not taking as many calls as anticipated.  Second, the NCC is referring many calls it does receive to the EEOC offices rather than completing the appropriate form and submitting the information electronically.  Third, most of the offices are still taking unsolicited local calls and referring customers to the NCC only when they receive voice mail and do not want to leave a message.

The issue of NCC call volume was addressed above.  However, there are several explanations, apparently not considered in this conclusion, for why field offices receive calls either directly from the public or on referral from the NCC.  Callers are sometimes referred to EEOC offices because they specifically request the office phone numbers so they can contact the offices directly.  Further, NCC staff are instructed to refer or forward to EEOC offices complex questions for which the NCC staff does not have approved scripts to consult.  There are also callers who insist upon speaking with a field office even though it appears the EEOC does not have jurisdictional coverage or the time limits may have run out.  Under NCC business rules, these callers are referred/forwarded to the field for a determination as to coverage and jurisdiction.

Page ii:  Impact of the NCC on Field Staff

We found minimal impact on field staff. Fifteen of 51 offices (30 percent) have been able to redirect labor resources as a result of the NCC.

This section of the report indicates that the NCC has created little additional time for field staff and is creating inefficiencies.  At the same time the report indicates that 15 offices have already redirected resources as a result of the NCC.  Since the interviews with field staff were conducted in February, less than one year after the NCC became operational, realizing staff savings so soon is quite positive and indicates that the NCC is creating and will continue to create additional time for field staff. 

Field staff commented that the NCC was not reducing the "normal intake process."  (page 24).  This entire section of the Report, which uses the opinions of field staff to form the findings, is based on the expectation that the NCC would relieve field staff of intake duties.  Intake duties are the least favored duties among field enforcement staff.  Their expectation, perhaps their hope, that they would not be required to perform their least favorite activity, is unrealistic and finds no basis in the 2003 report or in the Commission vote.  These unmet, yet unrealistic, expectations serve to emphasize the point, as the Report concludes, that insufficient change management communications have been initiated.  Clearly, a change management plan is needed. 

The unrealized, and unrealistic, expectations of field staff should not be used as the basis for the conclusions in this section of the report.  The NCC was not going to relieve field staff of intake duties.  The report by field staff that duplicate work was being performed - EAS forms completed by the NCC and Form 283 intake questionnaires completed by field staff - illustrates the point. The EAS was not intended to replace Form 283.  NCC staff is not to perform intake work and EEOC field staff sending out intake forms does not duplicate the work of the NCC. 

The discrepancy between what the NCC is intended to handle and the unmet expectations of field staff is most clearly illustrated in the following excerpt from the Report: 

Validating the above EEOC observations, while monitoring calls at the NCC we observed that the CSRs do little probing to understand the validity and nature of the claim.  They leave it completely up to the caller to determine whether his/her charge can be considered discrimination.  (Report, page 30). 

NCC employees are not supposed to probe to determine the validity of a charge.  Such probing is appropriate in intake, not at the NCC.  The conclusions in this section of the Report, based on such a fundamental misunderstanding of the role of the NCC (by both EEOC field staff and the report writer), should be revised or discarded.  Furthermore, when 40% of field employees say that the NCC is creating duplicate work by not screening out more contacts, reporting such conclusions only exacerbates the misperceptions of field staff and the report writer.  While field staff believed that all the information on EAS forms was correct up to 67% of the time (Table 13, page 27), we do not know how high that percentage should be if all the parties understood the expectations for the NCC. 

Page iii: NCC Impact on EEOC Customers 

The EEOC collected customer service satisfaction ratings on the NCC.  The NCC was rated above average compared to other Federal agencies and to service industries in the private sector.

The statement here in the Executive Summary appears to downplay the real success the NCC has achieved in customer satisfaction.  The Report (page 37) states that "[c]ustomers indicate a positive experience [with] the NCC."  Indeed, the report contains significant indications that customer satisfaction is extremely high.  A customer satisfaction survey of all NCC contacts yielded a Customer Satisfaction Index rating of 77.  The Index of 77 was for all types of contacts -- phone, e-mail and fax.  The report indicates that Federal Government scores on this index are in the high 60's and low 70's.  Thus, an Index of 77 for all types of contacts, in the first year of operation, is extremely strong.  The Report indicates (page 40) "[t]here are a few low 80's but they are rare."  However, the Report does not state, and we believe it should, that the Survey yielded an Index of 80 for all calls received and in rating the NCC staff, the Survey yielded an Index of 84.  With a score in the low 80's so rare, an Index of 84 for the service provided by the NCC employees is phenomenal.  We question why those results were omitted from the report.  Those results belie the finding in the Report (page 38) that a customer's experience depends on the NCC employee reached.  An Index of 84, which is rare, indicates otherwise. 

The Report comments on the "soft skills" of NCC employees (page 40) indicating that they are not consistent.  The Report does not, though, explain the qualifications and training of the NCC employees.  An employee of the NCC must have at least one year of customer service experience and at least six months of contact center experience.  Many of the employees are hired from other call center positions within the contractor's workforce.  As a result, the employees already have experience and training, especially in the soft skills.  Moreover, as part of the contract, all NCC employees are given an introductory two-week training session on substantive issues and soft skills.  Refresher training has been provided monthly, on average, since the introductory training as well as whenever scripts are added or revised. 

Additionally, calls received at the NCC may be monitored by an NCC manager or by an EEOC manager.  In addition, some calls are recorded and reviewed by NCC or EEOC managers.  Each week, NCC and EEOC managers review the monitored and recorded calls at a "calibration" meeting.  That meeting reviews both the substance of the response and the soft skills demonstrated by the NCC employees, as well as the individual monitoring by NCC and EEOC managers.  The results of these calibration sessions are shared with the CSRs and serve as an indicator of when additional training is needed. 

The experience requirements, the training and the calibration sessions all contribute to the Customer Satisfaction Index of 84 received by NCC employees.  That score indicates an extremely high, and rare, quality of service by NCC employees.  It is difficult to justify, then, the statement in the Report (page 40) that the soft skills of NCC employees are inconsistent. 


NOTE:  As discussed throughout this response, we do not believe the recommendations in the report are supportable.  We also believe that the recommendations need to be significantly revised based upon the information and clarifications provided throughout this response.  Here we provide comment on those recommendations that have not been addressed elsewhere in this response; recommendations made in the report but not reflected in this section have been addressed in our comments on other portions of the report.

Change or Discontinue

2. Significant changes should be made to the operating model.  The citizen contact process should be standardized across the EEOC and NCC. A "Steering Committee" should be set up with representation from the field, headquarters, and the NCC to develop a single intake process flow that begins with the initial contact at the NCC and moves on to the EEOC after meeting certain criteria. Prior to implementing this process across all EEOC offices, it should be piloted with a few different types of offices to identify and close potential gaps.

The recommendations to establish a "steering committee" of EEOC and NCC staff to standardize initial contact and intake processes and a "core group" to establish metrics to measure desired outcomes are good ones. In fact, OFP established an Intake Services Work Group in October 2005, consisting of representatives from the field and headquarters and headed by a field deputy district director.  The charter for this workgroup was to collect data on the various intake procedures and customer service practices across the country and to make a set of recommendations to improve and standardize the quality of customer service and intake work products.  The Intake Work Group is expected to issue its report in the near future; however, adding representatives from the NCC to this group would be a natural expansion and allow for the development of a "seamless" process that has the understanding and buy-in of all stakeholders.  Adding the development of metrics to the portfolio of this group also would be a natural connection and would avoid duplication of effort that creating a separate group might cause.

Change or Discontinue

3. The NCC and EEOC should use the same or integrated technologies to capture and maintain customer information.  This would promote communication between the two organizations and enable a better customer experience.  Customers should not perceive a difference between the NCC and the EEOC when inquiring about issues.  In addition, we recommend establishing a process for EEOC and NCC staff to communicate directly with one another.  The purpose would be to share knowledge and information and to provide a vehicle to regularly ask questions and provide feedback.

We agree.  Complete integration of all of EEOC's data systems has been a long-time agency goal.  Integration of IMS and EAS is planned for later this calendar year.

NCC Recommendations

5. CSRs should receive training on soft skills and call management process and skills. We also recommend that CSRs receive training to improve the quality and quantity of the information provided. When information is sent from the NCC to the EEOC, it should provide value. To maintain high quality skills, we recommend the NCC provide regular feedback and mentoring to its CSRs.

CSRs do receive formal classroom training on soft skills and engage in role-playing with other CSRs and supervisors to practice their skills.  Monitoring is done daily, with each CSR monitored at least four times a month.  Feedback is given daily to the CSRs.  In addition, monthly refresher training is conducted for groups of CSRs, and when needed, NCC provides individualized training.  The recommendation ignores the inherent value to EEOC and to those we serve when NCC staff is available to receive calls and relay to EEOC field offices information that may serve as the basis for a timely charge. 


Page 2:  Lacking historical data, we included some items in the electronic survey sent to all employees located in the field asking them to estimate the average number of calls they received pre and post implementation of the NCC. However, these data are only as reliable as the survey respondents' memory.

This is a very important point.  Much of the data used in this report, particularly that which is critical of the NCC, are not hard data but are based on individual perceptions and opinion.  The reviewer did not ask that documents supporting the perception or opinion be submitted, but relied on the perceptions as if they were factual data.  Furthermore, as cited earlier, it was the contractor's decision not to replicate the March 2003 study for comparison purposes.  While the information on staff perceptions is important, perceptions are not the most accurate gauge of NCC performance; rather, they are indicators of the pressing need for change management and internal communications improvement.


Page 9:

2.  Pearson contractual metrics implemented except for Customer Service. 

 . . . Pearson is meeting nearly all their data gathering requirements. The one exception is customer satisfaction. The EEOC contracted with a separate party to conduct a customer satisfaction survey.

The data gathering "exception" is required by the contract, which calls for an independent customer satisfaction survey.  Therefore, this is not a deficiency by the contractor.  While EEOC did contract for this separate survey as part of the pilot project, EEOC has initiated the lengthy process for obtaining approval from the Office of Management and Budget for an automated, continuing customer satisfaction survey which will be overseen by Pearson's Quality Manager.

Page 9:  As the table shows, we believe the data related to information accuracy are subjective because they depend upon the CSR who takes the information.

As the report acknowledges, the information accuracy is captured through monitoring sessions.  The data to assess and score the accuracy of the information provided and the accuracy of information captured are not subjective.  The contractual metrics for "accuracy of information provided" and "accuracy of information captured" are determined by monitoring each CSR at least four times monthly.  NCC has very strict measures in place when monitoring, and the NCC does not pass a CSR who has not captured and verified contact information.  In fact, NCC will fail a CSR who gets two transactions wrong (this includes verifying).  The NCC and EEOC (three employees with field office investigative experience) hold weekly calibration sessions to ensure consistency and accuracy in monitoring.  Further, in daily monitoring sessions conducted by EEOC, the accuracy of the information is objectively assessed, with a brief explanation of any errors or omissions in the call.  These reports are sent daily to the NCC by the Project Manager.  Feedback from the monitoring and calibration sessions is used to conduct refresher training, counsel CSRs, create performance improvement plans when necessary, and in general assist CSRs in providing accurate information to the public as approved by EEOC.  The example provided only shows that some CSRs needed to be corrected and others are doing their job as expected of them.  If the CSR does not take or provide the correct information, then it is not accurate, and there is nothing subjective in the process.

The evaluators should provide the scores achieved on these two metrics.  Also, they should have included data on actual NCC performance on the metrics, compared to the targets set in the contract.  Finally, all the reports referenced in Table 2 on page 9 are provided weekly, not monthly as the chart reflects, and this has been the case since March 28, 2005.

Page 11: Table 4:  Six Month Processing Estimates in Pearson Contract Compared to Actual Statistics, April through September 2005

Estimates in Pearson Contract for English IVR - 719,554 -

Actual:  For some months these data are not reported.

IVR call counts are available for the entire contract period.  Detail data were unavailable for only two periods in 2005, due to a software malfunction: 

  • July 2005: 3 days (19th, 20th and 21st)
  • August 2005: 3 weeks (weeks of 8/7, 8/14 and 8/21)

It is also important to note that counts for these periods were not lost, only the detail was not available.  It is possible to approximate the data for the period when the software malfunctioned through an extrapolation of the data that was collected.

Page 12: 

1. Written and telephone communications to headquarters decreased in Fiscal Year 2005, but there are no data indicating it is attributed to the NCC.

While there has been a trend of decreasing correspondence to the Chair, the information for the past 15 months is mixed.  Comparing the first six months of Fiscal Year 2005 (pre-NCC) to the period April through September 2005 (post-NCC), there was an eight percent increase.  However, comparing the October through December quarter pre (FY 2005) and post (FY 2006) NCC, there was a 30 percent reduction in correspondence.  There is no clear reason for the changes.  It is possible that complaints have reduced because the NCC is providing better customer service, but it also may be that the EEOC website is enabling people to more readily contact the field office rather than going through headquarters.  

Data related to telephone calls to the public EEOC telephone number (202/663-4900) over the past five calendar years have been kept following different assumptions and are not comparable.

Evaluators should acknowledge that the decrease could be, indeed, attributable to the NCC.  The evaluators suggest, with no supporting evidence or data, that the reduction in calls and correspondence to HQ could be a result of people visiting the EEOC website and then calling or writing the field office directly.  The field office contact information was available on the website long before the NCC came into existence.  The EEOC public website went up in early 1997 and has had detailed information on contacting the field offices since that time.  Therefore, it seems highly unlikely that the website alone accounts for the reduction in complaints, particularly given its existence in the pre-NCC era that is under comparison.  A more probable conclusion is that the NCC has assisted people in contacting field offices when there were problems with the local phones.  There have been many instances of persons calling field offices and, after not being able to get through, then calling the NCC, which was able to alert the offices that callers were having difficulty reaching them.  With respect to the telephone call statistics, the evaluators failed to include information on phone inquiries from the EEOC's Integrated Mission System (IMS) that would have shown a drop in phone inquiries to the field offices:

Phone Inquiries in IMS
2004 - 50,892
2005 - 40,464

Furthermore, the evaluators failed to report that OCLA put a message on its greeting for its primary public phone line (202/663-4900) that referred callers to the toll-free number at the NCC.  Callers to the OCLA number hear the information in the greeting if the phone is not answered live by OCLA staff.

Page 13:

3.  NCC has had minimal direct impact on publications requested.
For the calendar year 2005, the Publication Distribution Center received 20,556 requests for EEOC publications, of which 342 (1.6 percent) were email requests from the NCC. The highest proportion of requests (69.2 percent) comes directly from the website order form, bypassing the NCC.

This observation should be eliminated.  The NCC contract did not include publications fulfillment and we never advertised the NCC as a source for publications.  We have a contractual option to add the publications fulfillment function, but the option has not been exercised.  The e-mail requests received by the NCC were incidental.  The NCC refers callers with internet access directly to the EEOC's clearinghouse website.  For those without internet access, the NCC takes and forwards the requests to the Distribution Center.

Page 14:

4.  There is no evidence that annual or monthly charge statistics have changed because of the NCC.

The 2003 NCC Assessment Report stated,

Some staff have expressed fears that establishing a national contact center would lead to loss of jobs and/or a skyrocketing of receipts.  . . . it is possible EEOC receipts may rise with the implementation of a national contact center.  Should receipts go up, more staff would be needed rather than fewer.  Starting up as a pilot phase, however, would enable quality assurance procedures to be put in place to monitor the impact on workload and allow the agency to better forecast staffing needs. 

The negative characterization of this conclusion, without any basis in the Report or the Commission vote, is a further example of assumptions and opinions used by evaluators without any basis in fact. 

Page 15: 

5.  While outreach efforts have increased, there is only anecdotal indication of any impact attributable to the NCC.

While it was expected that the NCC could answer inquiries about EEOC outreach and Revolving Fund events, there was no specific goal or intention for the NCC to directly impact outreach efforts across the EEOC.

Page 16:

6.  There may be insufficient awareness of the NCC, but efforts are underway to broaden publicity of the toll free number.

Specific to the NCC, a few employees commented that many of the EEOC's clients do not know about the toll-free number.

Perhaps the "few employees" are unaware that the toll-free number used by the NCC is the same toll-free number EEOC has had for years.  This toll-free number is prominently displayed in all EEOC publications and on all EEOC posters required for display in American workplaces.

Page 17:

1.  Field employees have mixed perceptions of the impact of the NCC on call volume to their offices.

The electronic survey asked employees to compare the number of calls they received in an average week pre and post implementation of the NCC.  After removing outliers (people who reported answering more than 1,000 calls in one week), results show relatively larger decreases in number of calls answered in Pay Grades 4 through 7.  There is also a decrease of calls in Pay Grade 12.  We further investigated the perceived increase in calls reported in Pay Grades 14 and 15, and discovered that of the 85 respondents, only 11 people reported increases, 7 of whom indicated from 1 to 3 increased calls in a week.  The remaining four respondents appear to be outliers.  Comments from many of the field employees indicated they perceived no reduction in calls because the NCC is telling people to call offices directly. 

Table 9 on page 18 shows a mean decrease of between 20 and 28 calls per week for each employee in Grade GS-4 through GS-7.  That mean excludes outliers and is characterized in the report as "relatively large decreases in number of calls."  Such a decrease is also reported at the GS-12 level, which is the level where the great majority of EEOC field investigators are employed.  These decreases are significant. 

The Executive Summary, though, states that "there has been only a minimum reduction in calls to most offices."  We believe that the Summary should be corrected to reflect the discussion and the factual information contained in the Report.  The data indicate a significant reduction in calls, but instead, the perceptions of field employees, which vary from the data, are being reported as a finding and conclusion.  On February 4, 2006, there were 697 investigators on board, of whom 644 (92.4%) were in Grade GS-12.  In the March 2003 survey, Grade 12 employees answered 32% of all calls, the largest volume at any grade level, so any reduction at this level should be considered both significant and a success of the NCC. 

Since the factual information for employees in grades up to GS-13 excludes data from "outliers," those who reported an implausibly high number of calls, the table should be corrected to similarly exclude the outliers at the GS-14 and GS-15 grade levels.

Additionally, the evaluators missed the opportunity to review IMS data on phone inquiries, which, while not totally inclusive, gives an idea of the trend in call volume.  (See response to the page 12 item above.)  Also, the evaluators should have replicated the March 2003 telephone volume study to get an accurate view of call volume reduction or increase.  Perceptions of impact will always vary tremendously across offices because workload and staffing varies by office depending on population density, demographics and staff retention rates. Perceptions will also vary depending upon whether the office is redirecting first time callers to the NCC through its voice mail message.

Page 19:  Call Resolution-

Our call monitoring data analysis shows that the NCC is successful in resolving 50 percent of the calls (these calls are not forwarded or sent to the EEOC offices).  These calls relate mostly to general inquiries or information about other Federal agencies.  The other 50 percent of the calls are either referred (the caller is asked to contact EEOC office) or is forwarded via an EASQ to the EEOC office.

The data from March 21, 2005 through March 31, 2006 show that 89,813 calls were forwarded to EEOC field offices.  Those referrals are 29.7% of all calls handled by NCC staff.  Of the referrals made to EEOC field offices, the NCC completed an EASQ and sent it to the field office (26,250), took the customer's information and forwarded it by e-mail to the field office so that the office could contact the customer (17,898), referred the caller to the appropriate field office (44,667), or made a hot-line transfer of the call (998) depending upon which option the caller chose.

March 21, 2005 through March 31, 2006

IVR handled calls: 99,761[42]
Calls Handled by CSRs (English & Spanish): 276,662
TTY: 3,031
Correspondence - Written: 723
Correspondence - Email: 17,239
Correspondence - Web: 4,212
Correspondence - Fax: 755
Total CSR Handled Transactions: 302,622
Hits on Frequently Asked Questions: 118,332

Page 20:

Office employees indicated that they still receive many calls to get case status.

When a charge is filed, the Charging Party (CP) is usually provided with the investigator's phone number.  It is natural for a CP to call his/her investigator to request case status and not to call the NCC.  Further, some CPs may want more detailed information than that which is available to NCC staff through the IMS.

Page 21:

There is occasional informal communication between some office management employees and the NCC Project Manager at headquarters.  The NCC Project Manager briefs district directors and field personnel who come to headquarters.  There is no formal process for EEOC employees to ask questions directly of CSRs or give feedback on how the NCC can better serve the offices.

There is more than occasional communication between field office employees and the NCC Project Manager.  The field employees who are involved in processing EAS questionnaires and e-mails received from the NCC communicate regularly by phone and e-mail with Headquarters staff providing comments and suggestions.  Further, staff have been encouraged to report any problems to the Project Manager or one of the Program Analysts who works on the NCC project.  Misdirected EASQs, misspellings, or any other problems are reported to Headquarters and then discussed with NCC managers who counsel CSRs who have made the errors.

Page 21:

...Investigators would like to communicate directly with the CSRs rather than through the formal system (the GroupWise Email) because some issues are low priority and the established system gives everything high priority.

The default setting of "High Priority" was removed December 16, 2005. 

Page 21:  NCC Staff Understanding of EEOC Objectives and Operations

Interviews with NCC Team Leaders and focus groups with CSRs revealed that there is no common understanding of the program's objectives.  One Team Leader stated that the mission of the NCC is to "provide a general overview about the EEOC, information on discrimination, and forward escalations," while another Team Leader stated that it is to "handle 70 percent of the calls that the EEOC was receiving and document the types of calls received for trending purposes." While these two mission statements seem similar, their focus is completely different: the first one focuses on providing general information and forwarding escalations whereas the second one focuses on providing information and documenting the types of calls.

The two Team Leaders have different roles.  One serves as the Content Manager; the other is responsible for reporting.  As a consequence, their areas of focus differ, and they responded accordingly.  Both responses are correct, though neither includes all of the program objectives.  This does not mean that there is no common understanding of the program's objectives.

Page 22:

6.  EEOC and NCC technologies are incompatible, which adds to the workload of EEOC offices

There is no integration of technologies between the NCC and EEOC.  The NCC uses RightNow software (Customer Relationship Management (CRM) technology) to document all information related to calls. If the caller wishes to file an EASQ, the CSR must transfer the information to the EEOC Assessment System (EAS).  The NCC has "read only" access to the EEOC's IMS.  However, since the IMS is not integrated with either the RightNow software or the EAS, EEOC staff must separately enter information from the EASQ into the IMS.

We agree.  Complete integration of all EEOC's data systems has been a long-time agency goal.  Integration of IMS and EAS is planned for later this calendar year.

Pages 22-23: 

6.  EEOC and NCC technologies are incompatible, which adds to the workload of EEOC offices

Not only are the technologies incompatible, but also the NCC staff seems to have a lack of training on available software.  For example, during the site visit to the NCC, the CSRs indicated they require a charge number to look up claim status on the IMS. If the caller does not have a charge number, they refer him/her to the EEOC offices.  However, communication with the staff at headquarters indicated that the NCC has exactly the same access to the IMS in terms of looking up charges as the EEOC offices, i.e., they can look up charges several different ways including by Charging Party (CP) first and last names, Social Security Number, receiving office, and Investigator name.

This representation is incorrect and does not reflect a lack of training.  These procedures ensure information security.  To avoid confidentiality problems, NCC staff have been instructed to provide status information only when the individual provides a charge number and other identifying information.

Page 26:  Creating More Work

While 33 percent of the survey respondents said that PCPs they have talked with did not think they had already filed a charge with the NCC, 42 percent indicated that relatively many PCPs they talked with had the impression they filed a charge with the NCC.

There are three places in the scripts where callers are told they cannot file a charge over the phone.  The first is after it appears that the caller's concerns are covered by EEOC, and they are advised, "You cannot file a charge over the phone but we can begin the process."  Then the various options are explained.  If the caller elects to go through the EAS system, the script reads "Our conversation today does not mean that you have filed a charge as one cannot file charges over the phone."  At the end of the EAS process, or if the caller chooses to write a letter to the field office or to have the office contact him/her, the caller is told "Remember you have not filed a charge until you have signed a sworn document with an EEOC field office."  Furthermore, this same phenomenon occurs when callers contact field offices directly; many believe they have filed a charge when they have only initiated an inquiry.

Page 26: 

In addition, a few survey respondents reported that PCPs have said the NCC told them that they have a "good case" and upon further discussions, the EEOC office determined that the case was unlikely or that there was no case at all.

While NCC staff are encouraged to be empathetic and sympathetic, they are trained and instructed not to comment on the merits of a "case". Experience has shown that some customers hear what they want to hear in their dealings with both EEOC field offices and the NCC.  EEOC routinely monitors 15-20 live calls a week.  Of all the calls monitored, there has only been one where the CSR said the caller seemed to have a "good case".  The NCC site manager was immediately notified and the CSR was counseled.

Page 27: NCC Screening

The field survey respondents indicated that the NCC is screening out many non-jurisdictional cases.  The problem, however, is screening related to the EEOC bases of discrimination.  Forty percent of the respondents involved in intake indicated that relatively many of the EASQs they receive should have been screened out.  This number may be high because survey respondents are indicating that the basis on the EASQ was incorrect (one office indicated that it looks like the CSRs are randomly picking a basis).  In support of these survey responses, we observed during our NCC site visit that the CSRs do not probe the caller to find the right basis for discrimination. Instead, the CSRs read the list and let the caller pick the basis.

Page 29: Incomplete EASQs

Complaint information is also incomplete, which has led office intake staff and supervisors to suggest that CSRs need to probe to get information that is more descriptive. The reason for discrimination is not clear on the forms and there is less information than was expected. For example, if the form indicates the basis is pregnancy, it does not indicate why it was discriminatory. Many survey respondents indicated that the CSRs need to do a better job screening calls (e.g. more in-depth questioning and exploring complaints). Survey respondents indicated they would particularly like to receive more information relevant to the basis for discrimination.

Page 34:  Information Forwarded to EEOC

 For the information about callers that is forwarded to the EEOC through the EASQs, as described above, interviews with the EEOC staff revealed that the information sent is not very useful.  Most of this information also has to be collected again by Investigators.

In all three of these findings, the report reflects a clear misunderstanding of the role of the NCC.  Neither the NCC nor the EAS was designed to supplant the role of the investigator in the intake of charges.  Charges are drafted by Investigators or Investigative Support Assistants after in-depth interviews.  In the discussions involving field and headquarters employees, both managers and non-managers, prior to the Commission decision to approve a two-year NCC pilot, there was much concern about whether NCC employees would take over the role of EEOC investigators in asking probing questions and screening out persons who might have viable claims of discrimination.  The role of the NCC was clearly articulated in the presentation at the Commission meeting held on September 17, 2004, when the Commission approved the contract to develop the NCC:

And lastly, although this may not be the last of the issues, but the last I'll discuss today, is inherently governmental functions; whether or not what the contact center staff will be performing are inherently governmental functions. In fact, the contact center employees will not counsel or screen out potential charge filers regarding whether or not they have a charge of discrimination. They will be screening in inquirers in respect to providing them access to the Commission. Neither will they be expected to answer questions related to complex issues of employment discrimination. Customer service representatives will be trained to recognize more complex factual inquiries, and refer those, seeking an interpretation of the law or a set of facts to our field offices. The customer service representatives will describe the laws the EEOC enforces, serve as a clearinghouse for information on employee rights and employer responsibilities, provide information on how the investigative process and mediation work, give the location and telephone numbers of our field offices and the hours of operation, provide case status information, and file disclosure information, provide referral information for other agencies, and provide responses to requests for training and education. Most calls to the EEOC cover these subjects. However, if a caller wants to discuss a potential charge of discrimination, the customer service representative will collect the information, record the pertinent information using a web-based inquiry assessment tool, and forward the information to the appropriate EEOC office for follow-up. We also note that our determination that the contact center representatives will not be performing inherently governmental work is shared by many other agencies with public information centers who have contact centers staffed or operated by contractors.

Additionally, the report does not appear to display a clear understanding of the underlying fundamentals of the EEOC Assessment System (EAS).  The EAS is an EEOC-developed, web-based e-government application that walks the user through a series of questions to determine whether the EEOC is the most appropriate agency to provide assistance and, if appropriate, allows the user to electronically submit information related to his or her potential charge of employment discrimination to the proper EEOC field office for follow-up.  It is an independent, stand-alone system that will be made available to the public in calendar year 2006.  When the NCC opened, it was decided that the NCC would pilot the EAS.  Based upon experience to date, changes have been made to capture information requested by the field.

The purpose of the EAS is to help determine whether a potential charge meets high-level jurisdictional requirements prior to submission to a field office.  The primary goal is to filter out those inquiries that are clearly outside the jurisdiction of the EEOC, thereby reducing the staff resources needed to respond to them.  The EAS was purposely designed to require that an Intake officer follow up with the potential charging party to review the allegations and circumstances and make the final decisions and recommendations related to EEOC jurisdiction.  The system errs on the side of caution, allowing field staff to make the final determination; for example, if the potential charging party is uncertain about the number of employees in the respondent organization, the system assumes that the allegation "passes" this test so that the intake officer can discuss and clarify the point with the user directly.  A primary concern was that the EAS not discourage the potential filing of a charge based on unclear user responses or complex jurisdictional issues. 

Likewise, the e-Questionnaire component of the EAS was purposely designed to provide basic information about the inquiry to a field office, so that the office would have the primary information required to prepare for a follow-up interview.  Because the EAS is an e-government application that will ultimately be available for public use over the Internet, the questionnaire needed to be simple and short.  Again, it was determined that detailed information was best gathered during the formal interview process.  The questionnaire forwards information from the assessment and provides basic contact information, along with text boxes to describe what happened (asking who, what, when, and where) and why the individual feels the action(s) were discriminatory.  The EAS design workgroup, which was comprised of both headquarters and field staff, felt that the information submitted in the questionnaire was sufficient to allow intake staff to properly prepare for the required follow-up interview.  Another consideration was a very strong political concern that the EAS not be considered a formal mechanism for filing a charge of employment discrimination, as this might be seen to reduce the role of the intake investigator.  The system should be viewed as a mechanism for submitting an inquiry to the EEOC for follow-up, not as a mechanism for making final jurisdictional determinations and filing a charge.  That is the role of the investigator.

Page 32: 

4.  Inconsistent with policy, the "hot line" transfers of callers from the NCC to the appropriate office are often not within the period of 30 days prior to the expiration of the SOL.

CSRs know they should transfer callers to the EEOC office when they are within 30 days prior to the SOL expiring.  It is possible that calls are forwarded in error due to lack of standardized written procedures at the NCC.  Managers in one EEOC office said that transferring these calls outside the designated window can create unrealistic expectations on the part of the caller and creates extra work for the office because employees must stop everything to address an issue that is often not truly urgent.

There are standardized written procedures; in addition, the EAS has a built-in mechanism for triggering hot-line calls.  If the caller is calling within a 60-day window, 30 days on either side of the projected deadline for filing a charge, the EAS generates a message to contact an EEOC field office immediately and the NCC initiates a hot-line transfer.  The field office, not the NCC, can then determine whether filing a charge would be timely.  In order to preserve the rights of individuals calling the NCC, it is better for the NCC to err on the side of referring too many calls rather than too few.

Page 34:  Information Forwarded to the EEOC

Our call monitoring analysis shows that of the 50 percent of calls that are sent to EEOC, 62 percent (31 percent of all calls) are related to 'filing a claim' issues, which could have been forwarded by the EASQ.

Page 39:  Some CSRs lead customers to fill out the EASQs while others lead customers to contact the EEOC office.  Only 25 percent of the customers who called to file a charge actually completed the EASQ.  The remaining 75 percent were asked to contact the EEOC office directly.

The NCC data show that 89,813, or 29.7 percent, of the 302,622 transactions handled by NCC staff between March 21, 2005, and March 31, 2006, were sent to EEOC field offices.  When the NCC opened, a caller whose concerns appeared to be within the EEOC's jurisdiction and who wished to file a charge was given four options:  (1) send a signed letter to the appropriate field office with all the necessary information for a minimally sufficient charge; (2) go through the E-Assessment System; (3) have NCC staff take the information and forward it to the appropriate field office; or (4) receive from the CSR information on contacting the field office, including its intake procedures and hours of operation, and then call or visit the office as appropriate.  As more offices began to direct first-time callers to the NCC, the CSRs encouraged more callers to go through the EAS.  The table below shows the increasing use of EASQs.

Month           EASQs
April 2005      1,979
May 2005        1,676
June 2005       1,923
July 2005       1,746
August 2005     1,950
September 2005  1,578
October 2005    1,610
November 2005   1,788
December 2005   1,843
January 2006    2,668
February 2006   2,472
March 2006      3,829
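The upward trend can be confirmed directly from the table's own figures; the following is a simple illustrative calculation (not part of the original report) comparing the first and last three months:

```python
# Monthly EASQ counts taken from the table above.
counts = [1979, 1676, 1923,   # Apr-Jun 2005
          1746, 1950, 1578,   # Jul-Sep 2005
          1610, 1788, 1843,   # Oct-Dec 2005
          2668, 2472, 3829]   # Jan-Mar 2006

first_avg = sum(counts[:3]) / 3    # average for Apr-Jun 2005
last_avg = sum(counts[-3:]) / 3    # average for Jan-Mar 2006
pct_increase = 100 * (last_avg - first_avg) / first_avg

print(f"Apr-Jun 2005 average: {first_avg:.0f}")
print(f"Jan-Mar 2006 average: {last_avg:.0f}")
print(f"Increase: {pct_increase:.0f}%")
```

By this measure, monthly EASQ volume in January-March 2006 (about 2,990 per month) ran roughly 61 percent above the April-June 2005 level (about 1,859 per month).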

Page 35: Change Management

Finally, some Investigators do not understand why NCC cases get higher priority than other customers and believe that giving the NCC cases special treatment communicates the wrong message since "all calls are important not only those to the NCC."

We agree that all calls are important and think that the timeframes established for responding to NCC inquiries also should be the goal for inquiries received directly by the field offices.

Page 36: Training

Survey respondents indicated they believe that the CSRs need more training.  They recommended topics including handling irate customers, EEOC jurisdiction (including application and interpretation of the laws the EEOC enforces), repositioning, antidiscrimination laws, identifying issues and basis for discrimination, Title VII and other statutes, and Federal sector cases.

NCC staff receive two weeks of classroom training plus on-the-job training on all of the above.  In addition, they receive periodic refresher training.  When scripts are revised or new scripts are developed, these are discussed with and explained to the staff.  The existing training program resulted in an ACSI index of 84 for the NCC staff.

Page 38  Complaints about Offices

Many employees in the field believe that not having access to a toll-free number has created a barrier between EEOC and the CPs because of the long-distance expenses.

All callers are able to contact field offices by making collect calls.  All field offices are aware that they can accept collect calls to ascertain the caller's information and have a staff member return the call to the CP so that there is no long-distance fee incurred by the caller.  In addition, OFP is exploring the feasibility of establishing district-specific toll-free numbers for customer contact to defray the cost of long distance calls. 

Page 38  Access

The NCC has two Spanish speaking CSRs; therefore, Spanish callers are usually guaranteed the ability to communicate in Spanish.

As of April 15, 2006, the NCC has eight Spanish-speaking CSRs.

Page 39: Customer Experience  (paragraph 3)

Customers who do not complete the EASQ have to contact the EEOC offices directly. Their experience would be no different than it would have been pre-NCC, e.g., calls not answered by a live person, mailboxes may be full, etc.  There are no standard procedures to handle calls related to taking information related to a charge.

Over 70% of the calls to the NCC are handled entirely by the NCC, not referred to an EEOC field office.  Those customers have a different experience than they would pre-NCC, receiving quick and accurate information or referrals.  Callers are given three options, as discussed earlier, for contacting EEOC field offices.  Further, as the NCC handles more calls, fewer calls are made to EEOC field offices, providing a greater likelihood that the call will be answered.

Page 40: 

6.  Information about general overviews and inquiries is fairly consistent and accurate; however, other specific information is not consistent or accurate.

CSRs have 800 scripts that they reference.

 Page 49

6.  The NCC should adjust their internal processes to achieve better efficiency, consistency, and quality of service.  Part of this involves better use of available technologies.

We reject the notion that the NCC should expect CSRs to be proficient users of 800 scripts in communicating with the public.

The report uses a highly inflated and inaccurate number of scripts.  The NCC uses 119 internal scripts (in both English and Spanish), 163 FAQs, and 121 reference database IDs in the RightNow tool.  The reference database has contact and coverage information for EEOC field offices and Fair Employment Practices Agencies (FEPAs).  In addition, CSRs have internal web access to information about other federal agencies, their programs, and EEO counselor contact information.  This is a very manageable volume of reference materials, given the electronic mechanisms in place to facilitate quick access and retrieval.

Page 40: 

7.  The Customer Satisfaction Index is above average when compared to other Federal agencies or the service industry. However, due to the lack of baseline data we cannot assess whether this is an improvement from pre-NCC.

The customer satisfaction survey was conducted to establish baseline data for the NCC.  The report characterizes an ACSI in the low 80's as rare.  The ACSI index for NCC staff was 84.

Page 41: 

Scheduling--Blue Pumpkin/Accent

The NCC used 'Blue Pumpkin' until recently for scheduling purposes and is in the process of converting to 'Accent.'

The NCC switched from 'Blue Pumpkin' to 'Aspect,' not 'Accent.'

Page 41:

Scheduling--Blue Pumpkin/Accent

Also, according to CSR focus groups, the call volume is heavy in the mornings and light between 4 and 6 pm; however, the number of CSRs is constant throughout the day.

This finding is incorrect.  As does any credible contact center, the NCC adjusts the number of CSRs based on interval volume.  Below is a CMS Interval report that the NCC commonly uses.  The particular column to note is the one labeled "Avg Pos Staff."  The figures in this column indicate that NCC staffing is heavier during peak times and lighter during the early and late hours, corresponding to call volume.

[Image: image009 - CMS Interval Report]

Page 41:


The NCC uses Deltek for timekeeping purposes.  However, the process in place requires a significant amount of manual entry and is recorded in three different places: an Excel spreadsheet, the timekeeping system, and paper copies.  There was no valid reason provided for these manual activities.  Aux codes (codes generally used on switches to track CSR time) are not used, resulting in tracking CSR times manually. 

NCC timekeeping procedures require CSRs to enter their time into the Deltek timekeeping system and to turn in a paper copy to their supervisor on a daily basis.  The paper copy is reviewed by management each morning to ensure that the time was recorded accurately (including tasks charged, staff time, leave reasons, etc.).  The internal spreadsheet is used to track absence reasons (i.e., sick, recurring, approved, unapproved, vacation, etc.).  Aux codes are used for correspondence and TTY tasks, which could be considered "off phone" tasks. 

Page 42:  Call and Contact Management/General Reporting - Avaya and RightNow

Switch Reports - Daily, weekly, and monthly reports on call attributes like speed of answer, abandoned calls, etc.  These metrics are reported to the EEOC on a monthly basis.  In a typical call center, interval reports (call metrics for every half hour) are produced and used to manage the floor. These reports were not available for this program.

Page 42:  Call and Contact Management/General Reporting - Avaya and RightNow

CRM Reports from RightNow - These reports show contact by State, channel, action taken, topic, etc.  These reports are sent to the EEOC on a monthly basis.

Switch and CRM reports are provided to EEOC on a weekly and monthly basis.  Interval reports are used daily and are an absolute necessity in managing the floor.

Page 42:  Agent Performance Management -- No Particular Tool

Agent performance management is not standardized.  The Site Director said that the team leaders and the staff members have meetings once a week to discuss monitoring scores, changes, and general questions.  However, the CSRs denied this.  The CSRs said they do not have any one-on-one coaching sessions.  Their coaching is more like refresher training.

This finding is incorrect.  The Site Manager stated that the Lead Staff meet a couple of times a week to discuss any open issues, including quality, new content, employee issues, etc.  Coaching takes place daily and is conducted by the Lead Staff and the Tier 2 group.  The NCC believes the auditors misunderstood and thought that all employees are brought into a meeting once per week to discuss quality.  Quality and monitoring are discussed on a one-on-one basis.  Based on monitoring results, CSRs are provided with one-on-one coaching.  Refresher training classes take place as well.

Page 42:Agent Performance Management -- No Particular Tool

A reporting relationship between a team leader and a CSR does not exist. CSRs can approach any team leader with any questions.  The NCC does not use any agent scorecards or other performance metrics to review the performance of the CSRs.  The only type of measurement used is through call monitoring, which the CSRs say they receive once every two months.

While CSRs can approach any member of the Lead Staff, the Site Manager provided the audit team with a map document that identifies to whom each CSR reports.  Each monitoring session results in a completed monitoring scorecard.  CSRs are also provided with their talk times and ACW measurements several times per month.  If there are issues with a particular CSR, more attention is given, along with coaching and a review of statistics.


Page 50:  EEOC Recommendations

8.  Develop methods to increase awareness of the NCC.

In addition, the planned advertising campaign is important to increase awareness of the toll free number.  Another possibility is to modify the EEOC's home page to clearly display the importance and relevance of the NCC to customers.  Presently it is listed under the banner and under the link for "Contact Us."  To encourage initial customers to contact the NCC, they need to be shown the benefits of contacting the NCC before calling the EEOC offices.

This action was included in the marketing plan which was provided to the evaluators.

Page 51  EEOC Recommendations
9.  Implement Change Management Practices

The existing NCC Newsletter might become an EEOC/NCC newsletter with information from both organizations.

This has been the intent from the beginning.  The Newsletter is a joint EEOC/NCC effort to show the role both field office and NCC employees play in serving the public.

Page 54:  Methodology Details (paragraph 7)

Electronic Survey
The day after the survey was launched, the EEOC sent a National Contact Center Newsletter to all employees.  One week after the survey was launched (and coincident with the first reminder emails) the EEOC sent a report describing NCC activities for the month of January.  We raise the possibility that sending this information could have been perceived as an effort to influence survey responses and therefore negatively influenced our response rate and/or the responses.

Had the survey been launched as originally scheduled, there would not have been the possibility of such a perception.

Page 56:  Appendix C. Trend Information Desired by EEOC Offices

The following types of trend information listed in Appendix C can be generated by EEOC field offices:

  • Listing by month of EASQs the EEOC offices receive
  • Number of EASQs that actually become charges (provided that they have been properly coded in IMS)

The following types of trend information are now provided:

  • Number of calls screened out by the NCC phone tree and number screened out by the CSRs
  • Number of calls not referred to EEOC Offices (screened out)

The following trend information is now provided for EEOC as a whole but can be provided on an office basis:

  • Overall number of calls forwarded to each EEOC office

Demographic trend reports are being developed.  Much of the information Directors indicate they would like to receive is operational information about the NCC that the EEOC Project Manager is disseminating in monthly reports.

OFP - Metrics

Sent: Monday, May 01, 2006 7:44 PM
Subject: Re: Draft Rpt NCC Evaluation 4/7/06: Comments Requested

Attachments: EEOC NCC Metrics.doc

Aletha, Milt and Larkin, 

In my memorandum to you of April 21, 2006, I commented on the preliminary draft of the NCC evaluation.  On page 11 of my comments, I state that the evaluation of the NCC should have been made against the performance standards set in the contract that the Commission approved.  I am attaching what I was given that lists the performance standards to be met by the NCC contractor.  Again, we would be happy to meet with you if you have any questions. 

Nick Inzeo 

>>> INSPECTOR GENERAL 4/7/2006 4:46:27 PM >>>
Attached please find NCC Transmittal and Draft Report. Comments due 4/21/06.

EEOC NCC - Performance Measures and Metrics

Performance Metric                                  | Expected Target                                     | Frequency of Reporting
Call Quality Monitoring - Soft Skills               | 90%-95% CSR score on calls monitored                | Monthly
Customer Satisfaction - Waiting on OMB approval     | 70%-75% satisfied or extremely satisfied            | Monthly
Accuracy of Capturing Information                   | 95%-97%                                             | Monthly
Accuracy of Information Provided                    | 95%-97%                                             | Monthly
Service Level - Average Speed of Answer             | 70% to 80% of calls answered in 30 seconds or less  | Monthly
Average Speed to Respond to Written Correspondence  | 70% to 80% in two business days                     | Monthly
Average Speed to Respond to Email                   | 70% to 80% in two business days                     | Monthly
Average Speed to Respond to Facsimile Transmissions | 70% to 80% in two business days                     | Monthly
Blocked Calls                                       | 1% to 3%                                            | Monthly

Office of Research, Information, and Planning (ORIP)

Office of Research, Information and Planning Comments on

Evaluation Report:  The EEOC's National Contact Center: an Evaluation of its Impacts
Draft Prepared by Job Performance Systems, Inc., April 7, 2006 for
EEOC Office of Inspector General

ORIP has reviewed the draft report on the National Contact Center (NCC) prepared by Job Performance Systems, Inc., and has provided many specific comments below.  Although we believe that the program evaluation generally focuses on the fundamental criteria stated in the original Task Force report on the feasibility of a contact center, many of the analytical areas and conclusions fail to measure the impact of the NCC.  We believe that the evaluation does not utilize direct measures of the NCC's impact and, unfortunately, does not apply an appropriate research design and statistical techniques to isolate these effects.  There were opportunities to use better, more appropriate analytical approaches.  By not doing so, the study includes a number of unsubstantiated findings and recommendations that undermine its usefulness in answering the fundamental issues on which it intended to focus. 

As you review our specific comments, there are a few overarching concerns and themes that influence the report's strength and value to the EEOC:

  1. The evaluation fails to use charge and inquiry data available in the IMS that is directly relevant to the evaluation.  The evaluation fails to construct relevant time period analyses and relies instead on fiscal year data rather than time periods that are more relevant to the actual NCC implementation dates.  It fails to establish other types of controls to isolate the impact of the NCC, such as variations in its use by geographic areas and differences in field office implementation. 
  2. The evaluation clearly uses multiple measures of program impact, which is a recommended technique in program evaluation[43].  However, the evaluation covers so many objectives that it reduces an in-depth analysis.  For example, the need to examine the impact on "controlled correspondence" does not appear to be as critical as other objectives.  Also, it does not appear that there was an attempt to prioritize, and gain some agreement on, the most important objectives.
  3. We have some concern that the type of program evaluation conducted does not adequately provide the type of evaluation that would be most appropriate for the Office of Inspector General, even though there is some limited analysis in a number of areas.  For example, the study lacks any cost assessment type of analysis, such as those that could provide cost/benefit ratios or similar measures[44] or compare the cost/benefits of alternative approaches.
  4. Most of the report's conclusions rely on input from Commission staff regarding perceptions rather than direct measurement.  This makes the data collected very imprecise and more impressionistic than empirical.
  5. In that regard, we also believe that the report does not contain enough of the data that it did review, which would provide some justification for the statements made in the report (although we are not necessarily endorsing those statements for the other reasons we have mentioned).  As just an example, it would be important for the report to include the actual survey administered and a grouping of the responses to each question, so that it can be referenced by the reader.  Also, there are many places where a conclusion is stated, but it is not always evident what data/analysis supports the conclusion.
  6. Finally, the study identifies many instances where performance information or measures are lacking or are unreliable.  The inaccuracy and unreliability of data, if the study is accurate, raises troubling implications in many areas of the agency's work and planning.  We believe that this is important to address regardless of the final results discussed in the study.

Detailed comments are provided below.

Page i:  Throughout the report, it is not clear what period is covered by the data and data analyses.  As an example, the Approach section of the Executive Summary does not provide a context for the statements made about baseline data or what time frame the evaluation analyzed.  This deficiency carries through the Methodology section of the full report and other areas of the report as well.

Page ii:  A statement made in the first paragraph ("Unsolicited calls that were originally …") seems to conflict with, or is confusing when compared to, the statement in the 4th paragraph ("Third, most of the offices are still taking unsolicited calls …"). 

In the 5th paragraph, there seems to either be a violation of the contract or the evaluator was not provided with information, if it was requested ("In addition, the NCC does not provide regular trend reports ...").  Also, in the 4th paragraph, the study seems to raise another contract aspect, if accurate ("Second, the NCC is referring many calls … to EEOC offices rather than completing the appropriate form …").

Page iii:  The 3rd paragraph mentions CSR "soft" skills, which are also discussed in the full report.  This concept does not seem to be well described.  What is the difference between the soft and hard skills that are expected?  Also, based on the parameters of the NCC contract, is the study assuming the function of the NCC should involve these soft skills?  Some of the later recommendations may go beyond what the EEOC initially envisioned for the NCC staff; for example, "taking" a charge.  The study seems to make recommendations for the agency to consider that approach for the NCC, but it should also assess the NCC and its impact on EEOC offices for the purpose for which the EEOC intended to use it.

Page iv:  In the 2nd paragraph under item #3, the study suggests here and in the main report ways for EEOC and NCC staff to better communicate and share knowledge and information.  There are also other aspects of job duties and responsibilities suggested throughout the report; for example, the area of CSR duties and responsibilities discussed above for page iii.  Throughout the report, the study does not address a concern about turnover, which often occurs with call center employees (this has been raised in GAO studies on call centers and was referenced in the Task Force report in its review of call centers).  The study should consider the cost and quality implications if there is extensive turnover over a year, for example, in training and in the constant monitoring needed to ensure quality if CSRs are expected to deviate from scripts.  Although expanded duties may be a good suggestion, the study needs to address this variable if it recommends more responsibilities for NCC staff than were originally contemplated.

The study often mentions "Tier 1" and "Tier 2" (2nd paragraph under item #4) but does not adequately describe what the tiers mean and what activities they constitute.

Page v:  The evaluators claim, ". . . there are no reliable metrics to evaluate the impact of NCC on EEOC operations."  If there are no such metrics, their very strong recommendations are without basis, and the evaluation instead should have focused on specifying the necessary data.  In evaluation terminology, the report should have been what is referred to as an "evaluability assessment."[45]  However, there are data sources and research design alternatives that undermine the claim that there is insufficient data.

Item #7 of the Summary and the full study mention the need to develop "reliable metrics" but do not provide sufficient guidance or examples (the minimal examples provided on page 49 do not strike us as sufficient, since the lack of metrics seems to be a major issue in this study).  If nothing else, the study should provide references to articles or other documents that could be invaluable in developing the type of metrics it recommends.

Page 2: The study begins with a methodology section that basically discusses the lack of data.  There are two major concerns here.  First, discussing data availability before defining what needs to be measured is not a useful process.  It would seem more appropriate to clearly state the objectives to be measured and only later point out which measures lack sufficient data.[46]  Second, the discussion fails to seriously consider the use of IMS (charge tracking) data as a possible source of information.

Page 3:  The study fails to recognize natural control groups that would have provided an ability to isolate NCC impact.  The study states,

Another strategy we explored was to identify whether some offices could function as control groups. However, this proved impossible due to the way the NCC has been rolled out over time. The EEOC set up seven offices to pilot the NCC for one month and then commenced implementation across all offices on March 21, 2005. Beginning November 29, 2005, the EEOC gradually authorized offices to change their telephone voice message to redirect all first time callers to the NCC. This initiative started with eight local offices and has since been gradually expanded to include all offices. The extent offices redirect callers is left to the discretion of each office. As a result of this implementation methodology, there were no comparative treatment/control groups to evaluate impact.

The incremental implementation[47] offers an excellent opportunity to use research designs like "interrupted time-series" to see how charge behavior varies after use of the NCC is introduced.
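To illustrate the kind of design we have in mind, a segmented ("interrupted time-series") regression can estimate the level and slope changes in charge receipts at the point an office begins redirecting callers.  The sketch below uses simulated monthly data, not EEOC figures; the 36-month window, redirection month, and effect sizes are all assumptions for illustration:

```python
import numpy as np

months = np.arange(36)                      # 36 months of simulated receipts
t0 = 18                                     # month redirection to the NCC begins
post = (months >= t0).astype(float)

# Simulated receipts: baseline of 100/month with a mild upward trend,
# then a 15-receipt level drop and a small slope change after redirection.
receipts = 100 + 0.5 * months - 15 * post + 0.3 * post * (months - t0)

# Segmented-regression design matrix:
# intercept, pre-existing trend, level change at t0, slope change after t0
X = np.column_stack([
    np.ones_like(months, dtype=float),
    months.astype(float),
    post,
    post * (months - t0),
])
beta, *_ = np.linalg.lstsq(X, receipts, rcond=None)
level_change, slope_change = beta[2], beta[3]
```

Applied to actual monthly IMS receipts, with offices adopting redirection at different dates serving as staggered comparisons, this design could isolate NCC impact despite the absence of formal control groups.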

Page 5:  As already mentioned, the NCC had not collected the customer satisfaction data required by the contract.  The study mentions an experimental measure but provides no specifics.  Even though the evaluators abandoned it, it would be useful to have information about what they tried and why they abandoned the effort.  They recommend elsewhere that the EEOC develop metrics, and this information would be a very useful learning experience.

Page 9:  The study describes the inconsistency in capturing customer information. If accurate, this may be another contractual area needing review.

Page 10:  The row in the table dealing with call duration is not useful and should be translated into common measures such as the mean or median duration of calls.  In addition, if the categories used for the 2003 estimates are part of the contract performance measurements the NCC was required to meet, the study should also present the results in those categories to evaluate how well the estimate tracks the actual findings.  Both the common measures and the actual comparison to contractual performance measures would be useful.

Page 12:  The study used fiscal year data here, even though implementation of the NCC was not based on the fiscal year. In examining controlled correspondence, the study concludes, 

There is no clear reason for the changes [in controlled correspondence]. It is possible that complaints have reduced because the NCC is providing better customer service, but it also may be that the EEOC website is enabling people to more readily contact the field office rather than going through headquarters.

The purpose of program evaluations is to establish causality between a program and an outcome.[48] It is the evaluator's responsibility to determine whether the observed change was due to the NCC or not.  One would expect an evaluation to measure alternative explanations of change rather than just speculating about them.

Similarly, the report notes that there are reductions in the use of the EEOC public telephone number and concludes that it cannot determine whether the change is due to the NCC.  The report cites a lack of data, but it does not clearly indicate what data are needed, nor does it explore data and research design alternatives.  Further, the evaluator appears to rely on only three months of data and does not tie the NCC's incremental implementation to the measure.

Page 13:  The discussion of website use suffers from the same problems.  Very simple measures of use are provided.  There are no efforts to control for factors other than the NCC that might explain increased use, and there is no recognition of incremental implementation.  The report states, "The total number of visitors to the EEOC website has increased over the past five years, reflecting what we were told is a typical growth pattern for Federal websites."  The language suggests a steady increase, but Table 7 demonstrates that after a 38 percent increase in use from 2002 to 2003, there was just a 6 percent increase from 2003 to 2004, followed by a 23 percent increase from 2004 to 2005.  Clearly something very different occurred during this time period.  Social science practice requires that statements such as "we were told" be clearly substantiated and cited.

Again, with respect to publication requests, fiscal year data is used rather than direct tracking to implementation or accounting for variation in NCC use by geography.  (An example of the latter: if more individuals from California are using the NCC, are more individuals from California requesting publications?)

Page 14:  After examining and graphing charge data, the evaluation concludes that, "While there are monthly differences in total receipts, the overall pattern of receipts across each of the 12 months has remained fairly consistent."  No analyses are provided that support this conclusion.  No models are offered to explain charge behavior, and there is no attempt to tie differential use and/or implementation of the NCC to specific rates of charge receipts.  Also, once again there is reliance on fiscal year data.  Examining causality between charge statistics and the NCC requires an elaborate model of charge behavior and sophisticated statistical analyses, neither of which is presented here.  This is a very important issue for the Commission's use of an NCC, and the analysis provided does not adequately address the issue and its complexity.

Page 15:  Figure 2 presents data in a graphic format that is not very reader-friendly.  A data table would be helpful.

There is also a conclusion presented here that, "While outreach efforts have increased, there is only anecdotal indication of any impact attributable to the NCC."  To reach this conclusion, the evaluator apparently relied on conversations with, or a survey of, our Program Analysts.  To determine a relationship between the NCC and outreach, the evaluator should have used input from outreach participants or similar direct data.

Page 16:  Here the evaluator observes that there "may be insufficient awareness of the NCC."  Again, the evaluator measures awareness using our staff's beliefs, not empirical evidence.  A more reliable source of data would have been charging parties in open investigations.  They should have been asked if they were aware of the NCC, if it is appropriate under the Paperwork Reduction Act to ask that type of question.  In addition, given the other findings in the study about the performance of the CSRs, the recommendations about better interaction and information sharing with the EEOC, and the need for consistency and accuracy of data, the study should caution against prematurely publicizing the NCC before resolving these other issues, if appropriate.

Page 17:  Once again, the evaluation relies on perceptions about an outcome (in this instance, call volume) rather than real measures, and even then the report does not provide the statistics collected from the survey.  There is no attempt to distinguish between perceptions in areas with high per capita use of the NCC and areas with low per capita use, or similar efforts to isolate the impact of the NCC.[49]

Page 18:  "Probing" needs to be carefully defined.

Page 19:  The report makes the following observation. 

From April through most of November 2005, customers could call either the toll free number (answered by the NCC) or the local number (answered by the EEOC offices). From November 2005 through March 2006, the EEOC gradually authorized offices to put a message on their voicemail redirecting first time callers to the NCC. When we conducted our survey, 4 of 30 offices (13 percent) were redirecting all callers to the NCC, 14 offices (47 percent) had put a message on their voicemail giving callers the option to call the NCC, and 12 offices (40 percent) were not redirecting callers to the NCC.

Such natural differences in the manner in which the NCC was implemented provide an opportunity to measure differences.  That is, do calls received vary with the way the field office directs calls to the NCC?  Unfortunately, the report fails to utilize this.  For example, once calls were redirected in a specific District Office, did the proportion of calls drop relative to a District Office that had not redirected its calls?

The report indicates that "50 percent of calls were resolved at the NCC".  It would have been useful to measure and report the following factors.

  • The stability of this 50 percent figure over time. 
  • The cost of the 50 percent of calls that cannot be resolved at the NCC and must be referred to EEOC offices (for example, the number of days lost by the potential charging party).
  • Whether the failure to resolve 50 percent of the calls resulted in "tolling" issues for the potential charging party.

Page 20:  To determine whether the type of calls received by our staff changed as a result of the NCC, it appears that the evaluator simply asked our staff if the nature of calls changed.  Such measures would appear to be very unreliable; again, direct measures should have been used.  Given the probable lack of pre-NCC data, it might be necessary instead to measure whether increased use of the NCC is related to the nature of calls received.  Calls received must be captured by more than just our employees' recollections.

The report also states, "In addition, during our interviews and focus groups we did not detect any clear indication of change in the types of calls employees are answering pre/post the NCC."  Focus groups are useful for developing issues, problem areas, and hypotheses for evaluation studies, but they are not useful avenues for collecting impact data, as was done here.

Page 21:  The evaluation makes some observations about how there are disparities in the NCC staff's perceptions about program objectives.  The reader would benefit from an explanation of how these differences in perceptions affect the impact of the NCC.

Page 23:  In this section there is a discussion of the NCC requirement to identify trends.  There is an observation that there is no automated way to identify trends.  This raises a few issues.  First, this appears to be a major deficiency where the NCC may not be meeting simple contract requirements.  Second, the type of data that is being collected should be described.  Third, it would appear that the evaluation should be utilizing this data, yet there are no references to it.

Page 24:  The following statement is made, "Of the 15 offices, 8 [field offices] indicated that the labor savings resulted from fewer incoming telephone calls and 2 offices indicated that Investigators who were answering 'cold calls' now have more time to spend on investigations."  The evaluation would have been much more useful had it explored the differences in greater detail.  What actually enabled those two offices to have more time for investigations?  Are they different from other offices with respect to process, workload, the number of people in their jurisdiction using the NCC, or some other factor?

Another interesting statement here is that, "Many commented that they have increased responsibilities [as a result of NCC], in part because the NCC is only providing bare bones support by taking phone messages and relaying information unless the call is blatantly non-jurisdictional."  It would have been useful to report the actual number and percent rather than just "many". 

Page 25:  Table 11 presents the data in an odd way, using grouped data; one would certainly want to know the actual mean and/or median and the total number of responses.  The report also makes the following observation: "Thirty nine percent of survey respondents involved in intake reported that relatively many (more than 20 percent) of the EASQs they review are for PCPs who already have an inquiry in the IMS and therefore should not have been forwarded as an EASQ."  Here and elsewhere, the term "relatively many" (or "relatively few") is used to refer to a closed-ended survey response that seems to actually mean "more than 20 percent."  It is more transparent to report the actual response than to label it "relatively many."  The real problem, and the reason the language is so awkward, is that the evaluators fail to use a direct measure and are again relying on perceptions.  Direct measurement, even if just from a sample of EASQs, would have been a more defensible way to measure this problem.

Also, as noted earlier, the study analyzes the duplication of work but mixes up two important concepts.  If the work assigned to the NCC and the CSRs is inaccurately done, given the guidance provided by the EEOC (scripts, training on script language, etc.), then that needs to be rectified so that field office staff do not have to duplicate work or correct inaccuracies that should have been caught.  However, if the study is suggesting that the duplication occurs because the NCC/CSRs need to assume responsibilities beyond what the EEOC envisioned for them, such as "taking" the charge itself, then the study needs to address this area both ways: how the EEOC wanted it to work, and how the study recommends making it work more effectively and efficiently.  See pages 27 and 30 for examples of this same point.

Page 26: The same problem is repeated when discussing GroupWise e-mails.  Direct measures rather than staff perceptions should be used.  There is a statement that, "This perception by PCPs can affect Investigators and the PCP because it takes Investigators extra time to backtrack and explain the limited role of the NCC."  Direct measurement of time to closure in our MIS data would be more telling than our staff's perception about the perception of a PCP.

Page 27:  Table 13 is very confusing.

Page 28-29:  Although repetitive, it is worth observing that direct measurement of inaccurate, incorrect, or incomplete EASQs is preferable to our staff's perceptions of such things.  The evaluator should have sampled EASQs and tested their accuracy and completeness.  Also, on page 29, by "reason for discrimination" it is assumed that the evaluator means the issue in the allegation; obviously, "reason for discrimination" is a very poor term to use.

Page 30:  The report states that ". . . while monitoring calls at the NCC we observed that the CSRs do little probing to understand the validity and nature of the claim."  There is no explanation of how monitoring was done.  One would expect an evaluator to develop, implement, and document a scientific procedure for this assessment.  Further, no statistics are reported here.  Quantitatively, what does "do little probing" (emphasis added) mean?  As currently stated, this would appear to be an unsubstantiated opinion from a non-expert.

Page 31:  Reliance on perceptions rather than real data is a problem here as well.

Page 32:  The following statement is made, but the source of the data is unclear: "Of the NCC calls that are sent to EEOC offices, 62 percent are related to filing claims and the remaining 38 percent are related to general inquiries, EEOC overviews, or claim status."

Page 33:  The following conclusion is reported: "The NCC staff believes that if they are granted the right access to the IMS and the technologies are integrated, they would have access to the information they need to resolve a higher percentage of the calls that are not related to filing claims."  It would be helpful to have more detail about what this means and how increased access would work.  It is not clear, for example, what "not related to filing claims" means.  If it means, for example, people who just need one of our publications, how would access to the IMS help?

Much like the statement on page 32, the source of the following statement is not transparent.  "The NCC can improve the call resolution and thereby reduce calls to EEOC offices if they can resolve all the non-claim related calls. Only 66% of the calls that are related to 'non-claim' issues are resolved at NCC, the others are referred to EEOC offices."

Page 37:  The following statement would benefit from additional explanation or an example. 

According to one Office Director, the NCC has institutionalized the delivery of customer service and now the customer service orientation happens every day; it is not just a one-time initiative. The short turnaround requirements by the NCC have heightened timeliness in responding to customers by phone or mail as well as customer service expectations, delivery and follow through.

Page 38:  A number of conclusions are presented without figures to support them.  Statistics on use by non-English speakers, the number of field offices that have reviewed their intake procedures, the number of customers who reach a live person at the NCC, and the number of CSRs who have different views of their roles would be more beneficial than the current characterizations.  Additionally, the discussion of the lack of toll-free numbers at the office level does not seem to be a complaint about the offices but about the technology they are provided.

Page 39:  The source of the data for Figure 7 is not clear.  The legend may be a bit misleading.  Should "EEOC Offices" actually be labeled "Asked to Contact EEOC"?

Page 40:  There are a number of observations here that urgently need to be supported by actual numbers.  See for example, 

  • Some CSRs are patient and listen to the customer for a long time without controlling the call, while others are impatient and cut off callers.
  • Some CSRs have selected a few scripts that they use most often and refer only to those for all answers.

In another observation, the evaluation compares NCC customer satisfaction to federal government standards and then states, "Since there are no baseline data, there is no way to know whether this is an improvement from pre-NCC."  Pre-NCC data may be largely irrelevant; the contractor should have compared satisfaction with the NCC to satisfaction with EEOC intake by conducting its own survey.

Page 44:  The calculations here regarding EEOC staff hours saved by the NCC would benefit from a more comparative approach by examining NCC staff costs versus EEOC staff savings.  Is the 20 hours saved per office equal to the expenditures?

Page 46:  Recommendations start here.  Given the serious methodological problems, the recommendations will have to be recast after the analytical portion of the report is corrected.

Page 48:  The study recommends training on soft skills, which, as already mentioned, need to be more specifically defined.  In addition, if the application of these skills requires CSRs to deviate from scripts, the study needs to address its recommendations under both models: the EEOC's current model of CSR interaction with customers, and the recommended alternative in which CSRs might "take" charges themselves to reduce duplication of effort by EEOC staff.

Page 54:  A copy of the survey instrument should be included.

Page 57:  This appendix presents some definitions of its projections (best case, most likely, and worst case), but there is no documentation or explanation of how these figures were derived.  A more complete explanation of how they were generated would be useful.

Office of Communications and Legislative Affairs (OCLA)

Comments about the NCC Audit Report's findings as they relate to data provided by OCLA

I reviewed the data on pages 12 and 13 of the NCC Audit Report, which relate to data OCLA provided to the auditor, and I disagree with some of the representations of the data and conclusions drawn.

(1) The report indicates that OCLA's "data related [to] telephone calls to the public EEOC telephone number over the past five calendar years have been kept following different assumptions and are not comparable." I provided the auditor copies of summaries of the logs used to collect the data on OCLA's phone calls for four years, not five: fiscal years 2002 and 2003 and calendar years 2004 and 2005. Beginning in 2004, the format of the phone logs changed, and specific categories of calls were added to the format. While the logs for 2002 and 2003 do not capture the same detail about the calls as the 2004-05 logs, they do indicate how many calls were received, and therefore the data from those logs could have been used in the auditor's analysis. They were not, which could arguably have affected the report's conclusions. The average number of calls received was 2696 per month in FY 2002 and 1984 per month in FY 2003. For calendar year 2004, the average was 2414 calls per month; for calendar year 2005, it was 1345. Thus, the 2002, 2003, and 2004 phone call data are fairly consistent, while the 2005 figure is more than 30% below any one of the three preceding years. I therefore disagree that the data are insufficient to determine that there is a reduction and that the NCC is the reason for the reduction in the number of calls.

(2) The report indicates that "data from 2004 and 2005 are incomplete, but three months are comparable and each shows a reduction in total calls," and it cites three months' worth of data in Table 6. It is true that there are some gaps because data is missing from OCLA's records. However, there are in fact six months of data that can be compared, providing a more complete picture and a more thorough analysis, which documents that OCLA received between 31% and 61% fewer calls in 2005 than in 2004. See the chart below, prepared by OCLA after receiving the audit report:

OCLA Calls - all categories

Month       2004    2005    Fewer calls    % fewer
January     2190    1164    1026           47%
February    2994    1812    1182           39%
March       2587    1016    1571           61%
May         1943    1342    601            31%
June        2502    1616    886            35%
November    2268    1124    1144           50%
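The "% fewer" column above follows directly from the raw counts. A minimal arithmetic check (the month-keyed layout below is our own; the counts come from the chart):

```python
# Checking the "% fewer" column in the OCLA chart above.
calls = {  # month: (2004 calls, 2005 calls)
    "January": (2190, 1164),
    "February": (2994, 1812),
    "March": (2587, 1016),
    "May": (1943, 1342),
    "June": (2502, 1616),
    "November": (2268, 1124),
}
fewer = {m: y04 - y05 for m, (y04, y05) in calls.items()}
pct_fewer = {m: round(100 * (y04 - y05) / y04) for m, (y04, y05) in calls.items()}
```

Each computed percentage matches the chart, including the 31% minimum (May) and 61% maximum (March) cited in the text.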

(3) In addition, the report shows only three months' worth of partial data on calls related to field offices, and states that "the data showed a decrease for one month only and that there are not sufficient data to determine whether the call reduction can be attributed to the NCC."  However, OCLA provided the auditor useful data on two additional months (February and March), and partial data on a third month (January), which were not referenced at all in the report. An analysis of this broader data provides a more complete picture and allows a more thorough, accurate analysis, documenting that OCLA received many fewer calls in 2005 than in 2004, which should be attributed to the NCC.  See the chart below, prepared by OCLA after receiving the audit report:

OCLA Calls received and referred to field offices:

Month       2004       2005    Fewer calls    % fewer
January     140        90      50             36%
February    811        308     503            62%
March       665        189     476            72%
May         500        261     239            48%
June        (missing)  308     --             --

(4) Finally, the report analyzed OCLA's controlled correspondence records between 2003 and 2005; the records indicate that there was a 30 percent reduction in correspondence received in 2005 compared to either 2003 or 2004. The report chooses to state that "the correspondence decreased over the past three years", which could suggest a gradual, cumulative decrease. However, this is not the case: the number of letters received in 2003 is essentially identical to the number received in 2004, and the number received in 2005 is 30% less than in either 2003 or 2004. The report also says that "while it is possible that complaints ( i.e., correspondence ) have reduced because the NCC is providing better customer service, it also may be that EEOC's website is enabling people to more readily contact the field office rather than going through headquarters." I am not sure why they conclude this, because the website provided field contact information in 2003 and 2004, so that should not be a factor in explaining the reduction in congressional correspondence. Furthermore, these are letters addressed to Headquarters, which by their very nature generally raise issues or concerns about the field's handling of cases. It is therefore unlikely that a reduction in these letters results from the website enabling people to contact the field directly rather than going through headquarters.

The report also chooses to selectively break down the analysis into six-month increments before and after the inauguration of the NCC, concludes that there was an 8% increase in congressional correspondence in the six months post-NCC compared to the six months pre-NCC, and from this concludes that the data provide mixed results for the past 15 months. Frankly, the report's analysis of both the phone calls and the congressional correspondence, its use of cherry-picked, selective data, and the conclusions drawn from that analysis suggest to me that there was a concerted effort to search for negative results that would reflect poorly on the NCC.

Office of the Executive Secretariat (Exec. Sec.)

Sent: Friday, April 21, 2006 3:12 PM
Subject: Re: NCC Draft Evaluation Report

Thank you for your comments. This more recent data will be considered.

>>> STEPHEN LLEWELLYN 4/21/2006 3:08:47 PM >>>

The Executive Secretariat recently completed its 2nd Quarter Report on the Chair's Controlled Correspondence, and it contains more recent information than was available when we were contacted by Job Performance Systems as part of its evaluation of the National Contact Center. Thus, you may wish to consider our more recent data in assessing the NCC's impact on controlled correspondence.
For the 1st half of FY 2006, the Executive Secretariat processed 245 controlled correspondence items addressed to the Chair, compared to 279 items for the same period during FY 2005. The decrease was greater for field offices, which had a 16% decrease (126 items through March 31st compared to 150 items for the same time period in FY 2005), than for HQ, which dropped 8% (119 items compared to 129).
The decrease is greater if the 1st half of FY 2006 is compared to the 2nd half of FY 2005. The 126 items for the field represent a 30% decline from the 180 field items for the 2nd half of FY 2005. The 119 items for HQ offices represent a 42% decrease from the 206 HQ items for the 2nd half of FY 2005.

While there may be several reasons for the decrease in incoming correspondence, it seems likely that better customer service, increased outreach, the establishment of the National Contact Center, and the availability of information on the EEOC website may have resolved situations that in previous years would have resulted in written correspondence to the Chair.

Stephen Llewellyn
Acting Executive Officer
Executive Secretariat
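The percentage decreases quoted in the Executive Secretariat's figures can be verified with simple arithmetic. The sketch below is purely illustrative; the helper function and round-to-nearest-integer convention are our own assumptions:

```python
# Verifying the percentage decreases in the Executive Secretariat data.
def pct_decrease(new, old):
    """Percentage decline from old to new, rounded to the nearest integer."""
    return round(100 * (old - new) / old)

field_yoy  = pct_decrease(126, 150)   # field, 1st half FY2006 vs 1st half FY2005
hq_yoy     = pct_decrease(119, 129)   # HQ, same comparison
field_half = pct_decrease(126, 180)   # field, 1st half FY2006 vs 2nd half FY2005
hq_half    = pct_decrease(119, 206)   # HQ, same comparison
```

All four computed values (16%, 8%, 30%, 42%) match the percentages quoted in the memo.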

Office of Legal Counsel


TO: Aletha L. Brown
Inspector General

FROM: Peggy R. Mastroianni
Associate Legal Counsel

SUBJECT: National Contact Center Evaluation Draft Report

We have reviewed the draft report "The EEOC's National Contact Center: An Evaluation of Its Impacts."  We have two comments.  1)  On page 10, the report compares the total calls projected in the 2003 Assessment Report with the actual 2005 call volume, and notes that the 2005 number is significantly smaller than the 2003 number.  We question the usefulness of this comparison, and the significance of any resulting implication that the NCC has performed below expectations, since the NCC's operations were only gradually phased in beginning in March 2005 and, we understand, intentionally did not include all calls to Commission offices.  The 2003 Assessment Report projection, by contrast, was based on a field survey of all calls coming in to offices.  It does not seem appropriate to compare the two numbers without any explanation of the gradual phase-in of the NCC.

2)  In a similar vein, we do not believe the report overall gives sufficient context to its findings and recommendations, which are based on information obtained in the fall and early winter of 2005, when the first operational year of the NCC pilot was in mid-development.  The report's findings reflect only the first 6 to 9 months of NCC pilot operations, and it should be revised to clearly state that, and to take into account the Commission's planned phase-in of the NCC operations.  As currently written, the report criticizes the NCC for failing to meet the expectations of a fully operational call center, but, to our knowledge, the Commission did not intend the NCC to be fully operational in the first 6 to 9 months of the pilot.

The attorney assigned to this matter is Kathleen Oram, who may be reached at extension 4681.


New York District Office

April 24, 2006


TO : Aletha L. Brown
Inspector General

FROM : Spencer H. Lewis, Jr., Director
New York District Office

SUBJECT : Comments of Draft NCC Report

Thank you for the opportunity to comment on the draft report on "The EEOC's National Contact Center: An Evaluation of Its Impacts."

Given that the NCC was established to upgrade customer service, improve human capital effectiveness and deliver accurate and consistent service to its customers, our comments will focus on those three areas.

Upgrading Customer Service:

That the NCC provides a live person to all EEOC callers is definitely an improvement in our customer service for people initially trying to reach the EEOC.  The NCC was rated above average compared to other Federal call centers and to service industries in the private sector.  For better or worse, promptly reaching a "live person" when one calls a large organization or institution is becoming less and less the standard of customer service, which makes the NCC's live response, not to mention its ability to take calls in multiple languages, all the more notable.  Thus, in our opinion, the objective of "upgrading customer service" has been achieved.

Improve Human Capital Effectiveness:

Assessing Impact:  It is difficult to assess the overall impact of the NCC on staff.  With no baseline data on the number of calls and the volume of e-mail and postal mail the district and field offices received prior to the inception of the NCC, there is no objective data on the NCC's impact on the field's workload.

Having said that, the concept of the NCC as a vehicle to improve customer service was premised in part on the fact that the district/field offices were unable to handle all the calls they received: literally, many calls went unanswered, or our response was so delayed that the caller made a second call to us.  The NCC has taken up the slack on the unanswered calls -- but since those calls were beyond our capacity and thus were not taking up field resources to begin with, the NCC's picking them up does not significantly free up labor resources or reduce assigned workloads.[50]  However, were it not for the NCC, the thousands of calls it resolves would most likely continue to go unreturned by the field.

In addition, given the statistics on the number of calls being answered by the NCC, and the fraction of those being forwarded to the field, I think it is safe to say that field offices are generally getting fewer calls -- however, the decrease is not felt as much by those offices that have continued to lose staff and are distributing the calls they still receive among fewer people.

Meeting Staff Expectations on the Value of the NCC:  The report points out that while staff did and do understand the purpose of the NCC, their expectations about its value have not been met.  I think this is key to the report.  In our opinion, the NCC is doing what it was asked to do, and in that sense it is not accurate to say the NCC has failed to deliver.  However, because of the ever-decreasing body count in district/field staffing and the corresponding increases in workload on those who remain, establishing the NCC created an unrealized hope that it might relieve some of the intake pressure.

We are also generally satisfied that the NCC is doing what it was instructed to do vis-a-vis the questionnaires.  But, for example, in New York City, despite the help from the NCC, we still receive 100-115 direct mail inquiries a week, 45-50 e-mails/EAS questionnaires from the NCC, 60-80 messages a week on our own Intake phone line, and an untracked number of calls on our general phone line.  One hundred percent of these inquiries require intake follow-up before a charge is filed.  If each intake contact and the associated IMS data entry averages 45 minutes (some shorter, some longer), our 18 investigators would conservatively spend one-fourth of their time on intake; in fact, we estimate it at over 30%.  Thus, the negative assessment of the NCC's impact seems to be more a ventilation of frustration that the NCC cannot do more, which is not the same as saying it has not accomplished its intended mission.
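The one-fourth estimate above can be checked with rough arithmetic. The sketch below is illustrative only: the 40-hour work week is an assumption, and the untracked general-line calls are excluded, which is why the office's own estimate of over 30% runs higher than this floor:

```python
# Rough check of the New York intake-time estimate quoted above.
MINUTES_PER_CONTACT = 45            # average per the memo (some shorter, some longer)

low_contacts  = 100 + 45 + 60       # mail + NCC e-mails/EASQs + Intake line, low end
high_contacts = 115 + 50 + 80       # same categories, high end of weekly ranges

hours_low  = low_contacts * MINUTES_PER_CONTACT / 60
hours_high = high_contacts * MINUTES_PER_CONTACT / 60

staff_hours = 18 * 40               # 18 investigators, assumed 40-hour week
share_low, share_high = hours_low / staff_hours, hours_high / staff_hours
```

The tracked contacts alone work out to roughly a fifth to a quarter of total investigator time, consistent with the memo's "conservatively one-fourth" figure once the untracked general-line calls are added.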

Deliver Accurate and Consistent Service to Customers:

The high rating customers give the NCC suggests that, overall, customers are getting consistent and correct information from the NCC.  As that was one of the aims of the NCC, this objective has been achieved, and that is a plus for the EEOC.


We believe that the improved customer service being provided by the NCC is a good start and the concept should be built upon, not scrapped, at this point.  Some of the functions we would especially like to see studied for possible take-over by the NCC are:

  • the disclosure process under FOIA and Section 83
  • mailing of comprehensive intake questionnaires for all field offices - to eliminate the need to mail a second questionnaire to PCPs after receiving the EAS from the NCC.

cc: Nicholas Inzeo, Director, Office of Field Programs
Cynthia Pierre, Director, Field Management Programs
John Schmelzer, Acting Director, Field Coordination Programs
Ed Elkins, Project Manager, National Contact Center

Memphis District Office

April 21, 2006


TO: Aletha L. Brown
Inspector General

FROM: Katharine W. Kores
District Director

RE: NCC Evaluation Draft Report Comments

I appreciate the opportunity to comment on the NCC Evaluation Draft Report.

My first concern regarding the NCC Evaluation Draft Report is its failure to disclose possible conflicts of interest involving the main subcontractor, Convergys.  The report does not indicate whether the Inspector General or the contractor, Job Performance Systems, Inc. (JPS), was aware of the fact that Convergys was an unsuccessful competitor for the EEOC NCC contract.  Since Convergys "took the lead in evaluating the operation of the NCC,"[51] the report should explain what action Convergys took to enable it to approach this project from the position of an unbiased evaluator rather than a business competitor of NCS Pearson Inc.  There is also no indication in the report that either the IG or JPS knew that the EEOC had filed an enforcement action against Convergys alleging that it violated the Americans with Disabilities Act.  Again, the report should explain why this ongoing litigation and adversarial relationship[52] do not create a conflict preventing Convergys from performing an unbiased review of the EEOC NCC.

My second concern is that the recommendations in the report do not seem to flow logically from the findings of the report and my understanding of the purpose of the EEOC NCC.  My understanding of the major purpose of the NCC was to increase the EEOC's ability to provide reliable customer service to unsolicited callers.  Many of the findings in the report indicate that the goal of providing better customer service is being achieved.  For instance, the objective finding that callers are able to speak to a live person from 8:00 a.m. to 8:00 p.m. and that over 100 languages can be accommodated is an obvious expansion of the EEOC's ability to serve the public compared with the period before the NCC pilot.  There is also the finding that staff believes the NCC is picking up calls that had previously been dropped because of technology issues.  The Customer Satisfaction Index rates the NCC above average compared with other Federal agencies and private sector service industries.  Given these and other indications throughout the draft report that enhanced customer service is being provided, the recommendation that the NCC pilot be scrapped is not reasonable or helpful.

The recommendation to scrap the NCC pilot seems to stem mainly from the report's finding that the number of calls handled by the NCC is lower than was projected prior to the commencement of NCC operation.  However, there are other findings in the report which indicate that this number of calls can reasonably be expected to increase.  The low number is likely influenced by the very early stage at which the NCC operation was being reviewed.  The NCC had been in operation less than six months when the review period began and less than one year when the review period ended.  The report finds that there is a need for increased awareness of the toll free 800 number and that during the review period many of the EEOC field offices had not yet put a message on their voice mail redirecting callers to the NCC.  It also finds that in June 2006, the Commission will launch a major advertising campaign to increase awareness of the 800 number.

The findings made in the report which relate to the NCC's ability or failure to take charges, categorize charges, counsel callers, or have a complete knowledge of the laws against employment discrimination imply that a different function was intended for the NCC than that of a Contact Center.  The functions listed are properly reserved to EEOC personnel.  It was not, and should not be, in my opinion, the Commission's goal to have the NCC perform these important EEOC responsibilities.  The NCC was never intended to replace the EEOC.  The purpose was, and should be, to provide a reliable means by which members of the public can contact the EEOC and obtain accurate, basic information regarding the EEOC and the laws against employment discrimination.  I find most helpful the recommendations which involve training for CSRs, additions to FAQs and scripts, coordination of technology, and improved and expanded record-keeping.  It is unfortunate that the draft report takes such a negative tone rather than focusing more constructively on these opportunities for improved service by the NCC.

cc: Chair Cari M. Dominguez
Lea Guarraia, COO
Headquarters Directors
District Directors
Nick Inzeo, Director OFP
Ed Elkins, Senior Advisor to the Chair
Cynthia Pierre, Director FMP

Charlotte District Office

Sent: Wednesday, April 12, 2006 5:54 PM
Subject: Re: Fwd: Total Taste of Technology at Call Center Demo & Conference Orlando

Importance: High

Hi Ruben,

Thanks for your insightful comments and the referral to
I will ensure that your comments and referral are circulated among the team. We
appreciate your thoughtfulness and continued commitment to improving the


>>> REUBEN DANIELS 4/12/2006 2:19:15 PM >>>

As I've been studying the draft report and opened this, I immediately thought
of you and the potential that this may help us devise improvements to the NCC. As
a proponent and supporter of increased use of call center technology, this
program seems to provide some insight into several key issues raised in the
report that need to be a part of the decision on how we get better benefit from
the program going forward.

It always struck me that the failure to integrate the NCC with IMS was
fraught with potential for problems and lost efficiency. That is why the
section on integration with in house systems caught my eye. However, going
back to pre-NCC status strikes me as the worst possible solution.

>>> 4/12/2006 2:01:17 PM >>>

Dear Reuben Daniels, Today's dynamic Call Center Technology is not just for
techies anymore. People from across the call center industry are taking
notice of technology's surge and scrambling to see how the latest technology
can help their call center. Register today to experience a total taste of
technology while discovering new ways to make your call center shine!

Call Center Magazine is proud to offer subscribers an additional 10% off
premium and standard conference packages. You'll save an additional $200 when
you register before this Friday, April 14th, 2006. So don't delay -- that's over
$300 in savings!

Call Center Demo & Conference
May 15-17th, 2006
Hyatt Regency Grand Cypress
Orlando, Fl

Register today, early bird savings on premium and standard conference
packages ends April 14th. Register online or simply call 800-441-8826 or 415-
947-6130 (M-F 9:00am-4:00pm) Be sure to use priority code: EM11

Technology to a "T"

Call Center Demo & Conference Orlando provides numerous ways to take your
technology knowledge from some to tons in just a few days!

Test Drive the Latest Technology on the Show Floor Find out what new and
exciting technology can enhance your call center from knowledgeable
exhibitors in a friendly, noncompetitive environment.

Take in a Technology Presentation at the Solutions Showcase Brand new to Call
Center Demo & Conference, the Solutions Showcase will provide an opportunity
for you to hear about case study successes and witness educational technology
presentations on the show floor. Find answers (and solutions!) to your
biggest call center dilemmas from some of the industry's brightest.

Talk Technology with Peers at Networking Events From cocktail hours to
association gatherings, there is plenty of time to talk technology with your
industry peers and leaders.

Tour Call Centers with ICMI's Insight Tours Get an inside look at the call
centers of Disney, Connextions, and SunTrust, a company using its own
homegrown Sales Desktop and Platform Systems to complement the latest
industry technology.

Turn to the Experts for the latest Technology News and Education Whether you
are looking to get your feet wet with a little technology education, or
diving right into a technology-only conference track, there is plenty of
information available in our Pre-Show Conferences, Conference Tracks, Keynote
Addresses, and much more.




National Council of EEOC Locals No. 216, American Federation of Government Employees, AFL/CIO (Union)


Gabrielle Martin, President
303 E. 17th Ave., Suite 510
Denver, CO 80203
Telephone 303.866.1322
Facsimile 303.866.1900

April 20, 2006

Larkin Jennings
Office of Inspector General
1801 L Street, NW
Washington, DC 20507

Re: Draft Call Center report

Dear Mr. Jennings:

Thank you for the opportunity to provide comments on the Draft Report on the National Call Center (call center).  There are a number of findings with which I agree.  In addition, there are a few concerns I wish to express about areas where the report could use additional focus.

Initially, I must say that the report is comprehensive, addressing the background behind the move to establish the call center, the various modes in which it has operated, and the reasons it is failing.  Unfortunately, EEOC implemented the call center despite strong predictions of these outcomes.

The Report Findings

I agree with the findings that the call center has little, if any, positive impact on the work of the EEOC offices.  Moreover, I agree that the call center is neither cost effective, nor is it providing any relief to offices.  The call center results in far too much duplication of work for offices; the number of calls to an office has not declined, and the amount of work has stayed the same.  Further, the work quality is highly inaccurate, creating, in many cases, more work than if the call had initially come directly to an EEOC office.  Seven days of training would not provide enough understanding of EEOC's laws, processes, and procedures to allow staff to provide the necessary service to the public.  The EEOC should never have undertaken to reduce civil rights enforcement to key words and scripts, and I think that the report needs to address the inability to reduce substantive rights to the ability of someone to find the correct key word in six minutes or less.  Continuing the call center will result, at best, in paying for an extremely expensive message service.

EEOC cannot fix any of these outcomes by continuing to throw money at the problems.  So I do not agree with the recommendations to fix the call center.  Frontloading resources into answering the phones through the call center -- while getting none of the other reasonable value for the money that can be obtained when the employees are working in EEOC's offices -- hinders the remainder of the investigative process.  Further, it is unreasonable to require that a case receive priority handling because someone used the call center.  Offices must address all customer inquiries.  EEOC can fix its customer service problems, however, by returning customer service to the local offices.

A Missing Option:

Hiring Investigative Support Assistants (ISAs) in various offices throughout the country to serve the intake customer service functions should receive more direct and specific focus in the report.  Originally developed to assist Investigators in the intake and investigative processes, ISAs counsel members of the public, screen potential charges, and handle mail inquiries.  ISAs also provide additional clerical support to offices.  Since the ISA position also served to provide a cadre of employees who could be promoted into Investigator vacancies, EEOC was able to capitalize on the training and resources invested in these employees, as well as on the expertise gained by the ISAs.  However, EEOC cannot recoup the expense of training and resources expended on revolving door call center staff who always will have only limited expertise.

Other Areas the Final Report Should Address:

1. The final report should highlight the extreme cost of continuing to do business in the same way.  As structured under the current contract, the mission of call center employees is to get off the phone as quickly as possible - inaccurate information and passing the call along had been predicted.  As the report indicates, most of the training is geared toward that result.

2. The final report should mention that expending any resources to train employees at a call center with a high rate of turnover would be a huge waste of money and would sap resources.  Given the almost unanimous survey results that call for more investigative and support staff to handle calls, mail and investigations in offices, the report should reflect that diverting additional funds to train the telemarketers would be irresponsible and a tremendous waste of precious resources.  Finally, the final report must note that training call center employees never will provide them with a stake in the agency's mission since they remain so far removed from the investigations.

3. The final report should not recommend that the call center take on any additional functions.  Several studies of call centers in the federal government question the value of diverting resources to call centers when, as here, the public cannot directly contact the agency's offices and when poor quality service and inaccurate information is being given out.  A more recent GAO study questions the value of such centers when quality control mechanisms are insufficient.  Again, the draft report does not mention those studies or make specific recommendations.  However, despite the wisdom of those studies, ignored when the call center was implemented, I repeat my concern that additional money not be thrown at the call center to fix the EEOC's debacle.

4. Any recommendation that EEOC send more work to the call center must mention EEOC's promise to keep its offices accessible to the "unsolicited" public.  From EEOC's first mention of instituting a call center, the public was concerned that EEOC would close its doors to members of the public seeking to speak with an employee about claims of discrimination.  The EEOC indicated several times that it would not prevent the public from directly going into an office or contacting an office.  So any recommendation to direct all unsolicited calls to a call center violates EEOC's commitment to retain open access.  Moreover, as the draft report found, the call center adds a layer of bureaucracy that victims of discrimination must endure.  Why would we continue to push this layer of bureaucracy?

5. Since EEOC makes much of the fact that the initial contacts with the public are part of the investigation, the report must acknowledge that EEOC would be sending work it has deemed "inherently governmental" to the call center.  EEOC would be required to comply with all regulations concerning contracting out, should it continue the call center debacle.  The final report should note that any recommendation to sever unsolicited ties with a government office, the purpose of which is to serve the public, would violate the public trust.

6. The final report must further highlight the small number of calls handled by the call center, at an exorbitant cost.  The cost benefit analysis would weigh against continuing the call center. 

7. The final report must change the negative connotation of Pre-NCC activities.  To the extent that the Draft Report suggests that having experienced EEOC employees handle the calls and work directly with the public is a bad thing, this must be changed.  That premise is faulty for several reasons.  First, unlike with the call center, callers were not bounced from a live body to another office.  Callers either spoke to someone, or left a message.  Next, comments about the merits of the case were not being made, nor could they be misconstrued due to the number of people involved in the call center process.  Next, more attention must be given to the faulty underlying data.

The report should specifically mention that EEOC's method of attempting to obtain baseline data was seriously flawed in several respects.  EEOC cannot determine what it asked offices in order to come up with the information; its directions were flawed, and there was no mechanism to determine how offices interpreted the instructions.  So the negative connotation is not deserved.  The recommendation in the final report should be to increase staff to address the ability to provide a level of service in each EEOC office.

8. The final report should address more specifically, the cost factor for technologies involved in attempting to obtain accurate data through the call center.  As the draft report notes, continuing to pursue the call center will require substantial commitments of money.  Although the draft report notes that the current technologies available at the call center are not being used effectively, it does not address or identify potential downsides to throwing more money at the problem.  The final report should acknowledge that sinking more money into these technologies prevents EEOC from putting resources into technology available for its employees.  EEOC's employees are required to conduct complex investigations, yet the funds spent on the call center limit EEOC's abilities to improve its extremely limited technologies to assist with the work.  All EEOC's employees are impacted by the severe lack of technology to perform their work.  For example, while EEOC has computers that are used when speaking with members of the public, it has not invested in improving its current technologies to allow it to capture much of the information it pays the call center to capture.

9. As for impact on customers, the draft report accurately notes that the impact statements are meaningless given the lack of front end data.  The final report could highlight that despite questions being asked early on about the accuracy of its premises, EEOC never sought to obtain an accurate record in this regard.

10. One final comment about the call center - the Agency's efforts to influence the outcome should be more prominent.  The agency was aware of when the survey would be released and shamelessly attempted to influence the results.  This should not be relegated to a passing remark.

Please feel free to contact me should you need to discuss any of the matters herein.


Gabrielle Martin, President

[36] Page 40, # 6

[37] Page 37, # 2

[38] Ibid

[39] Page 17, # 1

[40] Ibid

[41] Recommendation 3.  2003 NCC Assessment Report, p. 51.

[42] There was a period when the software did not accurately capture this data.  The number used is based upon the data that was collected, with an extrapolation of that data for the period when the software malfunctioned (see our comment in response to the item on Page 11, Table 4).

[43] See, for example, Practical Program Evaluation for State and Local Government Officials, by Harry Hatry, Richard E. Winnie, and Donald M. Fisk, 1973.

[44] It is assumed that this evaluation was assigned to the Office of the Inspector General due to the financial relationship between the NCC and the Commission.  For example, see the OIG requirement to "recommend policies for, conduct, supervise, and coordinate other activities carried out or financed by the Commission for the purpose of promoting economy and efficiency in the administration of, or preventing and detecting fraud, waste, and abuse in those programs and operations."  There seems to be little focus on this requirement.

[45] See for example,  Pamela Horst, Joe N. Nay, John W. Scanlon and Joseph S. Wholey, "Program Management and the Federal Evaluator", Program Evaluation:  Patterns and Directions, Eleanor Chelimsky, editor, American Society for Public Administration, 1985.

[46] A useful example of such an exercise can be seen in Measuring the Effectiveness of Basic Municipal Services, Initial Report, The Urban Institute and International City Management Association, 1974.

[47] For the purpose of these comments, references are made to incremental implementation which refers to the fact that the NCC was not implemented at the same time and in the same way in all field offices.

[48] See, for example, Laura Langbein's Discovering Whether Programs Work:  A Guide to Statistical Methods for Program Evaluation, Goodyear Publishing, Santa Monica, California, 1980, p. 40.

[49] Clearly, some effort needs to be made to control for differences in the size of the populations being served by our District Offices.  Per capita refers to this type of control, such as NCC contacts per 1,000 individuals in the workforce of the District Office's jurisdiction.

[50] In our office, we have redirected some resources from answering phones to receiving and assigning questionnaires that come from the NCC.

[51] Draft Report, page 4, footnote 2.

[52] On Friday, April 14, 2006, a federal jury returned a verdict finding Convergys violated the ADA and liable for over $114,000 in back pay and damages.