
Amanda Shapiro, M.S.

Senior Consultant
Amanda Shapiro is a Senior Consultant at DCI Consulting Group. She joined DCI as an Analyst in September 2010 and provides clients with guidance on EEO and affirmative action statutes and regulations to meet OFCCP compliance requirements. Her primary focus is supporting clients on affirmative action planning, compensation equity analyses, audits, and employment discrimination. She is also involved in testing and selection projects, including job analyses, test validation, and validity generalization research. Amanda has job analysis experience in a variety of industries, from professional consulting roles to general laborer positions.

Before joining DCI, Amanda worked as a Research Specialist for the Association of American Medical Colleges (AAMC). At AAMC, Amanda worked on the 5th Comprehensive Review of the Medical College Admission Test (MCAT), providing research and statistical support for the members of the advisory committee. She has experience with high-stakes test development, facilitating research meetings, communicating sensitive information, and managing large-scale job analysis assessments.

Amanda earned an M.S. degree in Industrial/Organizational Psychology from Radford University and a B.S. degree in Psychology from the University of Florida.

Amanda Shapiro’s Recent Posts

The 32nd Annual Conference for the Society of Industrial and Organizational Psychology (SIOP) was held April 26-29, 2017 in Orlando, Florida. This conference brings together members of the I/O community, both practitioners and academics, to discuss areas of research and practice and share information.

Many sessions cover topics of interest to the federal contractor community, including employment law, testing, diversity and inclusion, big data, and regulations for individuals with disabilities.

DCI Consulting Group staff members were well represented in a number of high-profile SIOP presentations and also attended a variety of other sessions worth sharing. Notable session summaries and highlights can be found below.

Analytics has a Seat at the Table: Now What?

A panel of I/O practitioners discussed considerations and challenges when building workforce analytics functions, including critical partnerships, cultural change, understanding stakeholders, and maintaining a business focus. These experts shared tips and tricks for implementing workforce analytics in their organizations, including:

  • Build an understanding of the systems, stakeholders, and IT landscape in order to shape data inputs and extracts and improve data output
  • If organizations are siloed, it’s important to make connections across groups to ensure you can make the case that workforce analytics are needed
  • The role of the HR Business Partner is changing – analytics will be a part of their role if it isn’t already. Companies can provide skill-building workshops to ensure knowledge and skills are appropriate; online statistics courses can also be utilized.
  • If certain HR groups are opposed to utilizing analytics, advertise the benefit to HR stakeholders
    • Frontline HR: analytics can streamline work, provide better results, give better insight into workforce, candidates, etc.
    • Strategic HR: analytics can help balance a culture of constant deliverables, defining relationships by leadership and analytics, translating vision into action.
    • HR leadership: analytics can build the business case, define ROI, provide a common language
  • If rolling out a new process or program, use pilots to your advantage to make the business case, design research for short and long term results, and have two-way feedback loops with stakeholders. Leverage interviews to round out research and implementation – know your users and stay up-to-date on what’s important.

Burden of Proof: Can I-Os and Employment Counsel Successfully Collaborate?

A panel of labor attorneys and I/O psychologists, both practitioners and educators, spoke on the complexities of working on employment law issues and challenges in organizations. Listed below are several recommendations, mostly focused on considerations when developing, validating, or selecting an (off-the-shelf) test.

  • Federal agencies are focused on the search for less adverse alternatives to tests. I/O panelists shared that it has been a challenge to address this search fully in the technical report; this is especially difficult with vendors who have only one or a few tests and no alternates available.
  • Panelists cautioned against using automated resume screens and machine learning. Without perfect correlations there’s a high likelihood that at least one poor resume will make it through, which can limit the defensibility of the process. The tools could also be weighting words like “church” or “Africa,” which would create risk. And how do you validate a test that is constantly changing? A panelist from PepsiCo noted that they opted not to do this – they don’t want to be the first in the courtroom.
  • Panelists also cautioned about working with vendors who specialize in gaming assessments – these vendors often lack I/O or validation knowledge and skills.
  • Panelists recommended that organizations get legal involved early when selecting a vendor and/or new test – involve them in the RFP process.
  • It was also noted not to forget about international considerations, especially data privacy. It’s recommended that organizations involve counsel from each country.

Making Better Business Decisions? Risks and Rewards in Big Data      

This session was moderated by DCI’s Dr. Emilee Tison and highlighted both the risks and rewards of using big data techniques to inform employment decisions. As data analytic techniques continue to evolve and incorporate increasingly sophisticated methodologies, employers are cautioned against using such techniques with little to no transparency into how or why results are calculated. Although big data approaches to employment decision making can offer great benefits in terms of overall cost, time savings, more positive candidate experiences, and better statistics, it is imperative for employers to weigh these benefits against legal and practical considerations. Such considerations include privacy/confidentiality concerns and the increased potential for such techniques to lead to adverse impact against protected groups if variables are not fully validated or researched at the outset.

Big data techniques are increasing in popularity and will continue to evolve at a rapid pace. Although these techniques can seem very appealing to an employer in informing decisions on the front end, they can prove very difficult to defend in court at a later date. For this reason, companies are advised to be cautious in implementing any new technique, as they will be asked to explain how the information being used is job-related. The major takeaway is for employers to ensure they have the right justification for what they are doing before they incorporate any big data approach into business decisions.

Solving the Law Enforcement Staffing Crisis

DCI’s Dr. Michael Aamodt, together with representatives from the U.S. Secret Service and U.S. Customs and Border Protection, led an open discussion on the challenges many law enforcement agencies are experiencing today in attracting qualified candidates and meeting demanding staffing needs. Oftentimes demand is high, but agencies struggle to fill positions as a result of small applicant pools and applicants who do not pass the background check stage of the process.

Discussions focused primarily on causes of the applicant shortage (e.g., working conditions, job location, and strict policies/requirements), strategies for assessment and recruitment, and changes that may be warranted with regard to the background check process. It was also suggested that agencies review their current policies and procedures, and update where possible any strict policies that appear to exclude otherwise qualified applicants (e.g., relax a strict tattoo or piercing policy).

I/O’s Role in Advancing HR in the Big Data Charge

Panelists in this session included DCI’s Dr. Eric Dunleavy and others from diverse backgrounds in both applied research and analytics departments who discussed recommendations with regard to how the I/O community can advance the current state of human resources management.

As big data continues to become increasingly prevalent in the world of HR, I/O psychologists find themselves in a position to lead the big data charge and contribute their knowledge and expertise in this realm. Companies are tasked with balancing a great deal of risk associated with the use of big data techniques in the employment decision making process, and establishing meaningful and legally defensible models requires a lot of human touch and input. New tools may make compiling and analyzing big data simple, but the tools themselves don’t tell us why something occurred and what we should do based on those results. This fact represents an opportunity for I/O psychologists to assist HR professionals in turning results into actionable information.

Optimizing Validity/Diversity Tradeoffs in Employee Selection

The session entitled “Optimizing Validity/Diversity Tradeoffs in Employee Selection” included three presentations that discussed alternative ways to handle the tradeoff between selection procedure validity and adverse impact. Typically, high validity is associated with high adverse impact, and this session focused on ways to maximize validity while keeping adverse impact at a tolerable level. Methods included algorithms to identify biodata scoring systems that balance this tradeoff, Pareto-optimal methods for weighting the selection components in a composite, and methods to estimate the extent to which Pareto-optimal weights established in one sample generalize to other samples.
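To make the tradeoff concrete, here is a minimal sketch using hypothetical predictor statistics and the standard composite formulas (not the Pareto-optimization algorithms presented in the session); all numbers below are illustrative assumptions:

```python
import math

# Hypothetical standardized predictors (illustrative numbers, not session data):
#   predictor 1: high validity, large subgroup difference (e.g., a cognitive test)
#   predictor 2: lower validity, small subgroup difference (e.g., a structured interview)
r1, r2 = 0.50, 0.30     # criterion-related validities
d1, d2 = 1.00, 0.20     # standardized subgroup mean differences
rho = 0.25              # predictor intercorrelation

def composite(w1):
    """Validity and subgroup d of the composite w1*x1 + (1 - w1)*x2."""
    w2 = 1.0 - w1
    sd = math.sqrt(w1**2 + w2**2 + 2 * w1 * w2 * rho)  # composite SD
    validity = (w1 * r1 + w2 * r2) / sd
    d_comp = (w1 * d1 + w2 * d2) / sd
    return validity, d_comp

# Sweep the weight on predictor 1 to trace the validity/diversity tradeoff
for w1 in (0.0, 0.25, 0.5, 0.75, 1.0):
    v, dc = composite(w1)
    print(f"w1={w1:.2f}  validity={v:.3f}  subgroup d={dc:.3f}")
```

With these made-up inputs, intermediate weights produce composite validity at least as high as the best single predictor while substantially reducing the subgroup difference, which is the essence of the tradeoff the Pareto-optimal methods formalize.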

This session highlighted the challenges inherent in developing a selection system that is both highly predictive and free of subgroup differences. Art Gutman, the discussant for this session and a frequent contributor to DCI’s blog, commented on the approaches presented within the context of legal precedent. He noted that while these approaches may seem reasonable from an academic/research perspective, there may be sizeable hurdles to overcome in the courtroom. DCI will be on the lookout for the utilization of these approaches.

O*NET Based Research: Leading Edge or Wasted Opportunity?

This symposium entitled “O*NET Based Research: Leading Edge or Wasted Opportunity?” showcased novel uses of O*NET data. A presentation by DCI’s Dr. Kayo Sady examined the importance of various personality characteristics in predicting salary across different industries. For example, he found that extraversion is highly valued in the tech industry. Dr. Sam Holland presented a tool that explored O*NET data from a network perspective to help job seekers find jobs that offer many potential career options. Using this tool, one can identify both the most advantageously situated jobs and the patterns of characteristics associated with those jobs.

Leading the Charge: IGNITING Veteran–Workforce Integration Solutions

A diverse panel with representatives from academia, the SHRM Foundation, the military, consulting, and employers led a discussion on a specific period in a veteran’s transition into civilian work life. Challenges facing veteran transitions are broad in nature, and the panel discussed the lack of available data to assist with researching veteran outcomes during the transition to civilian life. The SHRM Foundation has published resources to help recruit and retain veterans, and future research is underway to help understand retention challenges for veterans in organizations.

Annual EEOC/OFCCP Practitioner Update

DCI’s Mike Aamodt and Joanna Colosimo were joined by colleagues from Fortney Scott, LLC and Capital One to update the SIOP community on current EEOC and OFCCP enforcement trends and implications from the presidential election. The panel focused on current pay equity enforcement trends, strategic outreach and recruitment for protected groups, and selection issues from an EEO perspective. Best practice takeaways from the session highlighted the importance of collaborating with legal counsel, conducting proactive pay equity studies, and proactively monitoring the effectiveness of selection, recruitment, and outreach programs on protected groups.

Mentoring for Women in I/O: Career Changes, Interruptions, and Transitions

In a moderated panel session, the presenters discussed issues for women in I/O that arise due to non-linear career trajectories. For example, job changes often are seen as resulting from indecision rather than strategy. Panelists and moderators were:

  • Silvia Bonaccio – University of Ottawa
  • Irini Kokkinou – SCAD
  • Kea Kerich – Marriott International
  • Alison L O’Malley – Deere & Company World Headquarters
  • Tatana M. Olson – United States Navy
  • Kristen M. Shockley – University of Georgia
  • Jane Wu – IBM
  • Lynda Zugec – The Workforce Consultants

The primary focus of the session was on anecdotes of the panelists’ own career trajectories. Additionally, the panelists were asked to respond to the following questions:

  1. What factors led you to a non-linear career path?
  2. What challenges did you face in pursuing a non-linear career path? How did you handle these challenges?
  3. What opportunities resulted from your non-linear career path?
  4. What skills did you develop from implementing your career change(s), interruption(s), or transition(s)?
  5. What advice do you have for graduate students going on the job market or for more experienced professionals considering interrupting/changing careers?

In the final portion of this session, panelists met with groups of audience participants to discuss in more detail their experiences and advice.

Innovative Adverse Impact Analysis

This expert panel covered a variety of topics related to complex adverse impact analyses. The panelists shared innovative approaches to constructing analytics when responding to intricate personnel decision-making situations. Moderators of the panel were Scott B. Morris (Illinois Institute of Technology) and Eric M. Dunleavy (DCI Consulting Group). Panelist topics included the following:

  • Donald R. Deere (Welch Consulting) discussed options for accounting, in a neutral way, for cases in which multiple applicant records exist for a single candidate.
  • Daniel C. Kuang (Biddle Consulting Group) discussed the use of a composition analysis, via binomial statistics, to measure the difference between the percentage selected and the percentage expected.
  • Fred Oswald (Rice University) discussed alternatives to the impact ratio. Such measures included the odds ratio, phi, the absolute selection rate difference, Cohen’s h, and shortfall.
  • Richard F. Tonowski (U.S. Equal Employment Opportunity Commission) discussed the utility of measuring practical significance, sharing examples of when the addition of practical significance is critical to cases seen by the EEOC.
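As a hypothetical illustration of the alternative measures listed above (made-up applicant and selection counts; the formulas are the standard definitions):

```python
import math

# Hypothetical selection counts for two groups (illustrative only)
maj_applicants, maj_selected = 200, 100   # majority group
min_applicants, min_selected = 100, 35    # minority group

p_maj = maj_selected / maj_applicants     # majority selection rate (0.50)
p_min = min_selected / min_applicants     # minority selection rate (0.35)

# Impact ratio (the four-fifths rule compares this value to 0.80)
impact_ratio = p_min / p_maj

# Absolute selection rate difference
abs_diff = p_maj - p_min

# Odds ratio
odds_ratio = (p_min / (1 - p_min)) / (p_maj / (1 - p_maj))

# Phi coefficient: association in the 2x2 group-by-selection table
a, b = min_selected, min_applicants - min_selected
c, d = maj_selected, maj_applicants - maj_selected
phi = (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Cohen's h: difference between arcsine-transformed selection rates
h = 2 * math.asin(math.sqrt(p_min)) - 2 * math.asin(math.sqrt(p_maj))

# Shortfall: additional minority selections needed to match the majority rate
shortfall = p_maj * min_applicants - min_selected

print(f"impact ratio={impact_ratio:.2f}  diff={abs_diff:.2f}  OR={odds_ratio:.2f}  "
      f"phi={phi:.3f}  h={h:.3f}  shortfall={shortfall:.1f}")
```

Each measure summarizes the same 2x2 table differently, which is why the panelists emphasized pairing statistical significance with measures of practical significance.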

Alternative Session: New Directions: Enhancing Diversity and Inclusion Research and Practice

In a thought-provoking alternative session entitled “New Directions: Enhancing Diversity and Inclusion Research and Practice,” five scholars and practitioners took the stage to discuss the current state of diversity and inclusion research and how to better align research with practice. The following quotation, which was cited twice during the session, really resonated: “Despite a few new bells and whistles, courtesy of big data, companies are basically doubling down on the same diversity and inclusion approaches they’ve used since the 1960s” (Frank Dobbin, Harvard University; Alexandra Kalev, Tel Aviv University). Dr. Alice Eagly, Northwestern University, argued that there is currently a large gap between what research shows (academia) and the generalizations policy makers and practitioners are using. She challenged researchers and practitioners to be better, more honest stewards of diversity and inclusion knowledge so that “fake news”-esque generalizations propagated by advocacy groups, in particular, do not hinder forward progress in this field.

Up next was Julie Nugent, Vice President of Research at Catalyst, who led a discussion on what inclusion and exclusion feels like for employees in organizations. The Catalyst study found that employees feel included when they are valued for their specific contributions (uniqueness) and are welcomed among their peers (belongingness). In contrast, feelings of exclusion in employees arise if they are devalued/dismissed for their unique characteristics, especially their gender, race/ethnicity, nationality, age, religion, and sexual orientation.

Last, Dr. Gabriela Burlacu, SAP SuccessFactors, wrapped up the session by explaining how each step of the employee life cycle and the personnel decisions made therein, from applying to being hired, paid, trained, promoted, and terminated, can be better tracked and managed by technology – technology that when used appropriately can mitigate the threat of unconscious bias.  She spoke of “technology nudges,” such as defaulting bonuses and pay increases to absolute values rather than percentage increases based on base salary.

At DCI, mitigating unconscious bias through the creation and administration of structured personnel systems is something we assist our clients with every day, especially as it relates to EEO risk. We will continue to follow developments related to this type of technology, as well as other forms of “big data,” used to make personnel decisions and keep you posted with our recommendations.

Symposium/Forum: Novel Workplace Diversity Interventions: Field Experiments with Promising Results

In this well-attended session entitled “Novel Workplace Diversity Interventions:  Field Experiments with Promising Results,” five researchers and practitioners presented on the effectiveness of four field experiments in promoting positive diversity-related outcomes and improving diversity management in organizations. Dr. Alex Lindsey’s research focused on diversity interventions such as perspective taking (to produce empathy), goal setting (to increase internal motivation) and reflection (to produce guilt) and their effects on pro-diversity attitudes and behaviors. He found that the reflection intervention was most effective in increasing internal motivation and pro-diversity behaviors, but it also promoted anger and frustration a week later. Dr. Lindsey admitted future research (perhaps into more hybrid reflection and goal-setting activities) might be necessary to reach resistant diversity trainees in organizations.

In another example, Jose David and Carolyn Fotouhi from Merck presented on their company-wide women’s sponsorship program. Jose explained that Merck had recently come out with key healthcare products in oncology, HPV, and insomnia, but their sales were lagging, so the company decided to revamp its operating model. Internal research found that women comprised less than one-third of incumbents in critical roles and only slightly over one-third of incumbents in director-level roles, yet women make 90% of healthcare expenditure decisions and make up more than 50% of the patient base. Thus, Merck developed an advancement-focused women’s sponsorship program that allowed women protégés of all ranks to interface with women in leadership positions through one-on-one virtual sessions and networking circles. They found that women protégés experienced a higher rate of internal movement (9.5% more women in critical roles and 3.5% more in director-level roles) and greater representation in succession planning slates. It will be interesting to see whether increases in female representation in critical and director-level roles translate into increased key product sales. Perhaps only time will tell.

Caught on Video: Best Practices in One-Way Interviewing

The “Caught on Video: Best Practices in One-Way Interviewing” session kicked off with a definition of one-way interviewing and how it differs from traditional two-way interviewing. In a nutshell, one-way interviewing is the practice of using video recording to capture applicants’ responses to interview questions so they can be scored at a later time.

One-way interviewing is still relatively new and not widespread in practice. Therefore, the panelists recommended some best practices based on their experiences, including:

  • Filming actual recruiters (as opposed to actors playing recruiters) asking the interview questions
  • Creating a behavioral indicators checklist for recruiters to quantitatively and systematically rate applicants
  • Developing questions through a rigorous process which may include a job analysis
  • Continually rotating questions to prevent question-sharing among applicants

According to the panelists, initial reactions from applicants have been positive.  For example, they like the flexibility of one-way interviewing.  Recruiters also enjoy the method for its flexibility (some recruiters watch the videos while on the treadmill!) and like that the interviews have a clear scoring rubric.

What is Machine Learning? Foundations and Introduction to Useful Methods

The session entitled “What is Machine Learning? Foundations and Introduction to Useful Methods” was targeted at individuals with a basic to intermediate understanding of machine learning. Supervised vs. unsupervised models of machine learning were discussed. As a best practice, the panelists recommended cross-validation to estimate R² shrinkage. Optimally, the data would be broken into a training, validation, and test set so that the researcher may develop, train, and then test the final model.

There are several important concerns that may impact machine learning models. For example, overfitting occurs when the model predicts the sample data too well and does not generalize. Another concern with machine learning is the bias/variance tradeoff. This was likened to reliability/validity in that high reliability is akin to low variance and high validity is akin to low bias. Finally, the curse of dimensionality refers to the fact that more predictors require a bigger sample. As tempting as it may be to add many predictors, it’s prudent to keep your sample size in mind when building a machine learning model.
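The training/validation/test workflow the panelists described can be sketched as follows (a minimal illustration with synthetic data and plain-Python least squares; any modeling library would follow the same pattern):

```python
import random

random.seed(0)

# Synthetic data: y is a noisy linear function of x
data = []
for _ in range(300):
    x = random.uniform(0, 10)
    data.append((x, 2.0 * x + 1.0 + random.gauss(0, 1.0)))

# Break the data into training (60%), validation (20%), and test (20%) sets
random.shuffle(data)
train, valid, test = data[:180], data[180:240], data[240:]

def fit_ols(points):
    """One-predictor least squares; returns (slope, intercept)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    b = sxy / sxx
    return b, my - b * mx

def r_squared(points, predict):
    """Proportion of variance in y explained by predict(x)."""
    my = sum(y for _, y in points) / len(points)
    ss_res = sum((y - predict(x)) ** 2 for x, y in points)
    ss_tot = sum((y - my) ** 2 for _, y in points)
    return 1 - ss_res / ss_tot

slope, intercept = fit_ols(train)       # develop and train on the training set

def model(x):
    return slope * x + intercept

r2_train = r_squared(train, model)      # in-sample fit
r2_valid = r_squared(valid, model)      # used to compare/tune candidate models
r2_test = r_squared(test, model)        # touched once, for the final estimate

# The drop from r2_train to r2_test estimates the R² shrinkage
print(round(r2_train, 3), round(r2_valid, 3), round(r2_test, 3))
```

Because the test set is only scored once, its R² gives an honest, shrinkage-corrected estimate of how the model will generalize.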

Applicant Reactions During Selection: Overview and Prelude to a Review

With the rise in technology-based selection tools, the panelists in the “Applicant Reactions During Selection: Overview and Prelude to a Review” session claimed that research on applicant reactions (ARs) has not kept pace with the advances in the types of selection tools used in practice. Research results suggest that applicants favor technology-based testing over traditional media (e.g., Potosky & Bobko, 2004). However, when it comes to face-to-face interactions (e.g., interviews), individuals still prefer they be in person without a technology interface (Straus, Miles, & Levesque, 2001).

Another topic centered on why employers should be concerned with ARs. One argument is that because ARs are not directly tied to measurable on-the-job performance for hired employees, it may not matter to employers how applicants perceive the selection process. The panelists, however, argued that ARs do matter. Further, AR research has been limited in that several moderating variables, including selection context variables (e.g., hiring expectations, selection ratio, and job desirability), organizational context variables (e.g., organizational size), and individual-level variables (e.g., personality), have not been examined but may significantly impact ARs in differing ways.

The Pre-conference Master’s Consortium: Advantages & Strategies for Building Your Business Acumen

A pre-conference session on “Advantages & Strategies for Building Your Business Acumen” was presented by Keli Wilson of DCI Consulting Group. The aim of this talk was to help I-O psychologists entering the workforce understand the following three concepts: (1) the bottom line; (2) business development; and (3) the request for proposal process.

Specifically for the bottom-line discussion, information was shared regarding how to communicate and tie organizational outcomes to savings (e.g., minimizing turnover, mitigating risks such as discrimination claims by understanding the mission of EEOC and OFCCP, and leveraging press releases of other companies to demonstrate purpose and savings for embracing I-O practices).

As I-O psychologists advance in their careers and understand the organization they work for or consult with, there may be opportunities to identify potential gaps and communicate opportunities for efficiencies and business growth. A high-level overview of what this process entails was covered during the session (e.g., conduct market research, recognize stakeholders, have a strategic vision, prepare and implement a business plan, market and create buy-in). The guidance provided was to hone presentation skills and learn how to pitch and sell ideas.

Finally, given that I-O psychologists commonly go into consulting careers, there is a need to understand the request for proposal process in order to win projects and partner with clients. An overview of the typical stages of a proposal process was shared with the graduating students (e.g., sales call, proposal, interview, question/answer, sales demo, pricing negotiation, and signed contract). It was discussed that the scope of the project within the proposal should clearly state the identified problem and provide a proposed solution. Additionally, a tutorial on proposal pricing was provided in this session (e.g., pros/cons of hourly, daily, or project-based pricing).

In summary, students graduating with an I-O Master’s degree were given exposure to business aspects that may not be covered in I-O graduate programs.

Industry Differences in Talent Acquisition

This SIOP conference session was led by a panel of speakers from various organizations: Jenna C. Cox, IBM; Amanda Klabzub, IBM; Mary Amundson, Land O Lakes; Jennifer M. Dembowski, The Home Depot; Nicole Ennen, Google; Hailey A. Herleman, IBM; and Lisa Malley, DDI. The focus of the discussion was on the similarities and differences in talent acquisition across industries. The similarity across industries lies in attracting and selecting the talent that supports business initiatives, but differences were noted in the scarcity of talent and the need to grow specific talent (e.g., through educational programs), as well as the geography in which the company operates (e.g., the talent mix in different markets). Also, low unemployment rates may be great for job candidates, but not for retailers, given the resulting reduction in qualified applicant pools. It was shared that manufacturing jobs that pay very well can be difficult to fill due to unappealing schedules, unpredictable work, and often dirty working conditions. Furthermore, it was communicated that agriculture is very hard to staff, particularly at the middle management level. As for the tech industry, it was stated that you may need to source candidates more than in other industries because some of the most qualified and best people for the job already have jobs.

The panelists mentioned that a key differentiator in attracting talent is organizational culture and branding. In addition, the panelists touched on the candidate experience and how they strive to bring down the median days to selection. Some examples of how this was achieved were cutting out layers of approval, using selection tools, and trimming down the number of interviews, so long as the smaller number of interviews was just as predictive. In regards to the candidate experience, companies want people to come back to them, and they want to understand what went well and what can be improved, which is gathered through a candidate experience survey.

Finally, a way to make your company more appealing to candidates seeking employment is to focus on the career site (e.g., highlighting social life at the company and organizational culture). The goal would be to make the career site easy to use and to allow candidates to find the jobs they want to apply for. When it comes to the selection process, it was shared that having a hiring committee can help mitigate individual unconscious bias in the hiring process (e.g., the committee reviews all the materials from the entire process along with feedback/scores from structured interviews).

“That Company is Great!” Best Practices for Improving Candidate Experience

This SIOP conference session was panel style with the following guest speakers: Brittany J. Marcus-Blank, University of Minnesota; Sarah A. Brock, Johnson & Johnson; Pamela Congemi, Medtronic; Jim Matchen, Target Corporation; Marina Pearce, Ford Motor Company; and Amy Powell Yost, Capital One. The topic of discussion was on how to create a positive candidate experience.

Taking action to improve candidate experiences not only helps to secure top talent, but also benefits brand loyalty. The following are some of the practices the panelists shared for improving the candidate experience:

  • setting candidate expectations (i.e., transparency of the selection process and communication of their status at each step);
  • calling each declined candidate (i.e., receive a personal call from talent acquisition);
  • being aware of the physical space in which the interviewees will spend time (e.g., have an identified space where candidates will not be distracted and will, at the same time, be impressed by what they can see of the company);
  • picking the candidate up from the airport;
  • planning a welcome session and/or tour of facility;
  • assigning an onsite coordinator to welcome the candidate being interviewed;
  • providing the candidate with information about the interviewer(s) beforehand;
  • empowering recruiters to use discretion in sending gifts on behalf of the company (e.g., send a small gift to someone who just graduated with an MA degree or who just dropped out of the selection process due to a devastating life event);
  • implementing a candidate reaction survey;
  • monitoring Glassdoor and similar websites to discover feedback; and
  • training employees on good behavior (e.g., actions that will be appreciated by the candidates, such as a recruiter remembering their name).

Physical Abilities Testing: Lessons Learned in Test Development and Validation

This panel, which included DCI’s Emilee Tison, discussed unique challenges associated with physical abilities testing. Panelists identified challenges encountered in the field and shared lessons learned in this area of work. Specifically, the presenters addressed the following topics:

  • Test Development – how developing a physical abilities test is different from other types of selection tests
  • Adverse Impact – typical assumptions of existing sub-group differences and methodologies to reduce adverse impact
  • Validation – strategies typically used for physical abilities tests and how this differs from other types of selection tests
  • Criteria for Validation – typical criteria used for criterion-related validation evidence and challenges faced during these analyses

Panelists cautioned organizations to ensure a full understanding of the physical requirements of the job as well as the types of physical abilities tests available for implementation. Physical abilities testing is not a ‘one-size-fits-all’ process; considering it as such increases risk of a mismatch between the physical requirements of the job and the test being implemented, and increases legal risks.

Master Tutorial: R Shiny Apps in I/O

In this session spearheaded by DCI’s Sam Holland, presenters showcased R’s Shiny package as a tool for sharing and visualizing analytic results. Shiny allows R users with no web-development background to deploy web-ready applications that showcase results. After walking participants through the basic concepts and principles needed to leverage the package, presenters demonstrated how quickly basic R scripts can be transformed into interactive dashboards.

Everything UGESP Forgot to Tell You About Content Validity

This panel, moderated by Emilee Tison, Ph.D. (DCI Consulting Group), discussed the importance, usefulness, and practicality of content-oriented validation, i.e., establishing the extent to which the content of the selection procedure reflects important performance domains of the job. This methodology, however, is often criticized for having limited application, and questions remain as to whether it increases the likelihood of actually predicting job performance.

Each panel member spoke about a different topic and the role played by content-oriented validation methodologies:

  • James Sharf, Ph.D. (Sharf and Associates) provided a history of the development of EEOC’s Uniform Guidelines on Employee Selection Procedures (UGESP). Of particular interest was the use of validity generalization and whether or not it was appropriate to use.
  • Mike Aamodt, Ph.D. (DCI Consulting Group) discussed background checks and the various aspects to address when linking risk areas to specific tasks performed.
  • Damian Stelly, Ph.D. (Flowserve Corporation) addressed using content validation as a potential alternative to criterion validation for personality assessments. The applicability will often depend on the specific context of the situation.
  • Deborah Gebhardt, Ph.D. (HumRRO) discussed a number of important considerations when using content validation for physical ability tests. Some of these include: a reflection of essential job tasks and work behaviors, feasibility of the simulation, using only basic skills and not those learned on the job or in training, safety, ability to standardize test components, using a meaningful scoring metric, and the reliability of test components.

By Amanda Shapiro, Senior Consultant; Brittany Dian, Associate Consultant; Samantha Holland, Consultant; Joanna Colosimo, Director of EEO Compliance; Jana Garman, Senior Consultant; Jeff Henderson, Associate Consultant; Julia Walsh, Consultant; Keli Wilson, Senior Manager of EEO Compliance, D&I; Cliff Haimann, Consultant; Bryce Hansell, Associate Consultant; and Emilee Tison, Associate Principal Consultant, at DCI Consulting Group


Between 2005 and 2007, OFCCP opened several compliance reviews of Pilgrim’s Pride Corp. establishments. In just the last month, three DOL administrative complaints stemming from audits of that period were dismissed. The dismissals resulted from the bankruptcy Pilgrim’s Pride filed in 2008.

In December 2008, Pilgrim’s Pride declared bankruptcy and proceeded through the process, effecting a bankruptcy plan in December 2009. As part of the plan, the Court made notice of Pilgrim’s Pride’s bankruptcy available to all relevant parties, requesting that all claims be submitted by June 2009. In May 2009, OFCCP filed a claim alleging discriminatory practices at factories in Mount Pleasant and Lufkin, TX. Pilgrim’s Pride then filed a Claims Objection Procedures Motion in response to OFCCP’s discrimination claims, to which OFCCP did not respond. As a result, the Court granted Pilgrim’s Pride’s objection, disallowing OFCCP’s claim. No claims were submitted for the other Pilgrim’s Pride establishments under review by OFCCP.

On September 15, 2015, one day after the Pilgrim’s Pride bankruptcy plan ended, the DOL filed the first of three administrative complaints with the Office of Administrative Law Judges for the Athens, AL compliance review. The next two complaints were filed for Marshville, NC, and Mount Pleasant, TX, in October 2015 and May 2016 respectively. A press release was also issued for the Mount Pleasant review in May 2016.

Because the claim for the Mount Pleasant review was submitted during the bankruptcy proceedings but ultimately disallowed, the corresponding administrative complaint could not move forward. Further, since OFCCP failed to file claims for the Athens and Marshville reviews before the June 2009 notice deadline, the Court dismissed those complaints as well. OFCCP maintained that it was not properly notified of the bankruptcy case and thus could not submit timely claims in 2009. The Court found this implausible, given that OFCCP’s Southwest and Rocky Mountain Regional Office had filed claims in May 2009 in accordance with the notice. Because OFCCP is a single agency, the Court reasoned that if one region had sufficient notice, all regions did.

For more information on the full proceedings, please reference this document. This particular case serves as a helpful reminder to consult legal counsel for guidance during a bankruptcy period if there are open reviews covering a time period prior to bankruptcy.

By Amanda Shapiro, Senior Consultant, and Rachel Monroe, HR Analyst, at DCI Consulting Group



As noted in a recent blog, 800 Corporate Scheduling Announcement Letters (CSALs, or courtesy letters) were sent to contractor establishments on February 17, 2017. CSALs give contractors advance notice that one (or more) of their establishments appears on the list generated by the Federal Contractor Selection System (FCSS). DCI has noted a trend in the recent batch of CSALs: some are addressed to locations with open or recently closed OFCCP audits. In some cases, these locations also received CSALs during the last wave, sent in 2014. This may not be representative of all letters, but it is unclear why establishments with open or recently closed audits would receive a CSAL.

As a reminder, if a scheduling letter is received for an establishment with an active audit, or with an audit that closed within two years of the new scheduling letter’s date, you should reach out to OFCCP to administratively close the new audit. A CSAL does not initiate an audit, so there is nothing to administratively close or dispute. Keep in mind, however, that if enough time passes and a scheduling letter arrives two years or more after the close of the previous audit, the audit can commence regardless of when the CSAL was received. If you have received a 2017 CSAL, we recommend reaching out to your consultant and/or counsel to discuss.

By Amanda Shapiro, Senior Consultant, and Rachel Monroe, HR Analyst at DCI Consulting Group




It seems that OFCCP and the Department of Labor (DOL) have been active in early 2017. So far in January we have seen a number of press-release-worthy settlements and lawsuits. This week, in addition to the lawsuit against JPMorgan Chase, DOL released information on its lawsuit against Oracle America, Inc. (Oracle).

Oracle is a technology company offering integrated cloud application and platform services and, due to its contracts with the federal government, is a federal contractor subject to routine compliance evaluations from OFCCP. OFCCP and DOL allege that Oracle discriminated against female, Black, and Asian employees in pay. A second allegation focuses on hiring: the suit claims that Oracle discriminated against non-Asian applicants. The complaint also describes a refusal to turn over requested records relating to prior pay and applicant data.

Both the allegations and analyses in this complaint are fascinating. We’ve written many times about aggregation issues on this blog, and this complaint is rife with examples of aggregation gone bad. A few highlights from DOL’s press release and the complaint are summarized below:

  • The press release suggests that pay was analyzed by job title; however, the complaint itself makes clear that pay analysis groups (PAGs) were constructed by line of business. The allegations center on three lines of business comprising 80 different job titles. Unfortunately, these large PAGs have become the norm under Directive 307. Although job title is controlled for in the analysis, this method has constraints, and it does not necessarily mean that employees are similarly situated.
  • Similar to pay, the applicant analyses are conducted at an aggregate level, seemingly by job group and line of business, together covering 69 job titles. These groups are listed as the Professional Technical 1, Individual Contributor (PT1) job group and the Product Development line of business (or job function). Given this mixed bag of job group and function, it is unclear whether these are legitimate groups from which Oracle hires or aggregations created by OFCCP.
  • Further, data may have been aggregated across plan years. The complaint bases the hiring allegations only on data from the first six months of the plan year (January 1, 2013 through June 30, 2013); however, an 18-month time frame is more probable given the data OFCCP had access to. If so, given the size of the location (approximately 7,000 employees) and the types of jobs in the groups listed, it would not be unreasonable to expect thousands of applicants in the analysis pools.
  • Non-Asian is not a protected class, but rather an arbitrary aggregation of all races and ethnicities that are not Asian. This grouping is inappropriate and has previously been rejected by the courts; see our previous blog on OFCCP v. VF Jeanswear Limited.
  • Although not specifically tied to data aggregation, it is interesting to note that OFCCP also documented comparisons to national labor force data and deemed Asians to be overrepresented in the applicant pool. This was used as evidence of Oracle’s preference for Asians and its targeted recruitment of that group.
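The aggregation concerns above can be illustrated with a toy example: two job titles, each showing identical selection rates for two groups, can produce a large apparent disparity once pooled into a single analysis group (a version of Simpson’s paradox). The numbers, group labels, and the `selection_rate` helper below are entirely hypothetical.

```python
def selection_rate(hires, applicants):
    """Fraction of applicants who were hired."""
    return hires / applicants

# Hypothetical applicant-flow data: (applicants, hires) per group, per title.
# Within each title, both groups are selected at exactly the same rate.
data = {
    "Title A": {"group_x": (100, 60), "group_y": (20, 12)},
    "Title B": {"group_x": (20, 2),   "group_y": (100, 10)},
}

for title, groups in data.items():
    rx = selection_rate(groups["group_x"][1], groups["group_x"][0])
    ry = selection_rate(groups["group_y"][1], groups["group_y"][0])
    print(f"{title}: X = {rx:.0%}, Y = {ry:.0%}")  # identical within each title

# Pool both titles into one analysis group, as a large PAG would.
hires_x = sum(g["group_x"][1] for g in data.values())
apps_x = sum(g["group_x"][0] for g in data.values())
hires_y = sum(g["group_y"][1] for g in data.values())
apps_y = sum(g["group_y"][0] for g in data.values())

pooled_ratio = selection_rate(hires_y, apps_y) / selection_rate(hires_x, apps_x)
print(f"Pooled impact ratio (Y vs. X): {pooled_ratio:.2f}")  # well below the 4/5ths (0.80) benchmark
```

Each title on its own shows perfect parity; the pooled impact ratio fails the 4/5ths benchmark only because the two groups apply to the two titles in different proportions, which is exactly why large, mixed PAGs can mislead.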

By Amanda Shapiro, Senior Consultant, DCI Consulting Group


Did you receive a scheduling letter this summer? If so, you’re not alone. DCI has seen a wave of OFCCP scheduling letters received by establishments in June and July. Many of the audit letters were for establishments on the Corporate Scheduling Announcement Letter (CSAL) list, with many having received courtesy letters more than two years ago. Additionally, a handful of establishments received a scheduling letter without being listed on a CSAL.

The majority of audit letters received by DCI clients have been in the Northeast and Midwest regions. Interestingly, in many cases in both regions, the District Office conducting the audit is not the closest or most logical office based on proximity, though it is still within the appropriate region (e.g., the Indianapolis District Office auditing a location near Minneapolis and Milwaukee). Moreover, we have seen multiple audit letters for locations of the same organization within the same region arrive at the same time. This is not terribly uncommon given workload differences among offices within a region, but it is still an interesting trend for contractors to note.

As a reminder, it is always good practice to remind locations to be on the lookout for an official scheduling letter from OFCCP, as contractors have 30 days to respond with a desk audit submission.

By Joanna Colosimo, Associate Principal Consultant, and Amanda Shapiro, Senior Consultant at DCI Consulting Group 


DCI has recently seen an increase in audit “pre-requests,” in which a compliance officer asks a contractor to submit items beyond those found in the Itemized Listing. The request is made by OFCCP before receiving the contractor’s desk audit submission and asks for items that often appear in an early post-submission request, such as job listing evidence, a copy of VETS reporting, a copy of the employee handbook, etc. It is unclear whether these requests are consistently made of all contractors audited through particular district offices; however, DCI has seen some consistency in the items requested.

There are some concerning aspects to these “pre-requests,” including the following points:

  • Although these items are typically requested post-desk audit submission, there is no Office of Management and Budget (OMB)-approved form for requesting them of all contractors. When requested before receipt and review of a contractor’s submission, these items amount to an indiscriminate solicitation resembling an information collection request (ICR), which would require OMB approval.
  • Relatedly, these requests are typically presented as though they are required as part of the desk audit submission, and many contractors may not understand that they can (and often should) request extra time to respond or an explanation of why certain items are being requested.
  • Although these requests include some helpful tips, some of the guidance provided by the compliance officer is incorrect or confusing. For example, we have seen guidance noting that hiring numbers with 1:1 (or similarly small) selection-to-pool ratios are not to be included in the submission. This implies that a 1:1 or small hiring ratio is always incorrect or a violation of the regulations, yet we know that in some cases this scenario occurs for legitimate reasons; see this blog for additional exploration of this topic.

Because these “pre-requests” appear to be increasing in frequency, contractors that receive such a request for an establishment are advised to begin by engaging with the compliance officer. Outright denying access to requested materials is not recommended; however, DCI does recommend acknowledging receipt. Contractors should work with the compliance officer to determine a different submission date and/or politely request that OFCCP first review the desk audit package once delivered, before requesting items that are not part of the OMB-approved scheduling letter and itemized listing.

By Jana Garman, Consultant, and Amanda Shapiro, Senior Consultant at DCI Consulting Group


In OFCCP v. Bank of America (BOA), the Administrative Review Board (ARB) rejected OFCCP’s allegation that BOA discriminated against Black applicants in 2002-2005. As described in another blog in this series, the basis for the ARB’s reversal on the 2002-2005 claims was the inadequacy of evidence built on data aggregation. Specifically, the ARB stated that:

“The OFCCP’s evidence of discrimination in 2002-2005 boils down to one standard deviation of 4.0 (or 4.1) for the four-year period, but no standard deviation conclusions year by year.”

Similar to the ruling in the Lopez v. City of Lawrence case, where aggregation of separate city applicant pools was rejected, the ARB discredited the aggregation of data across several years. Without year-by-year evidence, the aggregation of four years of data may be arbitrary and inappropriate. As noted by Jacobs, Murphy, and Silva (2012), there are unintended consequences of analyzing large databases, leading them to coin the phrase “Being Big is Worse than Being Bad.” The phrase refers to the increased chance of a statistical finding in a large dataset simply due to statistical power.
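The power issue behind “Being Big is Worse than Being Bad” can be sketched with a toy calculation (all figures below are invented for illustration, not taken from the case). Each year shows the same modest selection-rate gap, 10% versus 8%; the pooled two-sample z statistic for any single year stays well under two standard deviations, but pooling four identical years pushes it past that threshold purely because the sample quadruples.

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-sample z statistic for a difference in selection rates."""
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (x1 / n1 - x2 / n2) / se

# One year: 500 applicants per group, selected at 10% vs. 8%.
yearly_z = two_prop_z(50, 500, 40, 500)

# Four identical years pooled: same rates, four times the sample.
pooled_z = two_prop_z(200, 2000, 160, 2000)

print(f"single-year z: {yearly_z:.2f}")   # below the conventional 2-SD threshold
print(f"pooled 4-year z: {pooled_z:.2f}") # crosses 2 standard deviations
```

Because the rates are identical in every year, the pooled z is exactly double the yearly z (the standard error shrinks by a factor of √4). The “finding” exists only in the aggregate, which is precisely the pattern the ARB found unpersuasive.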

The BOA conclusion is noteworthy because it provides further guidance on conducting adverse impact analyses across multiple time periods, and supports the idea of pushing back against the arbitrary, multi-year data aggregation that may be used in analyses performed by the agency. Overall, employers need to be cautious when aggregating data and should consider it only under certain parameters. For additional background and corroboration on data aggregation, we recommend reviewing the Technical Advisory Committee Report on Best Practices in Adverse Impact Analyses.

Unrelated to the data aggregation issue, there have been other new developments in the BOA case. Stay tuned as the 23-year saga continues.

By Amanda Shapiro, Senior Consultant, and Vinaya Sakpal, HR Analyst, DCI Consulting Group 


On March 15, 2016, The Wall Street Journal released an article focusing on the results of pay discrimination investigations carried out by the Office of Federal Contract Compliance Programs (OFCCP) through its standard compliance review process. The author’s inquiry was informed by settlement outcomes, OFCCP activity measures, and interviews with experts in the field, including David Cohen of DCI. The article suggests that the agency’s results have been inadequate given the importance of its charge to address the wage gap.

The article reports that, after seven years under Director Patricia Shiu, OFCCP has had few results from pay discrimination investigations. Between the start of 2010 and September 2015, the total relief collected by OFCCP equaled about $5 million. Compared to a single $5.5 million settlement collected under the Bush administration in 2004, pay discrimination settlements have notably decreased. The downward trend is also evident in that only 8 compliance reviews resulted in a pay discrimination settlement during fiscal year 2015, the lowest annual count of pay cases thus far under this administration.

In defense of OFCCP’s results, Director Shiu said that the agency has made considerable changes requiring extensive training of personnel. She noted that the increased scope of audits has led to a lengthier review of each audited contractor, which partially resulted from the rescission of the 2006 Standards and Guidelines and the implementation of a new Directive. She also noted that the agency had a workforce overhaul, adding statisticians and compliance officers. Although not noted in the article, contractors may also recall that OFCCP rescinded the “Active Case Management” protocol, replacing it with “Active Case Enforcement,” and implemented several new Executive Orders and revised regulations, all of which likely contributed to the extensive training mentioned.

OFCCP’s budget has increased for several years, reaching a current annual budget of $105 million, up from about $82 million in 2009. In spite of increased resources, OFCCP’s results have been underwhelming, which may call into question the methods used to investigate pay discrimination. Director Shiu stated that systemic compensation cases take time to develop and that the agency has many cases in the pipeline. It will be interesting to see what the remainder of the fiscal year brings.

By Jana Garman, Consultant, and Amanda Shapiro, Senior Consultant at DCI Consulting Group 


The Scheduling Letter and Itemized Listing, reviewed and approved by the Office of Management and Budget (OMB), set the framework for what data are required to submit in a compliance review.  Specifically, item 18 in the Itemized Listing requires contractors to supply “data on your employment activity (applicants, hires, promotions, and terminations) for the immediately preceding AAP year”.  The submission of update data is only required if the contractor is 6 months or more into the current AAP year upon receipt of the Scheduling Letter.

In some recent audits, OFCCP has been automatically requesting data going back two years from the Scheduling Letter date, which would include not only data from the AAP period under review, but also data from before and after that period. This type of time period expansion may or may not be in accordance with regulatory guidance.

There are situations where OFCCP may reasonably request data outside the AAP timeframe. For example, it is acceptable for OFCCP to seek records for a two-year period in accordance with general record-keeping obligations. The Federal Contract Compliance Manual also notes that special circumstances, such as the appearance of potential discrimination, may warrant a review period beyond the Scheduling Letter date. OFCCP’s position in Frito Lay v. OFCCP was that such requests can be made under a continuing violation theory, assuming a violation was identified. Unfortunately, no formal ruling was made in that case, so there is no final decision on whether OFCCP’s position is appropriate. However, it is important to be aware of when such a request may not be appropriate.

Here are some tips for responding to a request for data outside the elected AAP year or beyond the Scheduling Letter date:

  • If there are no statistical indicators at the job title/group level (i.e., no apparent violations), then this may warrant a discussion with OFCCP. A simultaneous action would be to ask OFCCP to reevaluate the original data and confirm results.
  • If there are statistical indicators (i.e., potential violations), ask OFCCP to identify the specific job titles/groups with a statistical indicator and limit the data request to those titles/groups within the AAP temporal scope.
  • Should OFCCP persist with a two-year request, contractors can point to the regulations at §60-2.1(c), which state that AAPs must be updated annually. If OFCCP does not identify a special circumstance, expecting the contractor to update personnel activity on an off-cycle basis to coincide with the Scheduling Letter date would exceed the annual-update provision of the regulations and the burden-hour estimates approved by OMB. It would also devalue the requirement to annually develop AAPs and identify problem areas.
      • To illustrate how providing additional information may present an issue, imagine that you had a January 1, 2015 AAP and received a scheduling letter on October 12, 2015. For the desk audit you submit your annual January 1, 2015 AAP and a July 1, 2015 update AAP. In follow-up requests, OFCCP asks for data for three applicant job groups from October 12, 2013 through October 11, 2015 (i.e., two years back from the date of the scheduling letter). Although you saw no statistical indicators for those job groups when you proactively analyzed them, after you submit the two years of data, OFCCP indicates that indicators appear in the combined data. Because some of these data fall outside the temporal scope of your proactive AAPs, the indicator could not have been identified proactively.

In the past, requests for data outside the temporal scope of the audit have been rare; however, as noted above, DCI has seen this request several times in recent audits.

By Keli Wilson, Principal Consultant and Amanda Shapiro, Senior Consultant at DCI Consulting Group 


Given the high prevalence of data security breaches and identity theft today, organizational leadership should be making the protection of their consumers’ and their employees’ confidential information a high priority. It is no surprise that federal contractors who are subject to OFCCP compliance reviews are employing more purposeful data protection strategies to ensure that their sensitive employee information remains secure when it is sent outside of the organization.

DCI recommends, at minimum, the following considerations when releasing data offsite:

  • Remove employee names and include an employee or generic ID instead. Do not include social security numbers.
  • Include a confidentiality disclaimer (e.g., “information not subject to FOIA”) when submitting information. Although items you provide are not covered by attorney-client privilege or attorney work-product protections, you should still mark them as confidential.
  • Password-protect reports and zip files. Encrypt data files (e.g., Excel spreadsheets) and sensitive reports (e.g., background checks, criminal history checks).
  • If sending by mail, use an encrypted media device (e.g., flash drive) and request tracking information. While submitting information to OFCCP via encrypted flash drive may be one of the safer practices mentioned here, these small devices can still be lost or stolen along the way.
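As a minimal sketch of the first recommendation above, the snippet below replaces names with a generic employee ID and drops social security numbers before a dataset leaves the organization. The field names, sample records, and `anonymize` helper are all hypothetical; adapt them to your own HRIS export.

```python
def anonymize(rows, drop_fields=("name", "ssn")):
    """Return copies of the records with identifying fields removed
    and a generic, sequential employee ID substituted."""
    cleaned = []
    for i, row in enumerate(rows, start=1):
        record = {k: v for k, v in row.items() if k.lower() not in drop_fields}
        record["employee_id"] = f"EMP{i:05d}"
        cleaned.append(record)
    return cleaned

# Obviously fake sample records for illustration.
raw = [
    {"name": "Jane Doe", "ssn": "000-00-0001", "job_group": "1A", "salary": "85000"},
    {"name": "John Roe", "ssn": "000-00-0002", "job_group": "2B", "salary": "62000"},
]

for record in anonymize(raw):
    print(record)  # no names or SSNs, only the generic ID and analysis fields
```

Keep the key that maps generic IDs back to real employees inside the organization; the outgoing file should contain only the fields needed for the analysis.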

We recommend that you hold an internal discussion with your EEO and legal experts to determine which option(s) makes the most sense for your organization and to ensure all employees responsible for these communications receive the appropriate training.

By Jeff Henderson, Associate Consultant and Amanda Shapiro, Senior Consultant



Really, I Come Here for the Food: Sex as a BFOQ for Restaurant Servers

Michael Aamodt, Principal Consultant at DCI Consulting Group, wrote an article featured in SIOP’s TIP publication, January 2017.

Recent Blog Posts

Fiscal Year 2018 Budget Proposes Merger of OFCCP and EEOC

The Department of Labor’s Fiscal Year 2018 (FY2018) budget proposal was released today, May 23, 2017.  The budget outlines the initiatives and priorities of the new administration, and as predicted by DCI, recommends merging the Office of Federal Contract Compliance Programs (OFCCP) and Equal Employment Opportunity Commission (EEOC) by the end of FY2018.

The proposed budget indicates that the consolidation will provide efficiencies and oversight.  Additionally, the proposed budget allots $88 million for OFCCP, a decrease of $17.3 million from Fiscal Year 2017.  The main cut to the budget appears to be headcount, with a proposed 440 full-time equivalent (FTE) headcount, a reduction from 571 FTEs.  Some other interesting items that have

Read More