
Bias & Fairness

Configure fairness mode, audit AI decisions, and ensure equitable hiring.


Workestra's recruiting tools include features to reduce bias and ensure fair evaluation of all candidates.

[Screenshot needed: annotated image of the fairness settings UI]

Fairness Mode

What Is Fairness Mode?

Fairness Mode redacts potentially biasing information from AI screening:

Redacted Field        | Why
Name                  | Prevents gender/ethnic assumptions
Age indicators        | Avoids age discrimination
Location              | Reduces location bias
Education institution | Avoids prestige bias
Photos                | Eliminates appearance bias

How It Works

With Fairness Mode enabled:

  1. Candidate applies with full resume
  2. System creates redacted copy
  3. AI scores the redacted version
  4. Scores based on skills and experience only
  5. Full resume available to humans
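
The redact-then-score flow above can be sketched in a few lines. The field names and resume structure below are illustrative assumptions, not Workestra's actual data model:

```python
# Illustrative sketch of the redact-then-score flow (not Workestra's actual code).
REDACTED_FIELDS = {"name", "age", "location", "education_institution", "photo"}

def make_redacted_copy(resume: dict) -> dict:
    """Return a copy with biasing fields masked; the original is untouched."""
    return {
        field: "[REDACTED]" if field in REDACTED_FIELDS else value
        for field, value in resume.items()
    }

resume = {
    "name": "Jane Doe",
    "age": 34,
    "location": "Austin, TX",
    "skills": ["Python", "SQL"],
    "experience_years": 8,
}

redacted = make_redacted_copy(resume)
# The AI scores `redacted`; recruiters still see `resume` in full.
print(redacted["name"])    # [REDACTED]
print(redacted["skills"])  # ['Python', 'SQL'] -- skills still drive the score
```

The key property is that redaction produces a copy: the scoring step never sees the masked fields, while the human-facing record is unchanged.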

Fairness Mode affects AI scoring only. Recruiters can still see full candidate information.

Configurable Redacted Fields

Choose which fields to redact:

Standard Configuration

Default redacted fields:

  • Name
  • Age/Date of birth
  • Home address
  • Education institution names
  • Photo
  • Email address
  • Phone number
  • Previous employer names

Custom Configuration

Adjust in Recruiting Settings > Fairness:

  1. Toggle fields on/off
  2. Save configuration
  3. Applies to future screenings
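
Conceptually, the per-field toggles amount to a configuration map like the one below. The field names are illustrative; the real setting lives in the Recruiting Settings > Fairness UI:

```python
# Illustrative fairness configuration: True = redact the field for AI screening.
fairness_config = {
    "name": True,
    "age": True,
    "home_address": True,
    "education_institution": True,
    "photo": True,
    "email": True,
    "phone": True,
    "previous_employers": False,  # left visible here for industry context
}

# The set of fields the screener will mask on future screenings.
redacted_fields = {field for field, redact in fairness_config.items() if redact}
print(sorted(redacted_fields))
```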

Some fields may be needed for context; for example, previous employer names can signal relevant industry experience.

Audit Logging

What Gets Logged

The system records:

Event             | Details
Screening start   | When AI begins analysis
Fields redacted   | Which fields were hidden
Score calculation | How the score was derived
Human override    | When humans change AI scores
Final decision    | Hire/reject with reasoning

Viewing Audit Logs

Access at Recruiting > Settings > Audit Log:

  1. Filter by date range
  2. Filter by job or candidate
  3. Filter by event type
  4. Export for compliance review
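
The same filters can be applied to an exported log outside the UI. The event shape below is an assumption for illustration, not Workestra's export schema:

```python
from datetime import date

# Illustrative audit events (field names are assumptions, not Workestra's schema).
events = [
    {"date": date(2024, 3, 15), "type": "screening_start", "candidate": "C-101"},
    {"date": date(2024, 3, 15), "type": "human_override",  "candidate": "C-101"},
    {"date": date(2024, 4, 2),  "type": "final_decision",  "candidate": "C-202"},
]

def filter_events(events, start, end, event_type=None):
    """Filter by inclusive date range and, optionally, by event type."""
    return [
        e for e in events
        if start <= e["date"] <= end
        and (event_type is None or e["type"] == event_type)
    ]

march = filter_events(events, date(2024, 3, 1), date(2024, 3, 31))
overrides = filter_events(events, date(2024, 1, 1), date(2024, 12, 31),
                          event_type="human_override")
```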

Compliance Reports

Generate reports showing:

  • Screening volume by demographic (if collected)
  • Score distribution
  • Pass/fail rates
  • Human override frequency
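
Pass/fail rates and override frequency reduce to simple ratios over screening outcomes. A sketch, assuming outcomes are exported as records (the field names are illustrative):

```python
# Sketch: compute pass rate and human-override frequency from exported outcomes.
outcomes = [
    {"source": "referral",  "passed": True,  "overridden": False},
    {"source": "referral",  "passed": False, "overridden": True},
    {"source": "job_board", "passed": True,  "overridden": False},
    {"source": "job_board", "passed": True,  "overridden": False},
]

pass_rate = sum(o["passed"] for o in outcomes) / len(outcomes)
override_rate = sum(o["overridden"] for o in outcomes) / len(outcomes)
print(f"pass rate: {pass_rate:.0%}, override rate: {override_rate:.0%}")
```

A persistently high override rate is itself a signal worth investigating: it can mean the AI scores and human judgment are diverging systematically.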

Bias Report Badge

Candidate View

On each candidate, a Bias Report badge shows:

  • Whether Fairness Mode was applied
  • Which fields were redacted
  • If screening was audited
  • Any flags or concerns

Report Contents

Click the badge to view:

Fairness Report for Jane Doe
================================
Fairness Mode: Enabled
Redacted Fields: Name, Age, Location
Screening Date: 2024-03-15
AI Score: 87/100

Score Breakdown:
- Skills Match: 35/40
- Experience: 28/30
- Education: 12/15
- Cultural Fit: 12/15

Redacted Information:
- Name: [REDACTED]
- Age: [REDACTED]
- Location: [REDACTED]

Audit ID: AUD-2024-0315-8847

Enabling Fairness Mode

Global Setting

Enable for all jobs:

  1. Go to Recruiting Settings > Fairness
  2. Toggle Enable Fairness Mode by Default
  3. Configure redacted fields
  4. Save

Per-Job Setting

Enable for specific jobs:

  1. Open job details
  2. Go to AI Agent tab
  3. Toggle Enable Fairness Mode
  4. Override global settings if needed
  5. Save

Best Practices for Fair Hiring

Structured Evaluation

Use consistent criteria:

  • Same questions for all candidates
  • Standardized scoring rubrics
  • Multiple interviewers
  • Documented decisions

Diverse Hiring Panels

Include diverse perspectives:

  • Mixed gender panels
  • Cross-functional interviewers
  • Interviewers trained in bias awareness
  • Regular calibration sessions

Regular Audits

Review for patterns:

  • Monthly pass/fail rates by source
  • Quarterly demographic analysis
  • Annual bias audit
  • External review (annually)

Training

Educate your team:

  • Unconscious bias training
  • Structured interview techniques
  • Legal compliance updates
  • Tool usage (Fairness Mode, etc.)

Equal Employment Opportunity

Ensure compliance with:

  • EEOC (US) — Equal Employment Opportunity Commission
  • GDPR (EU) — Data protection and automated decision-making
  • Local laws — Jurisdiction-specific requirements

Documentation

Maintain records of:

  • Why candidates were rejected
  • How AI was used in screening
  • What bias mitigation was applied
  • Training completion records

Candidate Rights

Inform candidates of:

  • Use of AI in screening
  • Right to human review
  • Data retention policies
  • How to request deletion

Measuring Fairness

Metrics to Track

Metric               | Target              | Measurement
Screening parity     | Similar pass rates  | By source, not demographics
Interview conversion | Consistent ratios   | Application → interview
Offer acceptance     | No significant gaps | By candidate segment
Time in stage        | Equal treatment     | Days per pipeline stage

Disparate Impact Analysis

Watch for warning signs:

  • Significant differences in pass rates
  • Certain groups consistently scoring lower
  • Human overrides clustered by demographic
  • Feedback mentioning non-job factors
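
One common screen for "significant differences in pass rates" is the four-fifths rule from US EEOC guidance: flag any group whose selection rate falls below 80% of the highest group's rate. A minimal sketch, using candidate sources as the grouping (consistent with measuring parity by source rather than demographics):

```python
# Four-fifths rule: flag groups whose selection rate is < 80% of the top rate.
def disparate_impact_flags(pass_rates: dict, threshold: float = 0.8) -> list:
    top = max(pass_rates.values())
    return [group for group, rate in pass_rates.items() if rate / top < threshold]

rates_by_source = {"referral": 0.50, "job_board": 0.45, "agency": 0.30}
print(disparate_impact_flags(rates_by_source))  # agency: 0.30 / 0.50 = 0.6 < 0.8
```

A flag from this check is a prompt for the corrective-action steps below, not proof of bias on its own; small sample sizes in particular can produce spurious differences.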

Corrective Action

If bias is detected:

  1. Pause the affected process
  2. Review recent decisions
  3. Retrain AI if needed
  4. Adjust redacted fields
  5. Document changes made

Transparency

Candidate Communication

Be open about AI use:

"We use AI tools to help evaluate applications fairly. 
All candidates are assessed using the same criteria. 
You have the right to request human review of any automated decision."

Internal Communication

Train hiring managers:

  • How Fairness Mode works
  • When to override AI scores
  • How to document decisions
  • Escalation procedures

Next Steps