Bias & Fairness
Configure fairness mode, audit AI decisions, and ensure equitable hiring.
Workestra's recruiting tools include features to reduce bias and ensure fair evaluation of all candidates.
Fairness Mode
What Is Fairness Mode?
Fairness Mode redacts potentially biasing information from AI screening:
| Redacted Field | Why |
|---|---|
| Name | Prevents gender/ethnic assumptions |
| Age indicators | Avoids age discrimination |
| Location | Reduces location bias |
| Education institution | Avoids prestige bias |
| Photos | Eliminates appearance bias |
How It Works
With Fairness Mode enabled:
- Candidate applies with full resume
- System creates redacted copy
- AI scores the redacted version
- Scores based on skills and experience only
- Full resume available to humans
Fairness Mode affects AI scoring only. Recruiters can still see full candidate information.
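The redaction step above can be sketched as a simple transform. This is an illustrative model only: the field names and the `[REDACTED]` placeholder are assumptions for the example, not Workestra's actual data schema.

```python
# Hypothetical sketch of Fairness Mode redaction: the AI scores the
# redacted copy, while the original record stays available to humans.
REDACTED_FIELDS = {"name", "age", "location", "education_institution", "photo"}

def redact(candidate: dict) -> dict:
    """Return a redacted copy; skill and experience fields pass through."""
    return {
        field: "[REDACTED]" if field in REDACTED_FIELDS else value
        for field, value in candidate.items()
    }

candidate = {
    "name": "Jane Doe",
    "age": 34,
    "location": "Austin, TX",
    "skills": ["Python", "SQL"],
    "experience_years": 8,
}
redacted = redact(candidate)   # what the AI sees
# candidate (the full record) is what recruiters see
```

Note that `redact` returns a copy rather than mutating the original, matching the behavior described above: the full resume remains intact for human reviewers.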
Configurable Redacted Fields
Choose which fields to redact:
Standard Configuration
Default redacted fields:
- Name
- Age/Date of birth
- Home address
- Education institution names
- Photo
- Email address
- Phone number
- Previous employer names
Custom Configuration
Adjust in Recruiting Settings > Fairness:
- Toggle fields on/off
- Save configuration
- Applies to future screenings
Some fields may be worth keeping visible for context. Previous employer names, for example, can be relevant to industry experience.
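A custom configuration can be thought of as the default set with individual fields toggled. The field names and the `custom_config` helper below are invented for illustration; the real settings live in Recruiting Settings > Fairness, not in code.

```python
# Hypothetical model of the default redaction set and per-field toggles.
DEFAULT_REDACTED_FIELDS = {
    "name", "date_of_birth", "home_address",
    "education_institution", "photo",
    "email", "phone", "previous_employers",
}

def custom_config(enable=(), disable=()):
    """Start from the defaults, then toggle individual fields on or off."""
    return (DEFAULT_REDACTED_FIELDS | set(enable)) - set(disable)

# Keep previous employer names visible when industry context matters:
fields = custom_config(disable=["previous_employers"])
```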
Audit Logging
What Gets Logged
The system records:
| Event | Details |
|---|---|
| Screening start | When AI begins analysis |
| Fields redacted | Which were hidden |
| Score calculation | How score was derived |
| Human override | When humans change AI scores |
| Final decision | Hire/reject with reasoning |
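An individual audit event might look like the record below. The field names and JSON shape are assumptions based on the event types in the table, not Workestra's actual export format.

```python
import json
from datetime import datetime, timezone

# Illustrative shape of one "screening start" audit event.
event = {
    "event": "screening_start",
    "candidate_id": "CAND-1042",
    "timestamp": datetime(2024, 3, 15, tzinfo=timezone.utc).isoformat(),
    "fields_redacted": ["name", "age", "location"],
}

# Serialized form, as it might appear in a compliance export.
record = json.dumps(event, sort_keys=True)
```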
Viewing Audit Logs
Access at Recruiting > Settings > Audit Log:
- Filter by date range
- Filter by job or candidate
- Filter by event type
- Export for compliance review
Compliance Reports
Generate reports showing:
- Screening volume by demographic (if collected)
- Score distribution
- Pass/fail rates
- Human override frequency
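A pass-rate summary like the one a compliance report would show can be computed from exported screening records. The records and field names here are invented for illustration.

```python
from collections import Counter

# Sketch: aggregate pass rates by candidate source from exported rows.
screenings = [
    {"source": "referral", "passed": True},
    {"source": "referral", "passed": False},
    {"source": "job_board", "passed": True},
    {"source": "job_board", "passed": True},
]

def pass_rates_by_source(rows):
    totals, passes = Counter(), Counter()
    for row in rows:
        totals[row["source"]] += 1
        passes[row["source"]] += row["passed"]  # True counts as 1
    return {src: passes[src] / totals[src] for src in totals}

rates = pass_rates_by_source(screenings)
```

The same aggregation pattern works for override frequency or score distributions; only the grouped field changes.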
Bias Report Badge
Candidate View
Each candidate record displays a Bias Report badge showing:
- Whether Fairness Mode was applied
- Which fields were redacted
- If screening was audited
- Any flags or concerns
Report Contents
Click the badge to view:
```
Fairness Report for Jane Doe
================================
Fairness Mode: Enabled
Redacted Fields: Name, Age, Location
Screening Date: 2024-03-15
AI Score: 87/100

Score Breakdown:
- Skills Match: 35/40
- Experience: 28/30
- Education: 12/15
- Cultural Fit: 12/15

Redacted Information:
- Name: [REDACTED]
- Age: [REDACTED]
- Location: [REDACTED]

Audit ID: AUD-2024-0315-8847
```
Enabling Fairness Mode
Global Setting
Enable for all jobs:
- Go to Recruiting Settings > Fairness
- Toggle Enable Fairness Mode by Default
- Configure redacted fields
- Save
Per-Job Setting
Enable for specific jobs:
- Open job details
- Go to AI Agent tab
- Toggle Enable Fairness Mode
- Override global settings if needed
- Save
Best Practices for Fair Hiring
Structured Evaluation
Use consistent criteria:
- Same questions for all candidates
- Standardized scoring rubrics
- Multiple interviewers
- Documented decisions
Diverse Hiring Panels
Include diverse perspectives:
- Mixed gender panels
- Cross-functional interviewers
- Bias-awareness training for panelists
- Regular calibration sessions
Regular Audits
Review for patterns:
- Monthly pass/fail rates by source
- Quarterly demographic analysis
- Annual bias audit
- External review (annually)
Training
Educate your team:
- Unconscious bias training
- Structured interview techniques
- Legal compliance updates
- Tool usage (Fairness Mode, etc.)
Legal Compliance
Equal Employment Opportunity
Ensure compliance with:
- EEOC (US) — Equal Employment Opportunity Commission
- GDPR (EU) — Data protection and automated decision-making
- Local laws — Jurisdiction-specific requirements
Documentation
Maintain records of:
- Why candidates were rejected
- How AI was used in screening
- What bias mitigation was applied
- Training completion records
Candidate Rights
Inform candidates of:
- Use of AI in screening
- Right to human review
- Data retention policies
- How to request deletion
Measuring Fairness
Metrics to Track
| Metric | Target | Measurement |
|---|---|---|
| Screening parity | Similar pass rates | By source, not demographics |
| Interview conversion | Consistent ratios | Application → interview |
| Offer acceptance | No significant gaps | By candidate segment |
| Time in stage | Equal treatment | Days per pipeline stage |
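The interview-conversion metric from the table can be computed directly from pipeline counts. The numbers below are illustrative.

```python
# Sketch of the application-to-interview conversion metric.
def conversion_rate(applications: int, interviews: int) -> float:
    """Fraction of applications that reach the interview stage."""
    return interviews / applications if applications else 0.0

rate = conversion_rate(applications=200, interviews=30)
```

Comparing this rate across candidate segments, rather than looking at one number in isolation, is what surfaces inconsistent treatment.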
Disparate Impact Analysis
Watch for warning signs:
- Significant differences in pass rates
- Certain groups consistently scoring lower
- Human overrides clustered by demographic
- Feedback mentioning non-job factors
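One common way to quantify the first warning sign is the "four-fifths rule" used in US adverse-impact analysis: a group whose selection rate falls below 80% of the highest group's rate is a red flag. The group labels and rates below are illustrative.

```python
# Sketch of a four-fifths-rule check on selection rates by group.
def adverse_impact_ratios(selection_rates: dict) -> dict:
    """Ratio of each group's selection rate to the highest group's rate."""
    top = max(selection_rates.values())
    return {group: rate / top for group, rate in selection_rates.items()}

ratios = adverse_impact_ratios({"group_a": 0.50, "group_b": 0.35})
flagged = [group for group, ratio in ratios.items() if ratio < 0.8]
```

A flagged ratio is a signal to investigate, not proof of bias; sample sizes and job-related factors need review before taking corrective action.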
Corrective Action
If bias is detected:
- Pause the affected process
- Review recent decisions
- Retrain AI if needed
- Adjust redacted fields
- Document changes made
Transparency
Candidate Communication
Be open about AI use:
"We use AI tools to help evaluate applications fairly.
All candidates are assessed using the same criteria.
You have the right to request human review of any automated decision."
Internal Communication
Train hiring managers:
- How Fairness Mode works
- When to override AI scores
- How to document decisions
- Escalation procedures
Next Steps
- AI Screening — Fair evaluation with AI
- Settings — Configure fairness options
- Recruiting Settings — Complete ATS configuration