Competency-rubric questions
Question banks per competency, validated against role-family success profiles. Same questions for every candidate in a role; calibrated panel rubrics across geographies and languages.
Unstructured early-careers interviews land at 47% inter-rater reliability — about the level you’d expect from a slightly weighted coin flip. Add AI-prepped candidates who all sound like senior managers, and the structured interview that worked in 2019 is now a rounding error in the hiring decision. Video Interviews brings the consistency back, with explainable AI scoring per question.
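For readers who want to see what a reliability figure like this actually measures: inter-rater reliability is typically computed with an agreement statistic such as Cohen’s kappa, which discounts the agreement two interviewers would reach by chance. A minimal sketch with invented rubric scores (not real candidate data, and not the platform’s internal method):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores on the same candidates."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of candidates both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal score distribution.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / n**2
    return (observed - expected) / (1 - expected)

# Two interviewers scoring the same 10 candidates on a 1-4 rubric.
a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
b = [3, 3, 4, 2, 2, 2, 3, 3, 2, 4]
print(round(cohens_kappa(a, b), 2))  # → 0.26
```

Raw percent agreement here is 50%, but once chance agreement is stripped out, kappa drops to 0.26 — which is why a headline figure in the 40s is closer to noise than it sounds.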
Two real things have happened to the structured early-careers interview at once. AI prep has flattened the case-question signal — everybody arrives sounding articulate and well-rehearsed. And the interviewer side has drifted: rubrics get applied unevenly, junior interviewers anchor on first impressions, panel feedback meetings turn into rapport-aggregation exercises rather than evidence-comparison ones.
The result is an interview process that feels rigorous and is, on the data, mostly noise. The fix is not more interviewers; it’s structured questioning, explainable AI scoring per question, and per-interviewer drift monitoring.
When inter-rater reliability is in the 40s, the interview process is functionally drawing names from a slightly weighted hat. The strongest predictor of a hire decision becomes whether the candidate happened to interview with someone who liked them — not anything that correlates with first-year performance. The cost lands at month nine, when the line manager is having a candid conversation with HR about the candidate the interview process pushed through.
The interview is the most-defended and least-validated part of most early-careers funnels.
The Video Interview Platform replaces the unstructured-by-default interview with a structured, scored, calibrated one. Same questions for every candidate per role. Explainable AI scoring per question, per competency rubric. Inter-rater drift flagged in real time. Adverse-impact monitoring per interviewer. The senior practitioners on the panel still make the hire decision — the platform gives them better evidence to make it on.
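The platform’s internals aren’t spelled out here, but the simplest form of per-interviewer drift monitoring is a leniency/severity check: compare each interviewer’s mean rubric score against the panel and flag outliers. A hedged sketch with invented interviewer names and scores — an illustration of the idea, not the product’s actual algorithm:

```python
from statistics import mean, stdev

def flag_drift(scores_by_interviewer, z_threshold=2.0):
    """Flag interviewers whose mean rubric score drifts from the panel.

    scores_by_interviewer: dict of interviewer -> list of question scores.
    Returns interviewers whose mean sits more than z_threshold standard
    deviations from the mean of all interviewer means.
    """
    means = {i: mean(s) for i, s in scores_by_interviewer.items()}
    grand = mean(means.values())
    spread = stdev(means.values())
    return [i for i, m in means.items()
            if spread and abs(m - grand) / spread > z_threshold]

# Hypothetical panel; int_c is a consistently severe scorer.
panel = {
    "int_a": [3, 3, 4, 3, 3],
    "int_b": [3, 2, 3, 3, 4],
    "int_c": [1, 1, 2, 1, 1],
    "int_d": [3, 4, 3, 3, 3],
    "int_e": [4, 3, 3, 4, 3],
    "int_f": [3, 3, 2, 3, 3],
}
print(flag_drift(panel, z_threshold=1.5))  # → ['int_c']
```

A production system would do this per question and per competency, and over rolling time windows rather than a single panel, but the shape of the check is the same.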
PwC’s graduate audit and tax pipelines moved to the Video Interview Platform across the 2024 and 2025 intake cycles. Predictive validity against 12-month line-manager performance climbed from r = 0.18 (legacy SHL battery + unstructured interview) to r = 0.41 (structured rubric + AI-scored video interviews + Assessment Platform). Four-fifths rule satisfied across all monitored characteristics. Independent BPS-panel methodology review on file.
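The four-fifths rule itself is plain arithmetic: each monitored group’s selection rate must be at least 80% of the highest group’s rate. An illustrative check with hypothetical group names and pass-through rates (not PwC’s figures):

```python
def four_fifths_check(selection_rates):
    """Adverse-impact check: each group's selection rate must be at least
    80% of the highest group's rate (the four-fifths rule)."""
    top = max(selection_rates.values())
    return {group: rate / top >= 0.8 for group, rate in selection_rates.items()}

# Hypothetical selection rates per monitored group.
rates = {"group_a": 0.42, "group_b": 0.40, "group_c": 0.30}
print(four_fifths_check(rates))
# group_c: 0.30 / 0.42 ≈ 0.71 < 0.8 → fails the four-fifths rule
```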
Video Interviews is the second filter, after the Assessment Platform shortlist. The two work as a pair: behavioural evidence from IWX feeds the Assessment Platform, the Assessment Platform produces a ranked shortlist, the Video Interview Platform applies structured human-decisioned interviews to that shortlist with explainable AI scoring throughout.
Most clients deploy Video Interviews as part of an Assessment Platform engagement; some deploy it stand-alone as a structured-interview replacement for legacy panel processes.
A typical engagement starts with a 45-minute calibration session: we run a small panel through one of our existing competency rubrics on anonymised candidate footage, score independently, and compare. Most senior practitioners are uncomfortably surprised by the inter-rater spread. That’s the start of the conversation.