- Professor Jan-Emmanuel De Neve, University of Oxford
- Dr William Fleming, University of Oxford
- Dr George Ward, University of Oxford
Project overview
This project will evaluate the reliability of Indeed’s crowdsourced workplace wellbeing data.
Job websites offer a unique opportunity for social science research, as they facilitate data collection on people’s experiences within the labour market.
Indeed is the world’s largest job search platform. It invites employees to anonymously answer a survey on their workplace wellbeing along four dimensions: job satisfaction, job purpose, happiness at work, and work stress.
These dimensions are supplemented by questions on the drivers of wellbeing: fair pay, social support, appreciation, trust, belonging, management, inclusivity, flexibility, achievement, energy and training.
Despite the popularity of online crowdsourced methods, their reliability and representativeness are rarely established, even though research shows substantial variation across platforms.
Research questions
Through analysis of Indeed data, the research team aims to establish the validity and reliability of major crowdsourced datasets by answering the following research questions:
- What makes for good quality crowdsourced data?
- Are Indeed survey responses internally reliable and accurate?
- Are measures and results externally valid when comparing with representative surveys?
- Do inaccurate or dishonest responses bias overall validity?
- Are there differences in errors and biases across organisations, industries, and sectors?
- At what sample size do organisations achieve stable distributions and aggregate results?
- Are key relationships between observed variables coherent and consistent?
A literature review will identify guiding principles for distinguishing good- from poor-quality crowdsourced data, along with best practice for evaluating it. In addition, reliability will be evaluated through multiple methods, including establishing a minimum response time, evaluating the consistency of inaccurate responses, and identifying clustered variations in resultant biases. Metadata will be compared with the UK Labour Force Survey, the UK Household Longitudinal Study, and the UK subsample of the 2021 wave of the European Working Conditions Survey.
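Two of the checks above can be sketched in outline: screening out implausibly fast responses and estimating internal consistency (here via Cronbach's alpha, a standard reliability coefficient). The 30-second threshold, field names, and sample data are illustrative assumptions, not the project's actual procedure or Indeed's data schema.

```python
from statistics import variance

def filter_by_response_time(responses, min_seconds=30):
    """Drop responses completed faster than a plausible minimum.

    `min_seconds` is a hypothetical cutoff for illustration; the real
    threshold would be derived empirically from the response-time
    distribution.
    """
    return [r for r in responses if r["seconds"] >= min_seconds]

def cronbach_alpha(item_scores):
    """Estimate internal consistency of a multi-item scale.

    `item_scores`: one row per respondent, one column per survey item.
    Returns alpha = k/(k-1) * (1 - sum of item variances / total variance).
    """
    k = len(item_scores[0])
    items = list(zip(*item_scores))  # transpose: one tuple per item
    item_var_sum = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in item_scores])
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Illustrative use: four respondents answering three wellbeing items.
scores = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
alpha = cronbach_alpha(scores)  # perfectly correlated items -> alpha of 1
```

Responses that pass the time filter would then feed into the alpha calculation per organisation, which is also where the stable-sample-size question above could be probed by recomputing alpha on subsamples of increasing size.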
Findings
The findings will be shared in a publicly available report and an academic article.
The project team will present their findings to Indeed and advise on how they can be adopted in future industry practice and research to mitigate any systematic biases.