Pethig F, Hoehle H, Hui KL, Lanz A (2024). “Behavior toward newcomers and contributions to online communities.” MIS Quarterly, forthcoming.
In this paper, we study whether and how behavior toward newcomers affects their socialization outcomes, in terms of retention and quality of contributions, in online communities. Exploiting a natural experiment on a large deal-sharing platform, we find that an intervention that proactively reminds other community members to be more considerate of newcomers causes newcomer deals to receive 54% more comments with more positive sentiment. The newcomers are 10% more likely to post another deal, suggesting an increase in retention. However, we do not observe any effect of the intervention on the quality of subsequent contributions. Our evidence suggests that the intervention merely caused a temporary shock to newcomers’ first contributions but did not improve their learning or motivate greater effort. We draw implications for the design of socialization processes to help communities improve the retention and performance of newcomers.
Pethig F, Kroenung J (2023). “Biased humans, (un)biased algorithms?” Journal of Business Ethics, 183, 637-652.
Previous research has shown that algorithmic decisions can reflect gender bias. The increasingly widespread utilization of algorithms in critical decision-making domains (e.g., healthcare or hiring) can thus lead to broad and structural disadvantages for women. However, women often experience bias and discrimination through human decisions and may turn to algorithms in the hope of receiving neutral and objective evaluations. Across three studies (n=1,107), we examine whether women’s receptivity to algorithms is affected by situations in which they believe that their gender identity might disadvantage them in an evaluation process. In Study 1, we establish, in an incentive-compatible online setting, that unemployed women are more likely to choose to have their employment chances evaluated by an algorithm if the alternative is an evaluation by a man rather than a woman. Study 2 generalizes this effect by placing it in a hypothetical hiring context, and Study 3 proposes that relative algorithmic objectivity, i.e., the perceived objectivity of an algorithmic evaluator relative to a human evaluator, is a driver of women’s preferences for evaluations by algorithms rather than by men. Our work sheds light on how women make sense of algorithms in stereotype-relevant domains and underscores the need to provide education for those at risk of being adversely affected by algorithmic decisions. Our results have implications for the ethical management of algorithms in evaluation settings. We advocate for improving algorithmic literacy so that evaluators and evaluatees (e.g., hiring managers and job applicants) can acquire the abilities required to reflect critically on algorithmic decisions.
Pethig F, Kroenung J, Noeltner M (2021). “A stigma power perspective on digital government service avoidance.” Government Information Quarterly, 38(2), 101545.
The digital-by-default policy for government services implemented in many European countries can pose challenges to marginalized citizens, such as people with disabilities. Prior research on electronic inclusion and the digital divide has mainly considered technology-related concerns, such as Internet anxiety, as the barriers preventing people with disabilities from using digital government services. Yet, these concerns may insufficiently account for the fact that people with disabilities may suspect that governments provide new services only to reduce costs and to forgo more meaningful social change. Therefore, we draw on stigma power theory to understand how perceptions of stereotyping and discrimination contribute to the avoidance of digital government services among people with disabilities. Our results indicate that overcoming the underutilization of digital government services among people with disabilities requires a holistic approach that addresses technology-related as well as stigma-related concerns.
Pethig F, Kroenung J (2019). “Specialized information systems for the digitally disadvantaged.” Journal of the Association for Information Systems, 20(10), 247-265.
A number of specialized information systems for the digitally disadvantaged (SISD) have been developed to offset the limitations of people who are less able to participate in the information society. However, insights from social identity theory and social markedness theory indicate that SISD can activate a stigmatized identity and thus be perceived unfavorably by their target audience. We identify two mechanisms by which functional limitations affect a digitally disadvantaged person’s adoption decision: (1) through technology perceptions (i.e., perceived usefulness, perceived ease of use, and perceived access barriers), and (2) through marked-status awareness (i.e., stigma consciousness). We test our contextualized research model on digitally disadvantaged users with physical and/or sensory disabilities. The results of our mediation analysis show that the individuals who have the most to gain from SISD use (i.e., those with greater perceived functional limitations) are doubly disadvantaged: as a group, they find it more challenging to use SISD and are also more sensitive to the fear of being marked as disadvantaged or vulnerable.