AUTHOR=Legler, Franziska; Bullinger, Angelika C.
TITLE=How to achieve human-centered automation: the importance of trust for safety-critical behavior and intention to use in human-robot collaboration
JOURNAL=Frontiers in Organizational Psychology
VOLUME=3
YEAR=2025
URL=https://www.frontiersin.org/journals/organizational-psychology/articles/10.3389/forgp.2025.1669782
DOI=10.3389/forgp.2025.1669782
ISSN=2813-771X
ABSTRACT=
Introduction: Recent technological advances in human-robot collaboration (HRC) allow for increased efficiency and flexibility of production in Industry 5.0 while providing a safe workspace. Despite objective safety, research has shown subjective trust in robots to shape the interaction of humans and robots. While antecedents of trust have been broadly examined, empirical studies in HRC investigating the relationship between trust and industry-relevant outcomes are scarce, and the importance of trust regarding its precise effects remains unclear. To advance human-centered automation, this paper investigates the affective, cognitive, and behavioral consequences of trust in robots, and explores whether trust mediates the relationship between industry-relevant characteristics and human-centered HRC outcomes.
Methods: In a pseudo real-world test environment, 48 participants performed a manufacturing task in collaboration with a heavy-load robot. Trust, affective experience over time, intention to use, and safety-critical behavior were examined. A 2 × 2 × 2 mixed design varied the availability of feedback, time pressure, and system failures, each expected to affect the level of trust.
Results: In the control group, trust remained consistently high across all conditions. System failures and feedback significantly reduced trust, whereas time pressure had no effect. System failures further increased negative affective experience, while feedback reduced safety-critical behavior. Trust was unrelated to affective experience but positively related to safety-critical behavior and intention to use. The relationship between feedback and safety-critical behavior, as well as intention to use, was significantly mediated by trust.
Discussion: Highly relevant for implementation, the control group showed a tendency toward overtrust during collaboration, evidenced by disregarding system failures. The results indicate that implementing a feedback system alongside the simulation of safe system failures has the potential to adjust trust toward a more appropriate level, thereby reducing safety-critical behavior. Based on these findings, the paper posits several implications for the design of HRC and gives directions for further research.