University of Alabama
Institute for Social Science Research (ISSR) · Behavioral Trust Calibration Lab
HumanAI Foundation · GSoC 2026

IRB-Approved Research Study

Humanlike AI Systems & Trust Attribution

This study investigates how interface design cues, such as an AI assistant's name, tone, and confidence framing, influence human trust and decision-making. You will work through a series of brief decision scenarios in which an AI assistant provides a recommendation and you decide whether to accept or override it.

Participant Information & Consent

Mentors: Andrya Allen · Dr. Xinyue Ye · Dr. Kelsey Chappetta · Dr. Andrea Underhill
University of Alabama · ISSR · Proposal ISSR3 · HumanAI GSoC 2026