Val Odiembo spends a few afternoons each month at her former high school, walking teenagers through the complexities of consent and healthy relationships. The 19-year-old Rhode Island College student has noticed something troubling in her peer educator work: many of the high schoolers she talks with turn to artificial intelligence when wrestling with relationship questions, especially the difficult ones about whether they've crossed a line with someone.
A UK study found one in ten young adults has consulted AI for sexual health guidance. A 2025 Pew Research Center report revealed that one in five teens has had romantic interactions with chatbots. The problem, Odiembo says, is that AI tends to tell users exactly what they want to hear, even when the advice glosses over real harm. Lost in that exchange is something irreplaceable: human connection and accountability.
Those concerns helped shape Vibe Check, a new online tool launched in mid-March by SafeBae, a youth-led nonprofit founded by survivors of sexual violence. Unlike a chatbot, Vibe Check walks young people through a series of guided questions to help them understand whether they violated their partner's consent, consider how to apologize, and commit to changed behavior. By late April, the tool had attracted over 3,500 unique visitors.
"It is intentionally not AI," said Shael Norris, SafeBae's co-founder and executive director. "It was built by our team based on over a decade of direct work with young people."
Drew Davis, SafeBae's director of strategic initiatives, noticed years ago that teenagers and young adults were flooding Reddit forums with deeply personal questions about whether they'd caused sexual harm. What shocked him was the quality of responses: either brutal condemnation suggesting the person was irredeemable, or wholesale absolution. There seemed to be no middle ground for genuine reflection and repair.
"It keeps me up at night how bad these responses were," Davis said. "Either you're an awful person, there is no such thing as accidentally causing harm, to the extent that you should kill yourself, or you did absolutely nothing wrong."
Davis found himself wrestling with a contradiction in how society approaches accountability. In the wake of high-profile cases like the Epstein revelations, younger men face intense pressure over accountability for their actions. At the same time, online communities have hardened into opposing camps: one demanding absolute punishment, the other dismissing harm entirely. Against this backdrop, adolescents report record mental health struggles, from girls who have survived assault to boys who are increasingly isolated from their peers.
Davis wanted to create something different: a tool that could provide what he calls an "off-ramp to rejoin society," allowing people who may have caused harm to move toward accountability and repair without burdening survivors to lead those efforts themselves.
Vibe Check users navigate scenarios by clicking through questions like "The person I was with is mad and I'm worried I did something wrong" or "They seemed upset or distant after." Depending on what unfolds, the tool offers mini-lessons on nervous system responses, consent laws around alcohol, and grounding exercises. One section explains: "If someone became distant, upset, or quiet afterward, that can be a sign they didn't feel okay about what happened. You don't need them to 'prove' anything for their feelings to matter. Accountability here can look like listening, apologizing without pressure and respecting whatever boundaries they set."
The distinction between Vibe Check and AI tools matters. Researchers are still evaluating how well chatbots handle sex education, particularly on sensitive topics. While AI shows promise for making sexual health information more private and accessible, it struggles with complex subjects like consent compared to more straightforward topics like contraception, according to recent research by Scarlett Bergam, a graduating student at George Washington University's School of Medicine and Health Sciences.
Apollo Knapp, a 17-year-old on SafeBae's youth board and peer educator for middle schoolers, tested Vibe Check with classmates and was struck by its depth. He hopes to introduce younger students to the tool before they encounter their first AI chatbot on the subject. "If humans are messing up consent this much, I don't even want to see what a robot's going to do with it," he said.