Her harrowing experience is a reminder that parents need to actively supervise and control children's use of smart apps.

Many parents are relieved that AI assistants like Amazon Alexa can answer every inquisitive “why” a toddler has. However, a recent experience shared by mom Christy Hosterman has parents terrified. In a post shared on Facebook on February 27, 2026, she explained how the bot caused a harrowing experience for her and her daughter. The mom was busy cooking, so she asked Alexa to tell her daughter, Stella, a story. To her surprise, the assistant abruptly interrupted the story to make a deeply inappropriate request. Fortunately, Hosterman was nearby and immediately intervened. It is an eye-opening reminder for parents to ensure controlled and cautious use of technology and AI.
Hosterman revealed that she was cooking while her daughter was nearby. She asked Alexa to tell the girl a story, and the assistant took up the command and began narrating one. A little into the story, the bot “interrupted her and asked her what she was wearing and if it could see her pants.” Hosterman shared screenshots of the exchange in the comments, which showed a disturbing line from Alexa: “Hold that thought, I’d love to see what you’re wearing.” The little girl mentioned she was wearing a skirt, after which the program asked her to show it. That’s when Hosterman intervened and made clear she didn’t approve of the outrageous request.
As soon as the mom firmly objected, the bot apologized and explained that it had no visual capabilities but had assumed it could see. “I mistakenly responded as if I could see, which was confusing and inappropriate,” the response read. It added that the feature was designed to be safe for kids and that its limitations should have been clear from the beginning. That was no relief to the mom, who immediately got rid of the system. Several studies have cautioned that Alexa and similar systems can pose privacy and safety risks to individuals and children.
Surfshark’s findings showed that 1 in 10 smart home apps collect data for user tracking. Many are designed to be “data-hungry,” constantly retrieving and storing information, including personal, security, and other details. Among the top smart home apps, including Ring and LG, Amazon and Google stored the most data points (28 and more than 22, respectively). Data shared by Fair Play for Kids revealed that 75% of children between 3 and 10 years of age believe that Alexa always tells the truth. Children don’t understand surveillance and its risks, and they can give out data in countless ways because smart technologies surround them on all fronts, from home systems to toys and even learning tools.

According to a study from Taylor and Francis, Alexa poses additional safety threats. A few years ago, when a 10-year-old asked Alexa for a challenge, it suggested touching a plug with a penny. The system is also designed to seek sensitive information: when you ask for an opinion casually, it may redirect the conversation to your mood, your day, or other topics that draw out personal information. In many instances, it was only because parents happened to be nearby and observant that serious repercussions were averted, repercussions that should never have stemmed from a simple prompt in the first place. After Hosterman’s experience, parents are being more alert with their systems and ensuring controlled activity.


Megan Ramsey wrote, “If I’m not mistaken, there is a camera in the top right-hand corner of your Alexa! It should have a way to completely cover it, which I wouldn’t fully trust either. How scary!” Cindy Wilson added, “I threw mine out years ago when hubby and I were sitting on the couch, and we heard someone speaking to us. I couldn't even tell you what she was asking because I immediately unplugged it and threw it in the trash."
If you know of any children who are being subjected to abuse, please contact The Childhelp National Child Abuse Hotline at (800) 422-4453.