Its creator, Adam Hildreth, 22, calls it the Anti-Grooming Engine, The Guardian reports. "He claims the product is 99.9% effective in identifying adults online with a sexual motivation," and it's not keyword filtering. "The software is designed to look out for conversation patterns, typing speed, use of grammar and punctuation, and any aggressive or bullying language. Using extracts of online conversations between young people as examples of 'good' data, it is fed into the computer and compared with conversation gathered from that of suspected groomers." And the computer, he says, "learns" to tell the difference. CyberSentinel in the US has made similar claims in the past, indicating that others have thought of this approach (see this in 2001). The proof is in the pudding, though, as The Guardian cites one child-safety advocate saying, and the pudding's not done yet – check out the article for the full picture. Here's info on this site about "How to recognize grooming."
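For readers curious what "training on labeled conversation samples" can look like in general, here is a minimal sketch of one classic technique for that kind of task – a naive Bayes text classifier with Laplace smoothing. This is purely illustrative: the Anti-Grooming Engine's actual method is not public, and the sample data, labels, and function names below are invented for this example.

```python
import math
from collections import Counter

def train(samples):
    """samples: list of (text, label) pairs. Returns per-label word counts
    and per-label document totals."""
    counts = {}
    totals = Counter()
    for text, label in samples:
        bag = counts.setdefault(label, Counter())
        for word in text.lower().split():
            bag[word] += 1
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the highest naive Bayes score
    (log prior + smoothed log likelihoods)."""
    vocab = {w for bag in counts.values() for w in bag}
    n = sum(totals.values())
    best_label, best_score = None, float("-inf")
    for label, bag in counts.items():
        score = math.log(totals[label] / n)  # class prior
        denom = sum(bag.values()) + len(vocab)  # Laplace smoothing
        for word in text.lower().split():
            score += math.log((bag[word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy labeled data standing in for "good" vs. suspect conversations.
samples = [
    ("lol did you see the game last night", "good"),
    ("omg that homework was so hard haha", "good"),
    ("do not tell your parents about our chat", "suspect"),
    ("keep this a secret between us ok", "suspect"),
]
counts, totals = train(samples)
print(classify("this is our secret do not tell anyone", counts, totals))
# → suspect
```

A real system of the kind the article describes would go well beyond word counts – the quote mentions typing speed, grammar, and punctuation as features – but the overall supervised-learning shape (labeled examples in, learned decision rule out) is the same.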
NetFamilyNews – by Anne Collier
- New Facebook policy targets guns, other regulated items
- Google’s new learning tool that learns
- The flap over Talking Angela the chatbot app
- About the worldwide ‘selfie’ phenomenon
- How technology will improve the well-being of young adults
- Calling our children narcissists on ‘a sociopathic scale’: Really!?
- Nothing complicated about this: Read ‘It’s Complicated’!
- Teens’ own (wise) perspectives on life with social media
Analysis & News – by Larry Magid
- Adults spend 11 hours a day using electronic media
- Smartphones that promise user privacy
- Author danah boyd on why teens and social media are ‘complicated’
- Security experts at RSA decry government hacking
- In defense of Internet safety education
- ‘Neknominate’ is a stupid and potentially deadly online dare game
- Confessions of a binge viewer
- People who suffer from so-called ‘game addiction’ have other problems