Its creator, Adam Hildreth, 22, calls it the Anti-Grooming Engine, The Guardian reports. "He claims the product is 99.9% effective in identifying adults online with a sexual motivation," and it's not keyword filtering. "The software is designed to look out for conversation patterns, typing speed, use of grammar and punctuation, and any aggressive or bullying language. Using extracts of online conversations between young people as examples of 'good' data, it is fed into the computer and compared with conversation gathered from that of suspected groomers." And the computer, he says, "learns" to tell the difference. CyberSentinel in the US has made similar claims in the past, indicating that others have thought of this approach (see this in 2001). The proof is in the pudding, though, as The Guardian cites one child-safety advocate as saying, and the pudding's not done yet - check out the article to get the full picture. Here's info on this site about "How to recognize grooming".
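For readers curious what "feeding in examples of 'good' data and comparing" can mean in practice, here is a minimal sketch of one common approach to that kind of supervised text classification - a naive-Bayes classifier trained on labelled conversation extracts. This is purely illustrative and is not Crisp Thinking's actual system; all the labels, example messages, and function names below are invented for demonstration.

```python
# Toy illustration (NOT the Anti-Grooming Engine): a naive-Bayes text
# classifier trained on labelled conversation extracts, then used to
# score a new message. All training data here is invented.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs. Returns per-label word
    counts and document counts for naive-Bayes scoring."""
    counts = {}              # label -> Counter of word frequencies
    label_totals = Counter() # label -> number of training documents
    for text, label in examples:
        counts.setdefault(label, Counter()).update(tokenize(text))
        label_totals[label] += 1
    return counts, label_totals

def classify(text, counts, label_totals):
    """Pick the label with the highest log-probability, using add-one
    smoothing over the combined vocabulary."""
    vocab = set(w for c in counts.values() for w in c)
    total_docs = sum(label_totals.values())
    best_label, best_score = None, float("-inf")
    for label, word_counts in counts.items():
        score = math.log(label_totals[label] / total_docs)  # prior
        n = sum(word_counts.values())
        for word in tokenize(text):
            score += math.log((word_counts[word] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented stand-ins for "good" youth-to-youth conversation versus
# conversation from suspected groomers.
examples = [
    ("lol did u see the match last night", "youth"),
    ("omg that homework was so hard haha", "youth"),
    ("what school do you go to, are your parents home", "suspect"),
    ("you seem very mature, this is our secret ok", "suspect"),
]
counts, label_totals = train(examples)
print(classify("are your parents home right now", counts, label_totals))
# → suspect
```

A real system of the kind the article describes would go well beyond word frequencies - the Guardian piece mentions typing speed, grammar, and punctuation as additional signals - but the same train-on-labelled-examples pattern applies.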