by Larry Magid
(this article is a work in progress and subject to editing and revision)
My inbox is full of pitches for products and services that claim to prevent bullying. Most use some type of word recognition to either block offensive messages or report them to parents. But even if you add the word “cyber” to bullying, it’s still bullying. Bullying isn’t caused by technology and it’s not cured by it either. It’s about relationships.
While I have no doubt that monitoring software running on phones, computers or the web can home in on offensive words, I don’t think it’s the solution to offensive behavior. At best it can block a message from getting through or inform parents that a kid is either being bullied or bullying others. While that might have some value, it does nothing to change the underlying attitudes that foster and motivate the behavior, and it does little to equip the person being bullied with the skills needed to survive unscathed.
And even if technology can identify offensive words, it’s not all that good at understanding the subtle meaning behind them. Words must be considered in context, and context can only be ascertained if you understand the relationship between the people involved. Even racial, ethnic, sexist and homophobic epithets have their context. The same word that’s used as part of a vicious attack can sometimes be a term of endearment. Among friends, it can be used affectionately or humorously, or in a teasing manner that may be annoying but isn’t necessarily harmful. In other contexts it can be mean or hateful. The same can be true of other “hurtful words,” which sometimes hurt a lot, sometimes hurt a little and sometimes don’t actually hurt at all. Humans are a lot better than machines at distinguishing between an inside joke and a vitriolic attack.
Even if we can agree that something is negative, it’s close to impossible for a machine to know how someone will respond. Reactions can range from laughing it off or mild annoyance to anger and depression. And how someone reacts often has more to do with their own resilience or other things going on in their life than with the words themselves. The same comment that one person might laugh at or get over quickly could be extremely hurtful to someone else.
Not really a solution
The biggest problem with a technical “solution” is that it’s not really a solution. If a person is thinking mean thoughts, then simply muzzling them does nothing to modify those thoughts. Combating bullying, hatred, bigotry and cruelty is more than just suppressing mean speech; it’s also helping people think differently about others.
And, finally, blocking or reporting negative speech doesn’t build the resilience that young people need to thrive even in the face of negative behavior. While we must seek to end bullying and harassment and limit meanness, we must also learn to cope with it. That’s not to say it’s ever acceptable, but we can’t let bullies ruin our lives and we must teach our children to stand up to bullies and realize that it’s not their fault if someone else is mean.
Might help some children
I’m not saying there is never a role for monitoring software. There are some children who need a bit of help, either because they can’t control their impulses to say inappropriate things or because they are particularly vulnerable to bullying. Clearly, parents must make their own decisions as to whether monitoring software is appropriate for their children. But such products are not for everyone. Most kids don’t bully, and most kids who do receive an occasional mean or annoying message are able to handle it without horrendous consequences. As always, your first response should be to talk with your kids and find out what they need before trying to find a technical solution to a problem that may or may not exist. Technology can sometimes support parenting, but it can never replace it.
It takes a village, not an algorithm
Combating bullying — which typically takes place at school — isn’t a one-off task. It involves encouraging a culture of respect and a set of commonly understood norms that celebrate diversity and discourage mean and hateful behavior. That takes students, teachers, administrators, parents, police officers, janitors and everyone else in the community. It’s a job for people, not software.