Tuesday, April 26, 2011

FIGHTING CYBER BULLIES

Electronic Arts, Sony Use Crisp Thinking to Fight Cyber Bullies

Adam Hildreth, co-founder and chief executive of Crisp Thinking. Source: Crisp Thinking via Bloomberg
Adam Hildreth was a 14-year-old in Leeds, England, when he and six friends started Dubit, a virtual world for other teens, in 1999. The site, which makes money selling advertising aimed at kids socializing in cyberspace, grew quickly to 250,000 users. Dubit struggled to hire enough moderators to keep cyber bullies and other troublemakers in check, at one point needing 50 monitors a day. Human moderation “was never going to scale,” says Hildreth, now 26.
Hildreth, who left school at 16 to run Dubit full time, began working on software to detect bad behavior in the game world. The idea became his next company, Crisp Thinking, which he co-founded in 2005. Crisp’s software analyzes users’ language and actions to identify harassment, spamming, or predators who may be “grooming” potential victims. The system can react in real time by automatically warning or banning people who violate a site’s terms of service, or referring them to human moderators.
Online games, forums, and social networks use Crisp’s software to curb activity that could hurt users or damage their brands. Hildreth says Crisp has 75 clients, including Sony Online Entertainment, Electronic Arts, and the Cartoon Network. Clients pay between a few thousand dollars and tens of thousands a month. He expects revenue at the 30-employee company to double to $5 million in 2011 from $2.5 million last year, as cyber bullying cases make news and more companies realize the risk that negative interactions online pose to their brands.

OVERSIGHT VARIES

The degree of oversight in online communities varies widely, says Michael Kaiser, executive director of the National Cyber Security Alliance, a nonprofit funded by industry and government to promote online safety. Some sites do nothing at all, some rely on users to flag questionable behavior, and others have humans review every post before publishing, he says. For businesses hosting online communities, “the quality of the discussion on some of these sites is going to be a part of your brand,” says Kaiser. “There should be some expectation of policing or the brand or the site will lose credibility, or it becomes a place where just nasty people hang out.”
Crisp originally focused on stopping cyber bullies and online predators in virtual worlds and so-called massively multiplayer online games that can host thousands of users. The software can be used to identify a range of behaviors, from users disparaging a brand to racist comments to “gold farming,” the practice of accumulating currency in a virtual world to sell to other players. Beyond games, Crisp can track posts on online forums and clients’ Facebook pages and other social media sites. Companies use Crisp’s technology to identify customers who have complaints that should be referred to customer service, or even to find potential sales leads.
The software extracts meaning from users’ words, actions, and relationships online. Hildreth says Crisp processed 500 million pieces of user-generated content -- comments, chat messages, and blog posts -- in February and that number is growing by 25 percent a month. Most of the analysis is based on language (Crisp interprets messages in nine languages), though 20 percent is based on other user behaviors, like friend requests or actions in a virtual game. While many companies make software for families to monitor or restrict children’s interactions online, relatively few sell to Crisp’s market: the companies trying to keep online communities safe in the first place.
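Crisp's models are proprietary, but the mix described above (most analysis driven by language, about 20 percent by other user behavior such as friend requests) can be illustrated with a toy scorer. Everything here, including the phrase list, the friend-request signal, and the 80/20 weighting, is a hypothetical sketch, not Crisp's actual method:

```python
# Illustrative sketch only -- Crisp's real detection models are proprietary.
# A toy risk scorer that blends a language signal (flagged phrases) with a
# behavioral signal (unusually rapid friend requests), weighted roughly
# 80/20 as the article describes. All names and thresholds are invented.

FLAGGED_PHRASES = {"you are worthless", "meet me alone"}  # hypothetical examples


def language_score(message: str) -> float:
    """1.0 if the message contains a flagged phrase, else 0.0."""
    text = message.lower()
    return 1.0 if any(phrase in text for phrase in FLAGGED_PHRASES) else 0.0


def behavior_score(friend_requests_last_hour: int) -> float:
    """Crude behavioral signal: many friend requests in a short window."""
    return min(friend_requests_last_hour / 20.0, 1.0)


def risk_score(message: str, friend_requests_last_hour: int) -> float:
    """Blend the two signals: ~80% language, ~20% behavior."""
    return (0.8 * language_score(message)
            + 0.2 * behavior_score(friend_requests_last_hour))
```

A production system would use trained classifiers over conversation history rather than keyword lists, but the structure (separate language and behavior signals, combined into one risk score) matches the division the article describes.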

EXTRA MODERATION

Sony Online Entertainment began using Crisp for chat moderation in its Free Realms game in 2009 and in Star Wars: Clone Wars Adventures the next year, according to Brad Wilcox, the game company’s head of global customer service. The software provides an extra layer of moderation on top of Sony’s own chat filters and other safeguards to “identify various types of behavior (harassment, bullying, griefing [deliberately disrupting other players], grooming [attempts by predators to gain the trust of minors online], racism and anything sexually explicit) that is against the game’s code of conduct and has no place in our games,” he says in an e-mail. Sony investigates such incidents and then warns or banishes players found to be in violation.
Because Crisp only makes the software and does not operate online communities, the company doesn’t determine how brands handle misbehavior online. The technology can automate responses such as warning users, silencing them, or kicking them off, depending on the severity of the offenses and the user’s history on the site.
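The tiered responses described above (warn, silence, or remove a user, scaled by offense severity and the user's record) can be sketched as a simple policy function. The function name, severity scale, and thresholds below are assumptions for illustration, not Crisp's actual rules:

```python
# Hypothetical sketch of a tiered moderation policy: escalate from a
# warning to muting to a ban based on offense severity and the user's
# history on the site. Severity scale (1-3) and cutoffs are invented.

def choose_action(severity: int, prior_offenses: int) -> str:
    """Map an offense plus the user's record to an automated response."""
    if severity >= 3 or prior_offenses >= 3:
        return "ban"    # worst offenses, or repeat offenders
    if severity == 2 or prior_offenses >= 1:
        return "mute"   # silence the user temporarily
    return "warn"       # first-time, low-severity violation
```

In practice each site sets its own thresholds, since, as the article notes, Crisp supplies the software while the brands decide how misbehavior is handled.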
As more businesses build online communities around their brands or use Facebook and Twitter for marketing, they face the risk that users will be harassed or offended on their forums, or will use those sites to badmouth the brands themselves. Hildreth suggests the problem Crisp was built to address will snowball. Says Hildreth, “Customer service in a public environment is a risk.”
To contact the reporter on this story: John Tozzi at jtozzi2@bloomberg.net
