Library Filtering Remains Controversial
The American Library Association recommends against the use of automated filters, arguing they make too many mistakes and are inconsistent with a library’s commitment to intellectual freedom. But while many library directors fight to keep filters out, or minimize their use, others have had to fight to defend their Internet filtering policies.
In May, the Washington State Supreme Court upheld the right of the North Central Regional Library District to deny access to Internet content, even over the objection of a patron asking that the filter be removed. A group of library users backed by the American Civil Liberties Union argued that they had been denied access to constitutionally protected content, including content having nothing to do with sex, such as the Second Amendment Foundation's Web page, "Women and Guns," or the website of an art museum.
Further appeals are planned, so that may not be the last word. When the U.S. Supreme Court gave its blessing to the Children’s Internet Protection Act (CIPA), which restricts federal funding for Internet access to libraries that employ filtering, the court specifically said adult users should be able to ask that filtering be disabled when it blocks legitimate content.
In Sonoma County, Calif., a grand jury has been pressing the Santa Rosa Central Library to install Internet filters over the objections of staff and the library board. The Greensboro, N.C., City Council recently voted to require Internet filtering over the objections of the library director – who still plans to argue that use of the filters should be optional for adult patrons. The Salt Lake City Public Library had rejected the use of filters in the past, but is taking another look at the technology after the arrest of a man discovered browsing photos of nude young girls in a crowded area of the library.
From a technological standpoint, one question is whether Internet filtering and categorization software has improved, making some old objections obsolete. For example, one newspaper report on the Santa Rosa controversy quoted a library director as arguing that “someone who has breast cancer who wants to research treatment can't get it because the filter prevents it.” That’s a classic argument, but it only applies to the most rudimentary keyword-based filters.
“If I were a library director, I think I would use Internet filtering to some extent, at least in the children’s department – knowing full well that I would wind up blocking some stuff that should not be blocked,” says Lori Bowen Ayre, a technology consultant to libraries. “In my opinion, these filters are only 85% effective, at best.” The better products concentrate on categorizing Internet content by multiple criteria, often with human intervention at least at a spot checking level, she says.
The Websense Internet filter looks at many other clues, beyond keywords, such as what other websites a page of content links to, says senior product marketing manager Mike Lee. “If a page uses the word breast five times, but it links to eight or ten health care sites, that’s a tremendously strong indicator that this is not a porn site. Porn sites link to porn sites, and healthcare sites link to healthcare sites.”
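The link-neighborhood heuristic Lee describes can be sketched in a few lines. This is a toy illustration, not Websense's actual algorithm: the domain lists, the tie-breaking rule, and the function itself are all invented for the example.

```python
# Toy sketch of the link-based heuristic Lee describes: a page that
# trips a keyword filter is judged by where its outbound links point.
# These domain lists and the decision rule are invented for illustration.

HEALTH_DOMAINS = {"cancer.org", "mayoclinic.org", "nih.gov"}
ADULT_DOMAINS = {"example-adult-1.com", "example-adult-2.com"}

def classify_page(keyword_hits: int, outbound_domains: list[str]) -> str:
    """Classify a keyword-flagged page by its link neighborhood."""
    if keyword_hits == 0:
        return "clean"
    health = sum(1 for d in outbound_domains if d in HEALTH_DOMAINS)
    adult = sum(1 for d in outbound_domains if d in ADULT_DOMAINS)
    # "Porn sites link to porn sites, and healthcare sites link to
    # healthcare sites" -- let the link neighborhood break the tie.
    if health > adult:
        return "health"
    if adult > health:
        return "adult"
    return "needs-human-review"

# A breast-cancer page that uses the word "breast" five times but
# links to health-care sites is not classified as porn.
page_links = ["mayoclinic.org", "nih.gov", "cancer.org"]
print(classify_page(keyword_hits=5, outbound_domains=page_links))
```

A rudimentary keyword filter would block this page outright; the link context is what rescues it.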
The software works by parsing web content almost the same way a search engine does, except that instead of helping find content customers want, filters seek out content customers may want to block, including pornography and malicious software downloads. The automated filters sort content into categories such as business, entertainment, file sharing, nudity, sex and so on. Libraries and other customers then decide which categories to block, Lee says.
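The division of labor Lee describes – vendor assigns categories, customer chooses what to block – reduces to a simple set intersection. The category names below come from the article; the sample policy and helper function are invented for illustration.

```python
# The vendor categorizes pages; each customer supplies its own blocklist
# of categories. Category names follow the article; the sample policy
# and this helper are illustrative only.

ALL_CATEGORIES = {"business", "entertainment", "file sharing",
                  "nudity", "sex"}

def is_blocked(page_categories: set[str], library_policy: set[str]) -> bool:
    """Block a page if any of its categories is on the customer's list."""
    return not page_categories.isdisjoint(library_policy)

policy = {"nudity", "sex"}                      # one library's choice
print(is_blocked({"entertainment"}, policy))    # False
print(is_blocked({"sex", "business"}, policy))  # True
```

The same categorized feed can thus serve a library that blocks only pornography and a corporation that also blocks games and file sharing.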
Filters work mostly by analyzing the text on the page and other contextual clues. Although a big part of their purpose is to filter out offensive images, in most cases they don't analyze the images themselves. Techniques exist for analyzing flesh tones to determine whether an image shows too much skin, but that analysis is too processor-intensive for routine use, Lee says.
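To see why per-pixel analysis is costly, here is a rough sketch of flesh-tone detection. The RGB rule is a commonly cited skin-detection heuristic; the 50% threshold and the sample pixels are invented for illustration, and a real filter would have to run something like this over every pixel of every image fetched.

```python
# Rough illustration of flesh-tone analysis, the image technique Lee
# says is too processor-intensive for routine filtering. The RGB rule
# is a common skin-detection heuristic; the 50% cutoff is invented.

def looks_like_skin(r: int, g: int, b: int) -> bool:
    """Classic RGB rule of thumb for skin-toned pixels."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and r > g and r > b)

def skin_ratio(pixels: list[tuple[int, int, int]]) -> float:
    """Fraction of pixels falling in the flesh-tone range."""
    if not pixels:
        return 0.0
    hits = sum(1 for p in pixels if looks_like_skin(*p))
    return hits / len(pixels)

def too_much_skin(pixels: list[tuple[int, int, int]],
                  threshold: float = 0.5) -> bool:
    # Scanning every pixel of every image is the cost Lee alludes to.
    return skin_ratio(pixels) > threshold

skin = [(220, 170, 140)] * 8   # warm, skin-like tones
sky = [(80, 120, 200)] * 2     # cool blue pixels
print(too_much_skin(skin + sky))  # True: 80% of pixels in range
```

Even this crude version touches every pixel, which is why text and link analysis remain the workhorses of production filters.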
In the corporate world, the propriety of filtering out porn is rarely challenged, Lee says. There, the hotter issue is figuring out how to open up access to social media and other websites that many companies previously blocked but are now using for business purposes. Companies want to open up, while still keeping employees from wasting time on Internet games and other frivolity.
Lee also emphasizes that while his company classifies the content, its library and business customers make their own decisions about what to block and how aggressively to block it.
“Yes, the filters have gotten more sophisticated, but everyone agrees they both over-block and under-block,” says Deborah Caldwell-Stone, deputy director of the ALA’s office of intellectual freedom. And once filtering is imposed, it often goes beyond just porn. High school students find they can’t research topics like marijuana use or abortion. The practical effect is often simply that students learn how to bypass the filters using proxy servers and other hacks, she says. Better to concentrate on teaching children about responsible Internet use, she says.
The extent of the problem is often exaggerated by excitable politicians, Caldwell-Stone says. Sandy Neerman, the library director in Greensboro, says that was the case with her City Council, which cited hundreds of disciplinary incidents when only a handful were computer-related.
Neerman is hoping the City Council will agree to her plan to make filtering software standard only in the children’s department and optional elsewhere in the library. “Children up to age 17 would need a parent’s permission for an unfiltered search,” she says. “I’m trying to compromise, while still maintaining the integrity of the library.”
Elsewhere, some libraries have managed to provide adult patrons with unfiltered access, and do it with little controversy. Leslie Scherer, director of the Wallingford Public Library in Wallingford, Conn., says she would rather rely on the judgment of her staff to intervene in the rare case that someone is viewing something inappropriate. To use software filtering properly would mean creating a process for deactivating it on a case-by-case basis, she says. “I think it’s more of a staffing issue if you filter, and have to remove that filter, than it is to have to tap someone on the shoulder occasionally.”
David F. Carr is a freelance writer and former Technology Editor of Baseline. Contact him at firstname.lastname@example.org