Content-control software
From Wikipedia, the free encyclopedia
Content-control software, also known as censorware or web filtering software, is software designed to control what content a reader is permitted to access, especially when used to restrict material delivered over the Web. Content-control software determines what content will be available on a particular machine or network. The motive is often to prevent people from viewing content that the computer's owner(s) or other authorities consider objectionable; when imposed without the consent of the user, content control can constitute censorship. Common use cases include parents who wish to limit what sites their children may view from home computers, schools performing the same function for computers found at school, and employers restricting what content employees may view on the job. Some content-control software includes time-control functions that let parents set how much time a child may spend accessing the Internet, playing games, or engaging in other computer activities.
In some countries, such software is ubiquitous. In Cuba, if a computer user types a keyword associated with dissent, the word processor or browser is automatically closed and a "state security" warning is given.[1]
Terminology
This article uses the term "content control", a term also used on occasion by CNN,[2] Playboy magazine[3] and the New York Times.[4] However, two other terms, censorware and web filtering, while more controversial, are often used.
Companies that make products that selectively block Web sites do not refer to these products as censorware and prefer terms such as "Internet filter" or "URL filter"; in the specialized case of software designed to allow parents to monitor and restrict the access of their children, the term "parental control software" is also used. Some products log all sites that a user accesses and rate them by content type for reporting to an "accountability partner" of the person's choosing; for these, the term accountability software is used. Internet filters, parental control software, and accountability software may also be combined into one product.
Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example.[5] The use of the term censorware in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a 9 March 2006 editorial in the New York Times when discussing the use of American-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district.[6]
Seth Finkelstein, an anti-censorware advocate and recipient of the EFF Pioneer Award, described what he saw as a terminology battle, in a hearing at the Library of Congress in 2003:
- I think the best public relations that the censorware companies ever did was to get the word "filter" attached to their products. When you think of a spam filter, for example, you think of something that you do not want to see.
- But, again, as I said earlier, censorware is not like a spam filter. What censorware is, is an authority wants to prevent a subject under their control from viewing material that the authority has forbidden to them. This description is general.[7]
In general, outside of editorial pages as described above, traditional newspapers do not use the term censorware in their reporting, preferring instead to use terms such as content filter, content control, or web filtering; the New York Times and the Wall Street Journal both appear to follow this practice. On the other hand, Web-based newspapers such as CNET use the term in both editorial and journalistic contexts, e.g., "Windows Live to Get Censorware."[8]
Issues
Filters can be implemented in many different ways: by a software program on a personal computer or by servers providing Internet access. Choosing an Internet service provider (ISP) that blocks objectionable material before it enters the home can help parents who worry about what their children may view.
Those who believe content-control software is useful may still not agree with certain ways in which it is used, or with mandatory general regulation of information. For example, many would disapprove of filtering viewpoints on moral or political issues, on the grounds that such filtering could become a vehicle for propaganda. Many would also find it unacceptable for an ISP, whether by law or by its own choice, to deploy such software without allowing users to disable the filtering for their own connections. In addition, some argue that using content-control software may violate Articles 13 and 17 of the Convention on the Rights of the Child.[9] In 1998, a United States federal district court in Virginia ruled that the imposition of mandatory filtering in a public library violates the First Amendment of the U.S. Bill of Rights.[10]
History
As the World Wide Web rose to prominence, parents, spurred by a series of stories in the mass media, began to worry that allowing their children to use the Web might expose them to indecent material. The US Congress responded by passing the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment, and the Supreme Court ruled in their favor.[11] Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation, was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.[citation needed]
Critics then argued that while content-filtering software might make government censorship less likely, it would do so only by allowing private companies to censor as they pleased. They further argued that government encouragement of content filtering, or legal requirements for content-labeling software, would be equivalent to censorship. Although at severe risk of being sued for copyright infringement, trade secret violation, or breach of license agreement,[12] groups such as the Censorware Project began reverse-engineering content-control software and decrypting the blacklists to determine what kinds of sites the software blocked.[citation needed] They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets. One example of this tendency was the filtering of all sites containing the word "breast", on the assumption that the word could only appear in a sexual context; this blocked sites discussing breast cancer, women's clothing, and even chicken recipes. Similarly, over-zealous attempts to block the word "sex" caught words such as "Essex" and "Sussex", and one filter even blocked searches for the word "swallow". Content-control software has been cited[13] as one of the reasons Beaver College decided to change its name to Arcadia University, as such software had been blocking access to the college's Web site.
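The failure mode described above is easy to reproduce. The following minimal Python sketch shows why bare substring matching over-blocks innocent pages; the keyword list and the word-boundary refinement are illustrative assumptions, not the logic of any actual commercial filter:

```python
# A minimal sketch of naive keyword blocking, reproducing the
# over-blocking described above. The keyword list is a hypothetical
# illustration, not taken from any real product.
import re

BLOCKED_KEYWORDS = ["sex", "breast"]  # assumed blocklist for illustration

def is_blocked(text: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the text."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# False positives: innocent pages are caught because a keyword appears
# inside a longer word or in a medical/culinary context.
assert is_blocked("Visiting Essex and Sussex this summer")  # "sex" in "Essex"
assert is_blocked("Breast cancer screening guidelines")     # medical page
assert is_blocked("Roast chicken breast recipe")            # recipe page

def is_blocked_wordwise(text: str) -> bool:
    """Block only when a keyword appears as a standalone word."""
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(k) + r"\b", lowered)
               for k in BLOCKED_KEYWORDS)

# Word-boundary matching fixes "Essex"/"Sussex" but still blocks
# legitimate discussion of breast cancer: context, not spelling,
# is the hard problem.
assert not is_blocked_wordwise("Visiting Essex and Sussex this summer")
assert is_blocked_wordwise("Breast cancer screening guidelines")
```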
Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed and that therefore their claims were not valid.[14]
Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include blocking several religious sites[15][16] (including the Web site of the Vatican), many political sites, and sites about gay and lesbian issues.[17] X-Stop was shown to block sites such as the Quaker web site, the National Journal of Sexual Orientation Law, the Heritage Foundation, and parts of The Ethical Spectacle.[18] CYBERsitter blocks sites such as that of the National Organization for Women.[19] Nancy Willard, an academic researcher and attorney, reported on the close relationships between conservative Christian organizations and filtering software companies providing filters to U.S. public schools and libraries.[20] From her review of publicly available documentation, she concluded that seven of the filtering software companies were blocking Web sites on the basis of religious or other inappropriate bias.[21]
Most content-control software is marketed to organizations or parents. On occasion, however, it is also advertised as a tool for self-censorship by people struggling with Internet addiction: those preoccupied with online pornography, gambling, chat rooms, or the Internet in general may wish to prevent or restrict their own access. A number of accountability software products, such as Covenant Eyes and X3Watch, have recently appeared; many of these, marketed as self-censorship or accountability software, are promoted through churches and religious media.[22]
Microsoft's MSN Messenger silently removes all messages containing a reference to the URL of the search service Scroogle.[23] Filters may also produce false positives, blocking legitimate sites, as happened to the Horniman Museum.[24]
After the site Peacefire.org posted information about pages that various filters blocked, it was itself added to blocklists. Solid Oak Software has vowed that Peacefire's reports about CYBERsitter "will be blocked wherever they may be."[citation needed]
Content labeling
Content labeling may be considered another form of content-control software. In 1994, the Internet Content Rating Association (ICRA), now part of the Family Online Safety Institute, developed a content rating system for online content providers. Using an online questionnaire, a webmaster describes the nature of the site's content. A small file is generated containing a condensed, computer-readable digest of this description, which content-filtering software can then use to block or allow the site.
ICRA labels come in a variety of formats.[25] These include the World Wide Web Consortium's Resource Description Framework (RDF) as well as Platform for Internet Content Selection (PICS) labels used by Microsoft's Internet Explorer Content Advisor.[26]
ICRA labels are an example of self-labeling. Similarly, in 2006 the Association of Sites Advocating Child Protection (ASACP) initiated the Restricted to Adults (RTA) self-labeling initiative. ASACP members were concerned that various forms of legislation proposed in the United States would have the effect of forcing adult companies to label their content.[27] Unlike ICRA labels, the RTA label does not require webmasters to fill out a questionnaire or sign up for use; like ICRA labels, it is free. Both labels are recognized by a wide variety of content-control software.
The Voluntary Content Rating (VCR) system was devised by Solid Oak Software for their CYBERsitter filtering software, as an alternative to the PICS system, which some critics deemed too complex. It employs HTML metadata tags embedded within web page documents to specify the type of content contained in the document. Only two levels are specified, mature and adult, making the specification extremely simple.
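Self-labeling schemes of this kind are simple for a filter to consume. The Python sketch below illustrates the general approach; the meta tag name shown is an assumption for illustration, not the documented VCR syntax, and only the two levels mentioned above are recognized:

```python
# A sketch of how a filter might honor a VCR-style self-label. The meta
# tag name below is assumed for illustration; the real CYBERsitter VCR
# syntax may differ. Only the two levels described above (mature, adult)
# are recognized.
from html.parser import HTMLParser

class VCRLabelParser(HTMLParser):
    """Extract a hypothetical VCR rating from a page's <meta> tags."""

    def __init__(self):
        super().__init__()
        self.rating = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = {name: (value or "") for name, value in attrs}
            # Assumed tag name, used here purely for illustration.
            if attr_map.get("name", "").lower() == "voluntary content rating":
                self.rating = attr_map.get("content", "").lower()

def should_block(page_html: str, allow_mature: bool = False) -> bool:
    """Always block 'adult' pages; block 'mature' unless allowed."""
    parser = VCRLabelParser()
    parser.feed(page_html)
    if parser.rating == "adult":
        return True
    if parser.rating == "mature":
        return not allow_mature
    return False  # unlabeled pages pass through in this sketch

page = ('<html><head>'
        '<meta name="voluntary content rating" content="mature">'
        '</head></html>')
print(should_block(page))                     # True: mature blocked by default
print(should_block(page, allow_mature=True))  # False: mature allowed
```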
Use in public libraries
USA
The use of Internet filters or content-control software varies widely in public libraries in the United States, since Internet use policies are established by the local library board. Many libraries adopted Internet filters after Congress conditioned the receipt of universal service discounts on the use of Internet filters through the Children's Internet Protection Act (CIPA). Other libraries do not install content control software, believing that acceptable use policies and educational efforts address the issue of children accessing age-inappropriate content while preserving adult users' right to freely access information. Some libraries use Internet filters on computers used by children only. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian; libraries that are subject to CIPA are required to have a policy that allows adults to request that the filter be disabled without having to explain the reason for their request.
Many legal scholars believe that a number of legal cases, in particular Reno v. American Civil Liberties Union, established that the use of content-control software in libraries is a violation of the First Amendment.[28] However, in the June 2003 case United States v. American Library Association, the Supreme Court found the Children's Internet Protection Act (CIPA) constitutional as a condition placed on the receipt of federal funding, stating that First Amendment concerns were dispelled by the law's provision allowing adult library users to have the filtering software disabled without having to explain the reasons for their request. The plurality decision left open a future "as-applied" constitutional challenge, however. In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State for its policy of refusing to disable restrictions upon the request of adult patrons.[29]
In March 2007, Virginia passed a law similar to CIPA that requires public libraries receiving state funds to use content-control software. Like CIPA, the law requires libraries to disable filters for an adult library user when requested to do so by the user.[30]
Australia
The Australian Internet Safety Advisory Body has information about "practical advice on Internet safety, parental control and filters for the protection of children, students and families" that also includes public libraries.[31]
NetAlert, the software made available free of charge by the Australian government, was allegedly cracked by a 16-year-old student, Tom Wood, less than a week after its release in August 2007. Wood reportedly bypassed the $84 million filter in about half an hour to highlight problems with the government's approach to Internet content filtering.[32]
The Australian Government has introduced legislation requiring ISPs to "restrict access to age restricted content (commercial MA15+ content and R18+ content) either hosted in Australia or provided from Australia"; the scheme, known as Cleanfeed, was due to commence on 20 January 2008.[33]
Cleanfeed is a proposed mandatory ISP-level content filtration system. It was proposed by the Beazley-led Australian Labor Party opposition in a 2006 press release, with the intention of protecting children who were vulnerable due to claimed parental computer illiteracy. It was announced on 31 December 2007 as a policy to be implemented by the Rudd ALP government, and initial tests in Tasmania produced a 2008 report. Cleanfeed is funded in the current budget and is moving towards an Expression of Interest for live testing with ISPs in 2008. Public opposition and criticism have emerged, led by the EFA and gaining irregular mainstream media attention, with a majority of Australians reportedly "strongly against" its implementation.[34] Criticisms include its expense, its inaccuracy (it will be impossible to ensure that only illegal sites are blocked), and the fact that it will be compulsory, which can be seen as an intrusion on free speech rights.[35] Cleanfeed falls within Senator Conroy's portfolio.
Denmark
In Denmark, the stated policy is to "prevent inappropriate Internet sites from being accessed from children's libraries across Denmark."[36] "'It is important that every library in the country has the opportunity to protect children against pornographic material when they are using library computers. It is a main priority for me as Culture Minister to make sure children can surf the net safely at libraries,' states Brian Mikkelsen in a press-release of the Danish Ministry of Culture."[37]
Bypassing filters
Some software may be bypassed successfully by using alternative protocols such as FTP, telnet, or HTTPS, by conducting searches in a different language, or by using a proxy server or a circumventor such as Psiphon. Cached web pages returned by Google or other search engines can also bypass some controls, and web syndication services may provide alternate paths to content. Some of the more poorly designed programs can be shut down by killing their processes: for example, in Microsoft Windows through the Windows Task Manager, or in Mac OS X using Activity Monitor. Numerous workarounds, and counters to workarounds from content-control software creators, exist. Google services are often blocked by filters, but these blocks can most often be bypassed by using https:// in place of http://. The program Ultrasurf, designed to bypass the Chinese firewall, is considered one of the most advanced circumvention proxies.
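The HTTPS workaround mentioned above exploits filters that match full URLs including the scheme. A minimal Python sketch (the URLs and blocklists are placeholders, not drawn from any real product) shows the difference between scheme-sensitive and host-based blocking:

```python
# A sketch of why scheme-sensitive blacklists are easy to bypass. A
# filter that blocks exact http:// URLs misses the same resource fetched
# over https://; comparing normalized host names is more robust. The
# URLs and blocklists below are placeholders for illustration only.
from urllib.parse import urlparse

BLOCKED_URLS = {"http://example.com/search"}  # naive, scheme-sensitive
BLOCKED_HOSTS = {"example.com"}               # scheme-agnostic

def blocked_by_url(url: str) -> bool:
    """Exact-match blocking: trivially bypassed by switching schemes."""
    return url in BLOCKED_URLS

def blocked_by_host(url: str) -> bool:
    """Host-based blocking: unaffected by the http/https switch."""
    host = (urlparse(url).hostname or "").lower()
    if host.startswith("www."):
        host = host[4:]  # treat www.example.com and example.com alike
    return host in BLOCKED_HOSTS

print(blocked_by_url("http://example.com/search"))    # True: blocked
print(blocked_by_url("https://example.com/search"))   # False: bypassed
print(blocked_by_host("https://example.com/search"))  # True: still blocked
```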
Many content filters have an option which allows authorized people to bypass the content filter. This is especially useful in environments where the computer is being supervised and the content filter is aggressively blocking Web sites that need to be accessed.
Content-control software products
As described above, many content-control software products, and the concept of content control in general, can be controversial, especially in government-funded services or services that are not age-restricted. Many ISPs offer parental control options, among them Charter Communications, EarthLink, Yahoo!, and AOL, and more general software such as Norton Internet Security includes "parental controls". Mac OS X v10.4 offers parental controls for several applications (Mail, Finder, iChat, Safari, and Dictionary). Microsoft's Windows Vista operating system also includes content-control software. See List of content-control software for more.
Gateway-based content-control solutions
Increasingly, companies looking to implement content control deploy security appliances at the network gateway, such as those from proprietary vendors like St Bernard Software, Bloxx,[38] CensorNet,[39] and Kerio WinRoute Firewall, or a content-filtering web proxy such as SafeSquid. Untangle is an open-source alternative for filtering web content at the gateway. See the Content-control software category for a number of articles on content-control software products.
Gateway-based content-control solutions may be more difficult to bypass than desktop software solutions, since they are less easily removed or disabled by the local user.
See also
Look up censorware in Wiktionary, the free dictionary.
- Accountability software
- List of content-control software
- Internet censorship
- Internet pornography
- Internet safety
- Censorship
- Cleanfeed
- Image retrieval
- Geolocation
- Geolocation software
- Computer surveillance
- Parental controls
- Great Firewall of China
- Scieno Sitter
- Wordfilter
- WebMinder
- Nannyware
- Netsentron
References
- ^ "Going online in Cuba: Internet under surveillance". Reporters Without Borders. 2006. http://www.rsf.org/IMG/pdf/rapport_gb_md_1.pdf.
- ^ [1]
- ^ [2], the San Francisco Chronicle [3]
- ^ [4]
- ^ Censorware Project
- ^ [5]
- ^ [6]
- ^ [7]
- ^ Convention on the Rights of the Child.
- ^ Mainstream Loudoun v. Board of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552 (E.D. Va. 1998)
- ^ ruling transcript
- ^ Microsystems v. Scandinavia Online
- ^ [8] Slashdot article
- ^ [9] National Academies whitepaper
- ^ [10]
- ^ [11]
- ^ [12]
- ^ [13]
- ^ [14]
- ^ See: Filtering Software: The Religious Connection.
- ^ [15]
- ^ [16]
- ^ [17]
- ^ [18]
- ^ "ICRA: Technical standards used". FOSI. http://www.fosi.org/icra/#tech. Retrieved on 2008-07-04.
- ^ "Browse the Web with Internet Explorer 6 and Content Advisor". Microsoft. March 26, 2003. http://www.microsoft.com/windows/ie/ie6/using/howto/security/contentadv/config.mspx.
- ^ "ASACP Participates in Financial Coalition Against Child Pornography". November 20, 2007. http://www.asacp.org/page.php?content=news&item=511. Retrieved on 2008-07-04.
- ^ Wallace, Jonathan D. (November 9, 1997). "Purchase of blocking software by public libraries is unconstitutional". http://www.spectacle.org/cs/library.bak.
- ^ "ACLU Suit Seeks Access to Information on Internet for Library Patrons". ACLU of Washington. November 16, 2006. http://www.aclu-wa.org/detail.cfm?id=557.
- ^ Sluss, Michael (March 23, 2007). "Kaine signs library bill: The legislation requires public libraries to block obscene material with Internet filters". The Roanoke Times. http://www.roanoke.com/politics/wb/wb/xp-109919.
- ^ NetAlert
- ^ [19]
- ^ [20]
- ^ [21]
- ^ [22]
- ^ [23]
- ^ [24]
- ^ Bloxx
- ^ CensorNet
External links
- Peacefire: Open Access for the Net Generation
- The Censorware Project: Exposing the Secrets of Censorware Since 1997
- Seth Finkelstein's Anticensorware Investigation - Censorware Exposed
- Restricted to Adults Label Self-Labeling Initiative by Adult Webmasters
- Filtering Facts
- Discussion of global net censorship, Berkman Center for Internet & Society, Harvard, March 2008
- Global survey of Web filtering & blocking: Rebecca MacKinnon at Web 2.0 Summit, November 2008