Content-control software

Image caption: A content filter blocking access to jailbreakme.com, a site providing a series of jailbreaks for Apple's iOS mobile operating system, at Raffles Institution. See JailbreakMe.

Content-control software, also known as censorware or web filtering software, is software designed to control what content a user is permitted to access, especially material delivered over the Web. It determines which content will be available and which will be blocked.

The restrictions can be applied at various levels: a government can attempt to apply them nationwide (see Internet censorship), or they can, for example, be applied by an ISP to its clients, by an employer to its personnel, by a school to its students, by a library to its visitors, by a parent to a child's computer, or by an individual user to his or her own computer.

The motive is often to prevent people from viewing content which the computer's owner(s) or other authorities may consider objectionable; when imposed without the consent of the user, content control can constitute censorship. Some content-control software includes time-control functions that let parents set the amount of time a child may spend accessing the Internet, playing games, or performing other computer activities.

In some countries, such software is ubiquitous. In Cuba, if a computer user at a government controlled Internet cafe types certain words, the word processor or browser is automatically closed, and a "state security" warning is given.[1]

Terminology

This article uses the term "content control", a term also used on occasion by CNN,[2] Playboy magazine,[3] the San Francisco Chronicle,[4] and the New York Times.[5] Two other terms, censorware and web filtering, are often used as well, though they are more controversial. Nannyware has also been used, both in product marketing and by the media.

Companies that make products that selectively block Web sites do not refer to these products as censorware, preferring terms such as "Internet filter" or "URL filter"; in the specialized case of software designed to allow parents to monitor and restrict their children's access, the term "parental control software" is also used. Some products log all sites that a user accesses and rate them by content type for reporting to an "accountability partner" of the person's choosing; for these the term accountability software is used. Internet filters, parental control software, and accountability software may also be combined into one product.

Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example.[6] The use of the term censorware in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a 9 March 2006 editorial in the New York Times when discussing the use of American-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district.[7][8]

In general, outside of editorial pages as described above, traditional newspapers do not use the term censorware in their reporting, preferring instead to use terms such as content filter, content control, or web filtering; the New York Times and the Wall Street Journal both appear to follow this practice. On the other hand, Web-based newspapers such as CNET use the term in both editorial and journalistic contexts, for example "Windows Live to Get Censorware."[9]

Types of filtering

Filters can be implemented in many ways: by a software program on a personal computer or by servers providing Internet access. Choosing an Internet service provider (ISP) that blocks objectionable material before it enters the home can help parents who worry about their children viewing such content.

Most content-control software is marketed to organizations or parents. It is, however, also advertised on occasion as a tool for self-censorship by people struggling with Internet addiction. Those preoccupied with online pornography, gambling, chat rooms, or the Internet in general may wish to prevent or restrict their own access. A number of such products are marketed as self-censorship or accountability software and are promoted through churches and religious media.[10]

Opinions on whether, and when, the use of this software is moral (and in some cases legal) vary widely; the same person may strongly favor and strongly oppose the same software depending on the scenario in which it is used.

Client-side filters:[11] This type of filter is installed as software on the individual computer. It can be customized to meet a family's needs and can be disabled only by someone who knows the password. Client-side filters work well in settings such as libraries, where only some access points need to be filtered, for instance the computers in the children's reading and play section.

Content-limited (or filtered) ISPs: These are Internet service providers that offer access to only a selected portion of Internet content; everyone who subscribes to this type of service is subject to the same restrictions. Services of this kind designed for children offer only web pages that have been carefully reviewed and assessed for appropriateness and safety, with content, entertainment, and education as key criteria. Chat room and bulletin board rules are strictly enforced, and those who break them lose access; email and instant messages can be exchanged only with approved parties.

Content-limited ISPs intended for both children and adults allow access to almost everything except what the ISP considers inappropriate; limits on or monitoring of email are rare or absent. Some ISPs provide both filtered and unfiltered access, with the filtered access controlled through parental controls and passwords.

Server-side filters: These are widely used in institutional settings, such as schools and library systems. All users are subject to the access policy defined by the institution. The filtering is provided either by the institution's ISP or by the institution itself and can be customized, so that, for example, a school district's high school library can have a different filtering profile from the district's junior high school library.

Search-engine filters: Many search engines, such as Google and AltaVista, offer users the option of turning on a safety filter. When this filter is activated, it removes inappropriate links from the search results. It does not, however, prevent access to a page whose address is already known: anyone who knows the actual URL of a website featuring sexually explicit or 18+ content can reach it without using a search engine. Engines such as Lycos, Yahoo, and Bing offer kid-oriented versions that return only child-friendly websites.[12]
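As an illustration, the short Python sketch below builds a search URL with Google's documented SafeSearch parameter ("safe=active") turned on. Other engines use their own parameter names, and the exact parameters should be treated as an assumption subject to change.

    # Minimal sketch: requesting a filtered ("safe") search via a URL parameter.
    # Google documents "safe=active"; other engines use different parameters.
    from urllib.parse import urlencode

    def safe_search_url(query: str) -> str:
        """Build a Google search URL with SafeSearch requested."""
        params = {"q": query, "safe": "active"}  # "off" would disable the filter
        return "https://www.google.com/search?" + urlencode(params)

    print(safe_search_url("health education"))
    # https://www.google.com/search?q=health+education&safe=active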

Pros and cons

Overblocking: A filter is said to overblock when it blocks more content than intended. Overblocking can filter out important and acceptable material, such as health-related information. Even so, many parents accept some overblocking as the price of preventing exposure to other sites.

Underblocking: Filters can also underblock,[13] failing to catch content they are meant to block, particularly when new material is uploaded to the Internet faster than those responsible for the filter update its block lists.

Default settings: Out of the box, many filters block entire categories of content, including sexual images, nudity, artwork showing nudity, sex education, violence, weapons, drugs, alcohol, abortion, and homosexuality, among others.[14]
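A toy example makes overblocking and underblocking concrete. The Python sketch below, which uses a deliberately naive, made-up keyword blocklist, blocks any page whose text contains a listed word: it wrongly blocks a breast-cancer health page (overblocking) while missing an objectionable page that avoids the listed words (underblocking).

    # Illustrative only: a naive keyword filter with a tiny, made-up blocklist.
    BLOCKED_WORDS = {"breast", "casino"}

    def is_blocked(page_text: str) -> bool:
        """Block the page if any blocklisted word appears in its text."""
        words = page_text.lower().split()
        return any(w in BLOCKED_WORDS for w in words)

    # Overblocking: a legitimate health page is caught by the keyword match.
    print(is_blocked("Early screening for breast cancer saves lives"))  # True

    # Underblocking: an objectionable page avoiding the listed words slips through.
    print(is_blocked("Online slots and poker, play now"))               # False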

Criticism

Morality and opinion

Many[15] disapprove of governments filtering viewpoints on moral or political issues, arguing that such filtering can become a vehicle for propaganda. Many[16] also find it unacceptable for an ISP, whether by law or by its own choice, to deploy such software without allowing users to disable the filtering for their own connections.

Critics argue that content-filtering software allows private companies to censor as they please. (See Religious, anti-religious, and political censorship, below.) They further argue that government encouragement of content filtering, or legal requirements for content-labeling software, would be equivalent to censorship.[citation needed]

Others compare content control to censorship, claiming that limiting the content one can view resembles the way communist governments limit what their citizens can view in order to promote a single ideology, and argue that content-filtering software is open to abuse.[citation needed]

Legal actions

In 1998, a United States federal district court in Virginia ruled that the imposition of mandatory filtering in a public library violates the First Amendment of the U.S. Bill of Rights.[17]

In 1996 the US Congress passed the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment and in 1997 the Supreme Court ruled in their favor.[18] Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation, was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.[citation needed]

In the late 1990s, groups such as the Censorware Project began reverse-engineering the content-control software and decrypting the blacklists to determine what kind of sites the software blocked. This led to legal action alleging violation of the "Cyber Patrol" license agreement.[19] They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets. (See Over-zealous filtering, below).

Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed and that therefore their claims were not valid.[20]

Over-zealous filtering

Content-control software has been reported to block access to Beaver College before its name change to Arcadia University.[21] Another example was the filtering of the Horniman Museum's website.[22]

Religious, anti-religious, and political censorship

Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include blocking several religious sites[23][24] (including the Web site of the Vatican), many political sites, and sites about gay and lesbian issues.[25] X-Stop was shown to block sites such as the Quaker web site, the National Journal of Sexual Orientation Law, the Heritage Foundation, and parts of The Ethical Spectacle.[26] CYBERsitter blocks sites such as that of the National Organization for Women.[27] Nancy Willard, an academic researcher and attorney, pointed out that many U.S. public schools and libraries use the same filtering software that many Christian organizations use.[28] Cyber Patrol, a product developed by the Anti-Defamation League and Mattel's The Learning Company,[29] has been found to block not only political sites that it deems to be engaging in 'hate speech' but also human-rights web sites, such as Amnesty International's web page about Israel, and gay-rights web sites, such as glaad.org.[30]

Content labeling

Content labeling may be considered another form of content-control software. The Internet Content Rating Association (ICRA), now part of the Family Online Safety Institute, developed a content rating system for online content providers. Using an online questionnaire, a webmaster describes the nature of the site's content. A small file is then generated containing a condensed, computer-readable digest of this description, which content-filtering software can use to block or allow the site.

ICRA labels come in a variety of formats.[31] These include the World Wide Web Consortium's Resource Description Framework (RDF) as well as Platform for Internet Content Selection (PICS) labels used by Microsoft's Internet Explorer Content Advisor.[32]

ICRA labels are an example of self-labeling. Similarly, in 2006 the Association of Sites Advocating Child Protection (ASACP) initiated the Restricted to Adults (RTA) self-labeling initiative. ASACP members were concerned that various forms of legislation being proposed in the United States would have the effect of forcing adult companies to label their content.[33] The RTA label, unlike ICRA labels, does not require a webmaster to fill out a questionnaire or sign up to use it. Like ICRA labels, the RTA label is free. Both labels are recognized by a wide variety of content-control software.

The Voluntary Content Rating (VCR) system was devised by Solid Oak Software for its CYBERsitter filtering software as an alternative to the PICS system, which some critics deemed too complex. It employs HTML metadata tags embedded within web page documents to specify the type of content contained in the document. Only two levels, mature and adult, are specified, making the specification extremely simple.
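As a rough sketch of how filtering software can honor such self-labels, the Python snippet below scans a page's HTML for rating meta tags. The RTA label string is the one published by ASACP; the VCR meta-tag name used here is an assumption for illustration only, since the exact syntax is defined in CYBERsitter's documentation.

    # Sketch: detect self-labeling meta tags in an HTML page.
    # The RTA label string is the published one; the "voluntary content rating"
    # tag name is an assumption used purely for illustration.
    from html.parser import HTMLParser

    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    class LabelScanner(HTMLParser):
        """Collects self-labeling signals from meta tags in an HTML document."""

        def __init__(self):
            super().__init__()
            self.adult_labeled = False

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            content = (attrs.get("content") or "").lower()
            if RTA_LABEL.lower() in content:
                self.adult_labeled = True          # RTA self-label found
            if name == "voluntary content rating" and content in ("mature", "adult"):
                self.adult_labeled = True          # assumed VCR-style tag

    scanner = LabelScanner()
    scanner.feed('<meta name="RATING" content="RTA-5042-1996-1400-1577-RTA">')
    print(scanner.adult_labeled)  # True: a filter could block or flag this page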

Use in public libraries

US

The use of Internet filters or content-control software varies widely in public libraries in the United States, since Internet use policies are established by the local library board. Many libraries adopted Internet filters after Congress conditioned the receipt of universal service discounts on the use of Internet filters through the Children's Internet Protection Act (CIPA). Other libraries do not install content control software, believing that acceptable use policies and educational efforts address the issue of children accessing age-inappropriate content while preserving adult users' right to freely access information. Some libraries use Internet filters on computers used by children only. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian; libraries that are subject to CIPA are required to have a policy that allows adults to request that the filter be disabled without having to explain the reason for their request.

Many legal scholars believe that a number of legal cases, in particular Reno v. American Civil Liberties Union, established that the use of content-control software in libraries is a violation of the First Amendment.[34] However, in the June 2003 case United States v. American Library Association, the Supreme Court found the Children's Internet Protection Act constitutional as a condition placed on the receipt of federal funding, stating that First Amendment concerns were dispelled by the law's provision allowing adult library users to have the filtering software disabled without having to explain their reasons. The plurality decision left open a future "as-applied" constitutional challenge, however.

In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State over its policy of refusing to disable restrictions at the request of adult patrons, but CIPA was not challenged in that matter.[35] In May 2010, the Washington State Supreme Court issued an opinion after being asked to certify a question referred by the United States District Court for the Eastern District of Washington: "Whether a public library, consistent with Article I, § 5 of the Washington Constitution, may filter Internet access for all patrons without disabling Web sites containing constitutionally-protected speech upon the request of an adult library patron." The court ruled that NCRL's Internet filtering policy did not violate Article I, Section 5 of the Washington State Constitution, saying: "It appears to us that NCRL's filtering policy is reasonable and accords with its mission and these policies and is viewpoint neutral. It appears that no article I, section 5 content-based violation exists in this case. NCRL's essential mission is to promote reading and lifelong learning. As NCRL maintains, it is reasonable to impose restrictions on Internet access in order to maintain an environment that is conducive to study and contemplative thought." The case then returned to federal court.

In March 2007, Virginia passed a law similar to CIPA that requires public libraries receiving state funds to use content-control software. Like CIPA, the law requires libraries to disable filters for an adult library user when requested to do so by the user.[36]

Australia

The Australian Internet Safety Advisory Body has information about "practical advice on Internet safety, parental control and filters for the protection of children, students and families" that also includes public libraries.[37]

NetAlert, the software made available free of charge by the Australian government, was allegedly cracked by a 16-year-old student, Tom Wood, less than a week after its release in August 2007. Wood supposedly bypassed the $84 million filter in about half an hour to highlight problems with the government's approach to Internet content filtering.[38]

The Australian government has introduced legislation requiring ISPs to "restrict access to age restricted content (commercial MA15+ content and R18+ content) either hosted in Australia or provided from Australia", due to commence on 20 January 2008; the scheme is known as Cleanfeed.[39]

Cleanfeed is a proposed mandatory ISP-level content filtering system. It was put forward by the Beazley-led Australian Labor Party opposition in a 2006 press release, with the intention of protecting children deemed vulnerable because of claimed parental computer illiteracy. It was announced on 31 December 2007 as a policy to be implemented by the Rudd ALP government, and initial tests in Tasmania produced a 2008 report. Cleanfeed is funded in the current budget and is moving towards an Expression of Interest for live testing with ISPs in 2008. Public opposition and criticism have emerged, led by the EFA and gaining irregular mainstream media attention, with a majority of Australians reportedly "strongly against" its implementation.[40] Criticisms include its expense, its inaccuracy (it will be impossible to ensure that only illegal sites are blocked), and the fact that it will be compulsory, which can be seen as an intrusion on free-speech rights.[40] Another major criticism is that although the filter is claimed to stop certain materials, the underground rings dealing in such materials will not be affected. The filter might also provide a false sense of security for parents, who might supervise children less while they use the Internet, achieving the exact opposite effect. Cleanfeed is a responsibility of Senator Conroy's portfolio.

Denmark

In Denmark the stated policy is to "prevent inappropriate Internet sites from being accessed from children's libraries across Denmark."[41] "'It is important that every library in the country has the opportunity to protect children against pornographic material when they are using library computers. It is a main priority for me as Culture Minister to make sure children can surf the net safely at libraries,' states Brian Mikkelsen in a press-release of the Danish Ministry of Culture."[42]

Bypassing filters

Content filtering in general can "be bypassed entirely by tech-savvy individuals."[43] No blocking technique can "...guarantee that users won't eventually be able to find a way around the filter."[43]

Some software may be bypassed by using alternative protocols such as FTP, telnet, or HTTPS, by conducting searches in a different language, or by using a proxy server or a circumventor such as Psiphon. Cached web pages returned by Google or other search engines can also bypass some controls, and Web syndication services may provide alternate paths to content. Some of the more poorly designed programs can be shut down by killing their processes: for example, in Microsoft Windows through the Windows Task Manager, or in Mac OS X using Force Quit or Activity Monitor. Numerous workarounds, and counters to workarounds from content-control software creators, exist. Google services are often blocked by filters, but these blocks can frequently be bypassed by using https:// in place of http://, since content-filtering software is unable to interpret content sent over secure (SSL/TLS) connections.
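To illustrate the last point, the Python sketch below (purely illustrative, with a hypothetical keyword blocklist) shows why payload inspection works against plaintext HTTP but not against an encrypted connection: the plaintext request exposes the full URL path, while a TLS record is opaque bytes to an on-path inspector.

    # Illustrative only: why payload inspection fails once traffic is encrypted.
    BLOCKED_PATH_KEYWORDS = [b"/casino/", b"/adult/"]   # hypothetical blocklist

    def payload_matches_blocklist(payload: bytes) -> bool:
        """Return True if a blocked keyword is visible in the raw bytes."""
        return any(kw in payload for kw in BLOCKED_PATH_KEYWORDS)

    http_request = b"GET /casino/slots HTTP/1.1\r\nHost: example.com\r\n\r\n"
    tls_record = bytes.fromhex("170303002a") + b"\x8f" * 42  # stand-in ciphertext

    print(payload_matches_blocklist(http_request))  # True: the path is readable
    print(payload_matches_blocklist(tls_record))    # False: ciphertext reveals nothing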

Many content filters have an option which allows authorized people to bypass the content filter. This is especially useful in environments where the computer is being supervised and the content filter is aggressively blocking Web sites that need to be accessed.[citation needed]

An encrypted VPN can also be used as a means of bypassing content-control software, especially if the software is installed on an Internet gateway or firewall.

Products and services

Some ISPs offer parental control options. Some offer security software which includes parental controls. Mac OS X v10.4 offers parental controls for several applications (Mail, Finder, iChat, Safari & Dictionary). Microsoft's Windows Vista operating system also includes content-control software.

Content-filtering technology exists in two major forms: application gateways and packet inspection. For HTTP access the application gateway is called a web proxy, or just a proxy. Such web proxies can inspect both the initial request and the returned web page using arbitrarily complex rules, and they do not return any part of the page to the requester until a decision is made; in addition, they can substitute all or part of the returned result. Packet-inspection filters do not initially interfere with the connection to the server but inspect the data in the connection as it goes past; at some point the filter may decide that the connection is to be filtered, and it then disconnects it by injecting a TCP reset or a similar faked packet. The two techniques can be combined: the packet filter monitors a link until it sees an HTTP connection starting to an IP address that hosts content needing filtering, then redirects that connection to the web proxy, which performs detailed filtering on the website without all unfiltered connections having to pass through the proxy. This combination is quite popular because it can significantly reduce the cost of the system.
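As a rough sketch of the application-gateway approach, the Python example below implements a minimal filtering web proxy for plain HTTP: it checks each requested host against a blocklist, returns a block page for matches, and otherwise fetches and relays the page. The blocklist, port, and block-page text are assumptions; a production proxy would also handle HTTPS (CONNECT), relay headers faithfully, cache, and support per-group filtering profiles.

    # Minimal sketch of an application-gateway (web-proxy) content filter.
    # Assumptions: plain-HTTP traffic only, a tiny hard-coded blocklist, port 8080.
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
    from urllib.parse import urlsplit
    from urllib.request import urlopen

    BLOCKED_HOSTS = {"blocked.example.com"}      # hypothetical blocklist

    class FilteringProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # A proxy-style request line carries the absolute URL in self.path.
            host = urlsplit(self.path).hostname or ""
            if host in BLOCKED_HOSTS:
                self._reply(403, b"<h1>Blocked by content filter</h1>")
                return
            # Not blocked: fetch the page upstream and relay the body unchanged
            # (upstream headers are not relayed in this simplified sketch).
            with urlopen(self.path) as upstream:
                self._reply(200, upstream.read())

        def _reply(self, status, body):
            self.send_response(status)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8080), FilteringProxy).serve_forever()

A browser configured to use localhost:8080 as its HTTP proxy would then have every plain-HTTP request pass through this check; the packet-inspection approach described above instead watches traffic passively and injects a reset only when it sees a match.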

Gateway-based content control software may be more difficult to bypass than desktop software as the user does not have physical access to the filtering device. However, many of the techniques in the Bypassing filters section still work.

References

  1. ^ "Going online in Cuba: Internet under surveillance". Reporters Without Borders. 2006. http://www.rsf.org/IMG/pdf/rapport_gb_md_1.pdf. 
  2. ^ "Young, angry ... and wired - May 3, 2005". Edition.cnn.com. 2005-05-03. http://edition.cnn.com/2005/WORLD/asiapcf/04/27/eyeonchina.internet/. Retrieved 2009-10-25. 
  3. ^ Umstead, R. Thomas (2006-05-21). "Playboy Preaches Control". Multichannel News. http://www.multichannel.com/article/CA6336423.html. Retrieved 22 November 2010. 
  4. ^ "?". http://www.google.com/search?q=cache:u6Golw6fCu8J:www.sfgate.com/cgi-bin/article.cgi%3Ffile%3D/news/archive/2002/10/25/financial1400EDT0155.DTL%26type%3Dprintable+internet+%22content+control%22+site:sfgate.com. [dead link]
  5. ^ Bickerton, Derek (1997-11-30). "Digital Dreams - The". New York Times. http://query.nytimes.com/gst/fullpage.html?res=9403E1D8143BF933A05752C1A961958260. Retrieved 2009-10-25. 
  6. ^ "Censorware Project". censorware.net. http://censorware.net/. [dead link]
  7. ^ "?". http://159.54.226.83/apps/pbcs.dll/article?AID=/20060319/COLUMN0203/603190309/1064. [dead link]
  8. ^ "DMCA 1201 Exemption Transcript, April 11 - Censorware". Sethf.com. http://sethf.com/anticensorware/hearing_dc.php. Retrieved 2009-10-25. 
  9. ^ "Windows Live to get censorware - ZDNet.co.uk". News.zdnet.co.uk. 2006-03-14. http://news.zdnet.co.uk/software/windows/0,39020396,39257292,00.htm. Retrieved 2009-10-25. 
  10. ^ PostBlog. "SafeFamilies.org | Accountability Software: Encyclopedia of Urban Ministry". Urbanministry.org. http://www.urbanministry.org/wiki/accountability-software. Retrieved 2009-10-25. 
  11. ^ "Client-side filters". nap.edu. http://www.nap.edu/netsafekids/pro_fm_filter.html. 
  12. ^ "Filtering". nap.edu. http://www.nap.edu/netsafekids/pro_fm_filter.html. Retrieved 22 November 2010. 
  13. ^ Stark, Philip B. (November 10, 2007). "The Effectiveness of Internet Content Filters" (PDF). University of California, Berkeley. http://www.stat.berkeley.edu/~stark/Preprints/filter07.pdf. Retrieved 22 November 2010. 
  14. ^ Stark, Philip B. (November 10, 2007). "The Effectiveness of Internet Content Filters". University of California, Berkeley. http://www.stat.berkeley.edu/~stark/Preprints/filter07.pdf. Retrieved 22 November 2010. 
  15. ^ Lui, Spandas (23 March 2010). "Microsoft, Google and Yahoo! speak out in ISP filter consultation". arnnet.com. http://www.arnnet.com.au/article/340550/microsoft_google_yahoo_speak_isp_filter_consultation/?fp=16&fpid=1. Retrieved 22 November 2010. 
  16. ^ "Net giants query Australia filter". BBC News. 2010-02-16. http://news.bbc.co.uk/2/hi/technology/8517829.stm. Retrieved 2010-04-30. 
  17. ^ "Mainstream Loudon v. Board of Trustees of the Loudon County Library, 24 F. Supp. 2d 552 (E.D. Va. 1998)". Tomwbell.com. http://www.tomwbell.com/NetLaw/Ch04/Loudoun.html. Retrieved 2009-10-25. 
  18. ^ "Tuling transcript". CNN. http://www.cnn.com/US/9703/cda.scotus/transcript.ruling.html. [dead link]
  19. ^ "Microsystems v Scandinavia Online". Electronic Frontier Foundation. http://www.eff.org/legal/cases/Microsystems_v_Scandinavia_Online/?f=20000316_verif_complaint.html. Retrieved 2009-10-25. 
  20. ^ "National Academies whitepaper". nationalacademies.org. Archived from the original on 2006-04-19. http://web.archive.org/web/20060419190143/http://www7.nationalacademies.org/itas/whitepaper_1.html. 
  21. ^ "Web Censors Prompt College To Consider Name Change". slashdot.org. http://slashdot.org/article.pl?sid=00/03/01/2230240&tid=146. Retrieved 22 November 2010. 
  22. ^ "Porn filters have a field day on Horniman Museum". theregister.co.uk. http://www.theregister.co.uk/2004/10/08/horniman_museum_filtered/. 
  23. ^ Kelly Wilson (2008-11-06). "Hometown Has Been Shutdown - People Connection Blog: AIM Community Network". Hometown.aol.com. http://hometown.aol.com/Mjolnir13/test.htm. Retrieved 2009-10-25. 
  24. ^ "Notice!!". Members.tripod.com. http://members.tripod.com/~Trifold/NOTICE.html. Retrieved 2009-10-25. 
  25. ^ "?". http://www.glaad.org/media/archive_detail.php?id=103&. [dead link]
  26. ^ "The Mind of a Censor". Spectacle.org. http://www.spectacle.org/cs/burt.html. Retrieved 2009-10-25. 
  27. ^ "CYBERsitter: Where do we not want you to go today?". Spectacle.org. http://www.spectacle.org/alert/peace.html. Retrieved 2009-10-25. 
  28. ^ "See: Filtering Software: The Religious Connection". Csriu.org. http://www.csriu.org/onlinedocs/documents/religious2.html. Retrieved 2009-10-25. 
  29. ^ "See: ADL and The Learning Company Develop Educational Software". adl.org. 
  30. ^ "See: Cyber Patrol Examined". peacefire.org. http://www.peacefire.org/censorware/Cyber_Patrol/. Retrieved 2011-08-26. 
  31. ^ "ICRA: Technical standards used". FOSI. http://www.fosi.org/icra/#tech. Retrieved 2008-07-04. 
  32. ^ "Browse the Web with Internet Explorer 6 and Content Advisor". Microsoft. March 26, 2003. http://www.microsoft.com/windows/ie/ie6/using/howto/security/contentadv/config.mspx. 
  33. ^ "ASACP Participates in Financial Coalition Against Child Pornography". November 20, 2007. http://www.asacp.org/page.php?content=news&item=511. Retrieved 2008-07-04. 
  34. ^ Wallace, Jonathan D. (November 9, 1997). "Purchase of blocking software by public libraries is unconstitutional". http://www.spectacle.org/cs/library.bak. 
  35. ^ "ACLU Suit Seeks Access to Information on Internet for Library Patrons". ACLU of Washington. November 16, 2006. http://www.aclu-wa.org/detail.cfm?id=557. 
  36. ^ Sluss, Michael (March 23, 2007). "Kaine signs library bill: The legislation requires public libraries to block obscene material with Internet filters". The Roanoke Times. http://www.roanoke.com/politics/wb/wb/xp-109919. 
  37. ^ "NetAlert". NetAlert. 2009-06-30. http://www.netalert.gov.au/. Retrieved 2009-10-25. 
  38. ^ Higginbottom, Nick (2007-08-26). "Top stories". News.com.au. http://www.news.com.au/story/0,23599,22304224-2,00.html. Retrieved 2009-10-25. [dead link]
  39. ^ "New restricted access arrangements". ACMA. http://www.acma.gov.au/WEB/STANDARD/pc=PC_310905. Retrieved 2009-10-25. 
  40. ^ a b "Learn - No Clean Feed - Stop Internet Censorship in Australia". No Clean Feed. http://nocleanfeed.com/learn.html. Retrieved 2009-10-25. 
  41. ^ "Danish Ministry of Culture Chooses SonicWALL CMS 2100 Content Filter to Keep Children's Libraries Free of Unacceptable Material". Prnewswire.com. http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/07-27-2006/0004404991&EDATE=. Retrieved 2009-10-25. 
  42. ^ "Danish Minister of Culture offers Internet filters to libraries". Saferinternet.org. http://www.saferinternet.org/ww/en/pub/insafe/news/articles/0606/dk.htm. Retrieved 2009-10-25. 
  43. ^ a b Satterfield, Brian (2007-06-04). "Understanding Content Filtering: An FAQ for Nonprofits". Techsoup.org. http://www.techsoup.org/learningcenter/ctc/page7091.cfm. Retrieved 2009-03-19. 
