In many countries, including the United States, search providers are obligated to respond to claims from rights holders about unauthorized posting, distribution, or other publication of protected content. The international community recognizes that such unauthorized publication can infringe on the rights of content owners and has fashioned both international treaties and local laws to address the matter. Pursuant to these laws, and in support of our own policies encouraging respect for intellectual property, we might remove certain displayed search results from our index upon notice from rights holders.
Sadly, the abuse of children is not new, but the internet affords a number of new opportunities to those who would commit crimes against children, including trafficking in images of sexual abuse. Bing works with law enforcement and other authorities to help stop the flow of this content online. One of the ways that we do this is by removing displayed search results that have been reviewed by credible agencies and found to contain or relate to the abuse of children.
We remove these types of displayed search results only when we're confident that the government or quasi-governmental agency providing the links:
Bing doesn't control the operation or design of the websites we index. We also don't control what these websites publish. As long as the website continues to make the information available on the web, the information will be generally available to others through Bing or other search services.
Here are the ways that Bing does this and when.
In limited cases, where relevant laws and/or public policy concerns address issues such as privacy, intellectual property protection, and the protection of children, we might remove a displayed search result of a particular resource. In each case, we try to limit our removal of displayed search results to a narrow set of circumstances so that we don't overly restrict Bing users' access to relevant information.
From time to time, webpages that are publicly available will intentionally or inadvertently contain private information that is posted without the consent of the individual identified or in circumstances that create security or privacy risks. Examples include inadvertent posting of public records, private phone numbers, identification numbers and the like, or intentional posting of email passwords, login credentials, credit card numbers, or other data that is intended to be used for fraud or hacking.
If the information has already been removed from that website but is still showing up in Bing displayed search results, you can request that we remove the information by using our content removal request form.
Bing automatically scans (or crawls) the internet to develop and maintain an index, which it uses to generate and display a set of search results (the displayed search results). The index is essentially a catalog of available online resources, including websites, images, videos, documents, and other items. Particular displayed search results are created by using a computer algorithm to match the search terms you enter with results in our index. In general, we try to provide as comprehensive and as useful a collection of displayed search results as we can. We design our algorithms to provide the most relevant and useful results and to determine which displayed search results appear for any given search.
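As a rough sketch of the index-and-match idea described above (and not a description of Bing's actual crawler or ranking algorithms, which are far more sophisticated and not public), a search index can be thought of as an inverted index: a map from each term to the resources that contain it. The Python example below builds such an index from a few hypothetical pages and matches a query against it with a very simple term-frequency score; all page names and scoring choices here are illustrative assumptions.

# Minimal, illustrative sketch of "index then match" (not Bing's implementation).
from collections import defaultdict

def build_index(documents):
    """Map each term to the set of document ids that contain it (an inverted index)."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, documents, query):
    """Return documents containing all query terms, ranked by simple term frequency."""
    terms = query.lower().split()
    if not terms:
        return []
    # Intersect the posting lists for each query term.
    matches = set.intersection(*(index.get(t, set()) for t in terms))
    # Score each match by how often the query terms appear in it.
    def score(doc_id):
        words = documents[doc_id].lower().split()
        return sum(words.count(t) for t in terms)
    return sorted(matches, key=score, reverse=True)

if __name__ == "__main__":
    docs = {  # hypothetical pages, for illustration only
        "page1": "How search engines crawl and index the web",
        "page2": "An index is a catalog of websites images and videos",
        "page3": "Cooking recipes for the weekend",
    }
    idx = build_index(docs)
    print(search(idx, docs, "index web"))  # -> ['page1']

A real search service layers many more signals on top of this kind of matching (freshness, quality, language, and so on), which is why the article speaks of algorithms that determine which displayed search results appear for a given search.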
Bing categorizes certain countries as strict markets. In these strict markets, we might restrict the display of adult content (as locally defined), and because of local customs, norms, and laws, we might limit SafeSearch settings to strict only. When set to strict, SafeSearch filters the display of explicit search results in images, videos, and text. Markets that are limited to strict include:
Bing doesn't control the sites that publish this information or what they publish. Most of the time, the website is in the best position to address any privacy concerns about the information it publishes. As long as the website continues to make the information available on the web, the information will be available to others. Once the website has removed the information and we have crawled the site again, it will no longer appear in our results.
Provides some measure of recourse (like the ability to appeal) if content or sites hosting such content are blocked incorrectly.
Some pages captured in our index turn out to have little or no value to users, or have characteristics that artificially manipulate the way search and advertising systems work in order to distort their relevance relative to pages that offer more relevant information. Some of these pages include only advertisements and/or links to other websites that contain mostly ads, with no or only superficial content relevant to the subject of the search. To improve the search experience for consumers and deliver more relevant content, we might remove such displayed search results, or adjust our algorithms to prioritize more useful and relevant pages in displayed search result sets.
Similarly, countries around the world have adopted laws and procedures to address defamation, libel, slander, and other harms related to false statements that are made or implied to be fact and which might yield a negative perception about an individual, business, or other organization. We may remove displayed search results containing allegedly defamatory content. For example, we might remove a displayed search result if we receive a valid and narrow court order indicating that a particular link has been found to be defamatory.
Bing recognizes that the rights of content owners exist alongside the rights of users and that creativity and innovation online should be encouraged. To this end, Bing has helped develop a set of principles with respect to user-generated content applications (some of which generate links that we catalog in our search service). Learn more about those principles at . We also review counter-notices that comply with the Digital Millennium Copyright Act sent from parties who wish to object to the removal of their content.
Some countries maintain laws or regulations that apply to search service providers and require us to remove access to certain information that Bing has indexed, primarily for geopolitical reasons or because of local cultural norms and sensibilities. We must balance our support for freedom of access to information by people of all countries with the compliance required to offer search services in a specific jurisdiction. When a governmental entity asks us to remove displayed search results, we require proof of the applicable law, proof of the authority of the government agency, and an official request requiring removal. If such proof is provided and we can verify it, then we may comply with the removal request. If we are required to implement the request, we will do so narrowly. If the removal request is inconsistent with international standards, we might seek clarification to further narrow our obligation to comply.
Bing offers SafeSearch settings, which allow most users to set the type of filtering of adult content that they would like applied to their search results. By default, in most markets all searches are set to moderate, which restricts visually explicit search results but does not restrict explicit text. Because of local customs or cultural norms, certain countries may impose legal restrictions on the display of adult content. As a result, what constitutes adult content might vary depending on the market.
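Purely as an illustration of how SafeSearch-style levels (strict, moderate, off) might translate into filtering behavior, here is a small Python sketch. The level names follow the article; the market list, field names, and filtering logic are hypothetical assumptions for the example, not Bing's implementation.

# Illustrative sketch only: SafeSearch-style levels mapped to filtering rules.
from dataclasses import dataclass

SAFESEARCH_LEVELS = {
    "strict":   {"filter_explicit_images": True,  "filter_explicit_text": True},
    "moderate": {"filter_explicit_images": True,  "filter_explicit_text": False},
    "off":      {"filter_explicit_images": False, "filter_explicit_text": False},
}

# Hypothetical configuration: some markets allow only the strict setting.
STRICT_ONLY_MARKETS = {"example-market"}  # placeholder, not a real market list

@dataclass
class Result:
    url: str
    has_explicit_image: bool
    has_explicit_text: bool

def effective_level(market: str, requested: str) -> str:
    """Strict-only markets override the user's requested setting."""
    return "strict" if market in STRICT_ONLY_MARKETS else requested

def filter_results(results, market, requested="moderate"):
    """Drop results that the effective SafeSearch level says to filter."""
    rules = SAFESEARCH_LEVELS[effective_level(market, requested)]
    kept = []
    for r in results:
        if rules["filter_explicit_images"] and r.has_explicit_image:
            continue
        if rules["filter_explicit_text"] and r.has_explicit_text:
            continue
        kept.append(r)
    return kept

The default of "moderate" in the sketch mirrors the article's statement that most markets default to moderate, which filters visually explicit results but not explicit text.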
In particular, we remove from our displayed search results links that have been identified by the Internet Watch Foundation (UK), NCMEC (USA), or FSM (Germany) as, in their good-faith judgment, hosting or providing access to child abuse content that is illegal under the laws of their jurisdiction. Removing these links from displayed search results doesn't block them from being accessed on the internet or discovered through means other than Bing, but it does reduce the ability of those who would seek out child abuse content to find it and reduces the extent to which sellers of such content can profit from it.