Search limits
|Limit||Maximum value||Limit type||Notes|
|SharePoint search service applications||20 per farm||Supported||Multiple SharePoint search service applications can be deployed on the same farm, because you can assign search components and databases to separate servers. The recommended limit of 20 is less than the maximum limit for all service applications in a farm.|
|Crawl databases and database items||10 crawl databases per search service application; 25 million items per crawl database||Threshold||The crawl database stores the crawl data (time, status, and so on) for all items that have been crawled. The supported limit is 10 crawl databases per SharePoint search service application. The recommended limit is 25 million items per crawl database (or a total of four crawl databases per search service application).|
|Crawl components||16 per search service application||Threshold||The recommended limit is 16 total crawl components per search service application, with two per crawl database and two per server, assuming the server has at least eight processors (cores). The total number of crawl components per server must be less than 128/(total query components) to minimize propagation I/O degradation. Exceeding the recommended limit may not increase crawl performance; in fact, crawl performance may decrease based on available resources on the crawl server, database, and content host.|
|Index partitions||20 per search service application; 128 total||Threshold||The index partition holds a subset of the search service application index. The recommended limit is 20. Increasing the number of index partitions results in each partition holding a smaller subset of the index, reducing the RAM and disk space that is needed on the query server hosting the query component assigned to the index partition. The boundary for the total number of index partitions is 128.|
|Indexed items||100 million per search service application; 10 million per index partition||Supported||SharePoint Search supports index partitions, each of which contains a subset of the search index. The recommended maximum is 10 million items in any partition. The overall recommended maximum number of items (e.g., people, list items, documents, Web pages) is 100 million.|
|Crawl log entries||100 million per search service application||Supported||This is the number of individual log entries in the crawl log. It tracks the "Indexed items" limit.|
|Property databases||10 per search service application; 128 total||Threshold||The property database stores the metadata for items in each index partition associated with it. An index partition can be associated with only one property store. The recommended limit is 10 property databases per search service application. The boundary for the total number of property databases is 128.|
|Query components||128 per search service application; 64/(total crawl components) per server||Threshold||The total number of query components is limited by the ability of the crawl components to copy files. The maximum number of query components per server is limited by the ability of the query components to absorb files propagated from crawl components.|
|Scope rules||100 scope rules per scope; 600 total per search service application||Threshold||Exceeding this limit will reduce crawl freshness, and delay potential results from scoped queries.|
|Scopes||200 site scopes and 200 shared scopes per search service application||Threshold||Exceeding this limit may reduce crawl efficiency and, if the scopes are added to the display group, affect end-user browser latency. Also, display of the scopes in the search administration interface degrades as the number of scopes passes the recommended limit.|
|Display groups||25 per site||Threshold||Display groups are used for a grouped display of scopes through the user interface. Exceeding this limit starts degrading the scope experience in the search administration interface.|
|Alerts||1,000,000 per search service application||Supported||This is the tested limit.|
|Content sources||50 per search service application||Threshold||The recommended limit of 50 can be exceeded up to the boundary of 500 per search service application. However, fewer start addresses should be used, and the concurrent crawl limit must be followed.|
|Start addresses||100 per content source||Threshold||The recommended limit can be exceeded up to the boundary of 500 per content source. However, the more start addresses you have, the fewer content sources should be used. When you have many start addresses, we recommend that you put them as links on an HTML page and have the HTTP crawler crawl the page, following the links.|
|Concurrent crawls||20 per search service application||Threshold||This is the number of crawls underway at the same time. Exceeding this number may cause the overall crawl rate to decrease.|
|Crawled properties||500,000 per search service application||Supported||These are properties that are discovered during a crawl.|
|Crawl impact rules||100 per farm||Threshold||The recommended limit is 100 per farm. The limit can be exceeded; however, display of the site hit rules in the search administration interface is degraded. At approximately 2,000 site hit rules, the Manage Site Hit Rules page becomes unreadable.|
|Crawl rules||100 per search service application||Threshold||This value can be exceeded; however, display of the crawl rules in the search administration interface is degraded.|
|Managed properties||100,000 per search service application||Threshold||These are properties used by the search system in queries. Crawled properties are mapped to managed properties.|
|Mappings||100 per managed property||Threshold||Exceeding this limit may decrease crawl speed and query performance.|
|URL removals||100 removals per operation||Supported||This is the maximum recommended number of URLs that should be removed from the system in one operation.|
|Authoritative pages||One top-level page and as few second- and third-level pages as possible per search service application||Threshold||The recommended limit is one top-level authoritative page, and as few second- and third-level pages as possible to achieve the desired relevance. The boundary is 200 per relevance level per search service application, but adding more pages may not achieve the desired relevance. Add the key site to the first relevance level. Add more key sites at either the second or third relevance level, one at a time, and evaluate relevance after each addition to ensure that the desired effect is achieved.|
|Keywords||200 per site collection||Supported||The recommended limit can be exceeded up to the maximum (ASP.NET-imposed) limit of 5,000 per site collection given five Best Bets per keyword. If you exceed this limit, display of keywords on the site administration user interface will degrade. The ASP.NET-imposed limit can be modified by editing the Web.Config and Client.config files (MaxItemsInObjectGraph).|
|Metadata properties recognized||10,000 per item crawled||Boundary||This is the number of metadata properties that can be determined and potentially mapped or used for queries when an item is crawled.|
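The crawl and query component rows above are tied together by two arithmetic constraints: crawl components per server must stay below 128/(total query components), and query components per server are limited to 64/(total crawl components). The following is a minimal sketch of that arithmetic as a planning check; the function names are invented for illustration and are not part of any SharePoint API.

```python
# Sketch of the component-placement arithmetic from the table above.
# These helpers are hypothetical planning aids, not a SharePoint API.

def crawl_components_bound_per_server(total_query_components: int) -> float:
    """Crawl components on one server must stay below
    128 / (total query components) to minimize propagation I/O degradation."""
    return 128 / total_query_components

def query_components_bound_per_server(total_crawl_components: int) -> float:
    """Query components on one server are limited to
    64 / (total crawl components)."""
    return 64 / total_crawl_components

# Example: in a farm with 16 query components, keep the number of crawl
# components on any single server below 128 / 16 = 8.
print(crawl_components_bound_per_server(16))  # 8.0
print(query_components_bound_per_server(16))  # 4.0
```

Note that the table states the crawl-component bound as a strict inequality, so a server should host strictly fewer components than the computed value.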
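The "Start addresses" row recommends publishing many start addresses as links on a single HTML page and letting the HTTP crawler follow them, so that one content source needs only one start address. A minimal sketch of generating such a page follows; the addresses and function name are placeholders for illustration.

```python
# Sketch: emit one HTML page whose links are the crawl start addresses,
# so a content source can point at this single page instead of hundreds
# of start addresses. The URLs below are placeholders, not real sites.

start_addresses = [
    "http://contoso.example/site1",
    "http://contoso.example/site2",
    "http://contoso.example/site3",
]

def build_start_page(addresses):
    """Return an HTML page listing each address as a followable link."""
    links = "\n".join(
        f'<li><a href="{url}">{url}</a></li>' for url in addresses
    )
    return f"<html><body><ul>\n{links}\n</ul></body></html>"

html = build_start_page(start_addresses)
```

The crawler then discovers each linked site by following the anchors, keeping the content source itself within the recommended start-address limit.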