MOSS and ECM Class Notes and Questions

For those folks who attended the MOSS and ECM class I delivered in Boston/Waltham, please find the notes below.

Class Questions

What do you do if you want to take administrative control over a large number of site collections?

Use a Web application policy to control access to all site collections; you can either grant or deny users/groups at the Web application level. Be aware that this may complicate troubleshooting of security and access-privilege problems, because administrators must know what the policy is doing.

MSIT Site Delete Capture failure

Ensure the account for the application pool serving the Web application has Full Control of the Web application, is a member of the Farm Administrators group, and is a member of either the local Administrators group or WSS_ADMIN_WPG on the WFEs.

How to manually delete user profiles

https://blogs.msdn.com/bgeoffro/archive/2008/03/18/moss-user-profile-delete-tool.aspx

Blog comment approval for anonymous comments (how can we leverage workflows?)

One option is to have an OWSTimer job examine the list where comments are stored; the timer job would wake up periodically and handle the notification and approval.

Implementing Kerberos with SharePoint

https://blogs.msdn.com/bgeoffro/archive/2008/02/11/adding-kerberos-ssl-to-central-administration.aspx

How to import/export lists with lookup columns

Most likely the list would export and import successfully, but it would throw an error until the lookup column was deleted.

If you have a combined query/index server and want to move the query role out to the WFEs, how do you do that? The error "Could not find a query server" was encountered.

Does the WFE have the query server bits installed? If not, this error can occur. Remove the WFEs from the farm one by one and add them back with the full bits installed.

If you have no Web front ends left and the database server dies, how do you restore if you have a SQL backup?

  1. Restore the content databases to the SQL Server.
  2. Acquire and install MOSS on new WFEs.
  3. Recreate the farm by running the configuration wizard.
  4. Complete farm configuration through Central Administration.
  5. Create corresponding Web applications and attach the existing content databases.
  6. Restore or recreate the SSP from its database(s) (if you do not have a full-fidelity SSP backup).
  7. Reindex (if required).
  8. Deploy customizations.

What is in the search database? What is the storage recommendation for the search database if the index is kept on the file system?

Logs, search configuration information, search properties and tabs, and metadata attributes about what is being crawled. The BDC will expand the database significantly because of the number of items and properties involved. The reason the search database needs more storage is that it houses a fairly denormalized representation of the managed and crawl properties of the index. The index is built from the properties and data in the search database and ends up with a smaller disk footprint on the server than the representation of its source in the search database.

Index search and optimization

We can control how much text the indexer will extract from a single document via registry keys on the index server, under the key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager:

MaxGrowFactor * MaxDownloadSize = maximum size of a file that can be indexed, in MB

MaxDownloadSize = 64 MB (default is 16 MB) with MaxGrowFactor = 4 allows the index filter to produce up to 256 MB (64 x 4) of text from a file. (The defaults of 16 MB * 4 yield 64 MB of text.)
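As a quick reference, here is a minimal sketch of that arithmetic in Python (the values are the ones quoted above rather than being read from the registry, and the function name is just illustrative):

```python
# Ceiling on the amount of text the index filter may produce from one file.
# MaxDownloadSize is expressed in MB; MaxGrowFactor is a plain multiplier.
def max_indexed_text_mb(max_download_size_mb: int, max_grow_factor: int) -> int:
    """Maximum text (in MB) that can be extracted from a single file."""
    return max_download_size_mb * max_grow_factor

print(max_indexed_text_mb(16, 4))   # defaults: 16 MB * 4 = 64 MB of text
print(max_indexed_text_mb(64, 4))   # raised download size: 64 MB * 4 = 256 MB of text
```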

Does content width in a list contribute to slowness of rendering?

It's not the width of the rows (although wide rows will slow page render times); the limitation is gathering the records from SQL, iterating through them, and rendering them. The key takeaway is that managing SQL performance bottlenecks is extremely important, and retrieving large numbers of rows can incur significant SQL cost.

Double size estimates for FT index because of index change

Master merge behavior: while MOSS is indexing, it creates several shadow indexes that are merged into the master index, because MOSS can write to smaller files more quickly than it can manage one large file. The master merge change didn't introduce any new requirements into the disk estimation and planning process.

From TechNet
  1. Estimate how much content you plan to crawl and the average size of each file. If you do not know the average size of files in your corpus, use 10 KB per document as a starting point.

    Use the following formula to calculate how much disk space you need to store the content index:

    GB of disk space required = Total_Corpus_Size (in GB) x File_Size_Modifier x 2.85

    where File_Size_Modifier is a number in the following range, based on the average size of the files in your corpus:

    • 1.0 if your corpus contains very small files (average file size = 1 KB).

    • 0.12 if your corpus contains moderate files (average file size = 10 KB).

    • 0.05 if your corpus contains large files (average file size = 100 KB or larger).

Note

This equation is intended only to establish a starting-point estimate. Real-world results may vary widely based on the size and type of documents being indexed, and how much metadata is being indexed during a crawl operation.

In this equation, you multiply Total_Corpus_Size (in GB) x File_Size_Modifier to get the estimated size of the index file. Next, you multiply by 2.85 to accommodate overhead for master merges when crawled data is merged with the index. The final result is the estimated disk space requirement.

For example, for a corpus size of 1 GB that primarily contains files that average 10 KB in size, use the following values to calculate the estimated size of the index file:

1 GB x 0.12 = 0.12 GB

According to this calculation, the estimated size of the index file is 120 MB.

Next, multiply the estimated size of the index file by 2.85:

120 MB x 2.85 = 342 MB

Thus, the disk space required for the index file and to accommodate indexing operations is 342 MB, or 0.342 GB.

https://technet.microsoft.com/en-us/library/cc262574.aspx
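To make that estimate concrete, here is a minimal Python sketch of the same calculation (the 2.85 master-merge overhead factor and the file-size modifiers come from the TechNet guidance quoted above; the function name is just illustrative):

```python
# Starting-point estimate of content index disk space, per the TechNet formula:
#   GB required = Total_Corpus_Size (GB) x File_Size_Modifier x 2.85
MASTER_MERGE_OVERHEAD = 2.85

# Modifier depends on the average file size in the corpus.
FILE_SIZE_MODIFIERS = {
    "small (~1 KB)": 1.0,
    "moderate (~10 KB)": 0.12,
    "large (>=100 KB)": 0.05,
}

def estimate_index_disk_gb(corpus_size_gb: float, file_size_modifier: float) -> float:
    """Estimated disk space (GB) for the index file plus master-merge overhead."""
    index_size_gb = corpus_size_gb * file_size_modifier
    return index_size_gb * MASTER_MERGE_OVERHEAD

# Worked example from above: a 1 GB corpus of ~10 KB files
# 1 GB x 0.12 = 0.12 GB index; 0.12 GB x 2.85 = 0.342 GB (~342 MB) of disk
print(round(estimate_index_disk_gb(1.0, FILE_SIZE_MODIFIERS["moderate (~10 KB)"]), 3))
```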

Downloadable book: Planning and Deploying Service Pack 1 for Microsoft Office SharePoint Server

Heather's Base Master Page File for SharePoint 2007 https://www.heathersolomon.com/blog/archive/2007/01/26/6153.aspx
Customizing the Content Query Web Part and Custom Item Styles https://www.heathersolomon.com/blog/articles/customitemstyle.aspx

Can you have multiple index servers in a farm? Is there a way to have an index server per SSP?

You can have one index server per SSP. The SSPs can even index each other. This would allow you to segregate the task of indexing large back-end systems for search (like a records center) while ensuring that you meet indexing SLAs on the rest of the farm.

If you have a workflow that submits documents to the records center, how do you prevent people from resubmitting? What are the ACLs around that?

Use a custom master page that replaces the existing ECB JavaScript with your own to hide the menu option for submitting to the records center. Alternatively, don't establish a farm-wide records center connection (so the menu item is never present), or use feature stapling to see whether the item can be removed.

How can we attach content types or metadata from the messages?

Use a workflow or event handler to assign content types to the item and then fill in the information using code.

How do incoming documents get mapped to a records management library? By source library content types. https://blogs.msdn.com/recman/archive/2006/09/12/750034.aspx
ASP.NET and Workflow threading models https://msdn2.microsoft.com/en-us/magazine/cc163623.aspx
Remote Blob Storage for MOSS/WSS SP1 https://blogs.msdn.com/toddca/archive/2007/08/06/windows-sharepoint-services-3-0-remote-blob-storage-api.aspx A feature implemented in SP1, it allows BLOBs to be stored on the file system instead of in the content database. A great storage option for those who have terabytes of data, but it is an API only; third-party vendors or ISVs are needed to manage the BLOBs between servers, etc.

Also, here is a list of links and resources that are relevant to the modules in the class:

  1. Supplemental links for content for MOSS and ECM Training
  2. Adding Kerberos and SSL to CA
  3. Branding sites and application pages
  4. SharePoint Capacity Planner
    1. https://www.microsoft.com/downloads/results.aspx?pocId=&freetext=SharePoint%20Capacity%20Planner&DisplayLang=en

Here are some supplemental limits to consider when planning your information architecture:

Site Collection: 2,000 subsites of any site is the recommended limit. The same content database is used for an entire site collection, which may affect the performance of operations like backup and restore.
Site: 2,000 libraries and lists is the recommended limit.
Library: 10,000,000 documents is the recommended limit; 2,000 items per view is the recommended limit.
Folder: 2,000 items per folder is the recommended limit.