Monday, May 30, 2011

Event 8058. SharePoint 2010 Password Expiration

Topic: SharePoint 2010 Enterprise
Subject:  SharePoint 2010 Password Management
Problem: I keep getting this Event Log error: “The credentials used for the account domain\user expired on 12/05/06 12:48:50, and need to be updated.  If they are not updated, the system may stop working.  The account is used by…..”
Response: In SharePoint 2010, Managed Accounts can be set to generate new passwords automatically and to send a notification before a password expires. You can also change the account password manually. The service accounts I use within my development farms have an AD policy set for the password to never expire.  So why do I get an expiration notice?
You can view the options available (a PowerShell equivalent is sketched after the path):
Central Administration
-> Security
-> Configure managed accounts
-> <Edit one of your Managed Accounts>
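If you prefer PowerShell, the same settings can be listed from the SharePoint 2010 Management Shell. A minimal sketch; the SPManagedAccount property names used here (Username, AutomaticChange, PasswordExpiration) are from memory, so verify them with Get-Member on your farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# List each managed account with its automatic-change setting and next password expiration
Get-SPManagedAccount | Format-Table Username, AutomaticChange, PasswordExpiration -AutoSize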

The funny thing is that my password update information shows N/A for the next password change.  So who is sending the event saying my password has expired, and why?   The “who” is easy: it is a timer job which is enabled by default.  The “why” is the better question.  There is obviously an expiration policy, in days, that is being picked up by the timer job and telling us the password has expired. Out of curiosity I checked, and it lines up with the default expiration policy set in AD, even though the accounts themselves are flagged to never expire.
Solution:
To disable the daily event message if you choose not to use password management in SharePoint (a PowerShell alternative follows the steps):

1.      Central Administration -> Monitoring -> Review Job Definitions

2.      Scroll to find the job definition “Password Management”

3.      Double-click to edit the job

4.      Click Disable
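The job can also be disabled from PowerShell. A hedged sketch, assuming the job’s display name is exactly “Password Management” as it appears in Review Job Definitions:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the Password Management timer job by display name and disable it
$job = Get-SPTimerJob | Where-Object { $_.DisplayName -eq "Password Management" }
if ($job -ne $null) {
    $job.IsDisabled = $true   # stop the job from running on its schedule
    $job.Update()             # persist the change to the farm
}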

Conclusions:  Before I get comments along the lines of “this will make it extra secure”: most organizations already have password policies in place, and I am not sure I want SharePoint automatically generating passwords for me.  I fully understand the least-privilege arguments; this is simply how to disable the job if you choose not to implement password management through SharePoint.

KORITFW

Wednesday, May 25, 2011

FS4SP Full and Incremental Crawls and deleted items

Topic: SharePoint 2010 and FS4SP Enterprise Search and the SharePoint Crawler
Subject: SharePoint 2010 and FS4SP and the deletion of items from the index during full and incremental crawls.
Problem: I have switched from using the SharePoint 2010 index to FS4SP as my index.  Deleted items appear to take a different path when they are removed from a content source during full and incremental crawls.  Is this an issue?
This question could just as easily be stated as:
When indexed items are removed from a content source, the access URL does not appear to be deleted from the crawl database after a full or incremental crawl.
Response:
I hear a lot of questions regarding FAST and deletes not working.  The SharePoint crawler does appear to produce a slightly different end result when items are removed from the index through full or incremental crawls.   It would appear that against FS4SP there is an issue, but the real question is whether it has any deep negative effects.  In the solution\example section I will show, at a high level, how the items move around the crawl database and what to expect.
For the easiest comparison I will perform the steps against both a SharePoint Search Service Application and a FAST Content Search Service Application at the same time.  You can choose to perform all the steps individually.
Solution\Example:
1.      Choose or set up two Search Service Applications
a.      One SharePoint Search Service Application (example: SSA)
b.      One FAST Search Service Application (example: FSA)

2.      Perform an Index Reset on both SSAs

3.      Determine the crawl database associated with each SSA.  The crawl database can be obtained from the Search Administration topology for each SSA.

a.      Example: SSA – crawl database: SSACrawlDB
b.      Example: FSA – crawl database: FSACrawlDB


4.      Create a folder accessible to the Search Service Application(s)
a.      Place 5 documents within the folder (a quick way to generate these is sketched after this list)
b.      Example:
i.     1.doc
ii.    2.doc
iii.   3.doc
iv.    4.doc
v.     5.doc
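A quick way to generate the five test files from PowerShell; the share path is a placeholder, substitute your own:

# Create five small .doc test files on the share (path is a placeholder)
$share = "\\myserver\TestDelete"
1..5 | ForEach-Object { Set-Content -Path (Join-Path $share ("{0}.doc" -f $_)) -Value ("test document {0}" -f $_) }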

5.      In both Search Service Applications:
a.      Set up a new file share content source named “Test Delete”

6.      Execute a full crawl of the “Test Delete” content source for both Search Service Applications
a.      Execute the crawl log report jobs
i.     From CA -> Monitoring -> Review Job Definitions
1.      Find the timer job “Crawl Log Report for <your SSA>”
2.      Double-click to edit it and click “Run Now”
ii.    Repeat for both Search Service Applications
**Side note: The crawl log reports now run on timer jobs in SharePoint 2010.  Executing them manually ensures the “MSSCrawlUrlReport” table has been updated. (They can also be started from PowerShell, as sketched below.)
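These report jobs can also be kicked off from PowerShell rather than through Central Administration. A sketch that matches on the display name, which you should verify on your farm:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Run every "Crawl Log Report" timer job immediately (the equivalent of clicking "Run Now")
Get-SPTimerJob | Where-Object { $_.DisplayName -like "Crawl Log Report*" } | ForEach-Object { $_.RunNow() }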
7.      Open SQL Server Management Studio
a.      Open a query window and execute the following SQL:
SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlURL
SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlUrlReport
SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlDeletedURL
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlURL
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlUrlReport
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlDeletedURL
b.      You will get a record in the “MSSCrawlURL” and “MSSCrawlUrlReport” tables for each crawled item, in addition to the folders, for both SharePoint and FAST.

i.     In SharePoint you will also get additional records for anchor points. We will ignore the anchor points.
ii.    In SharePoint you will also get deleted records in the “MSSCrawlDeletedURL” table for the anchor points.  We will ignore these records as well.
8.      Test the index.
a.      Open a Search Center for SharePoint and perform a search for “doc”
i.     You should get back all 5 documents
b.      Open a Search Center for FAST and perform a search for “doc”
i.     You should get back all 5 documents

9.      Delete the document “5.doc” from the File Share Folder

10.   Execute an incremental crawl and run the “Crawl Log Report” jobs for both Search Service Applications.

11.   Repeat step #8
a.      Both indexes should be cleaned of “5.doc”
b.      In the case of SharePoint we could have queried the property database, but with such a small corpus it is just as easy to use a Search Center

12.   Optional
a.      Change the SQL to include a WHERE clause to look at less data and focus on the single item.

SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlURL
WHERE DisplayUrl LIKE '%5.doc'
SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlUrlReport
WHERE DisplayUrl LIKE '%5.doc'
SELECT * FROM [<Your SSA Crawl DB>].dbo.MSSCrawlDeletedURL
WHERE DisplayUrl LIKE '%5.doc'

SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlURL
WHERE DisplayUrl LIKE '%5.doc'
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlUrlReport
WHERE DisplayUrl LIKE '%5.doc'
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlDeletedURL
WHERE DisplayUrl LIKE '%5.doc'

13.   Re-query the SharePoint and FAST crawl databases
a.      The “MSSCrawlDeletedURL” table should now have a record reflecting “5.doc”
b.      The “MSSCrawlURL” table should have been updated to reflect:
i.     Column: “DeletePending=2”
ii.    Column: “ContentSourceId=-1”
c.      The “MSSCrawlUrlReport” table should have been updated to reflect:
i.     Column: “IsDeleted=1”
Everything seems to follow the same path at this point.  No surprises.

14.   So if the item has been deleted from the index, why do we still have an MSSCrawlURL record with “DeletePending = 2”?
a.      It turns out that the “MSSCrawlURL” record does not get removed until the 2nd crawl after the delete, which is probably why “DeletePending = 2”.

15.   Re-run the incremental crawl and the crawl log reports

16.   Check the SQL and you will find no change

17.   Re-run the incremental crawl and the crawl log reports for a third time

18.   Check the SQL and you will see where the divergence occurs
a.      After the 2nd incremental crawl following the delete, the “MSSCrawlURL” record gets removed from the SharePoint crawl database, but the record remains in the FAST crawl database.

19.   Is it an issue?
a.      Outside of the fact that, over time, the crawl database will become artificially inflated in its storage requirements compared to a SharePoint crawl database, the only way I can see it being an issue is if the deleted access URL record gets re-submitted on each incremental crawl. Let’s check.

20.   Add the following SQL query:
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlQueue
SELECT * FROM [<Your FSA Crawl DB>].dbo.MSSCrawlURL
WHERE DisplayUrl LIKE '%5.doc'

21.   There is a little timing involved in this test.
a.      Kick off an incremental crawl against the FAST content source
b.      Repeatedly execute the query until you find the “MSSCrawlQueue” table populated (a small polling sketch follows below)
c.      The “MSSCrawlQueue” table is the queue of items that are about to be crawled.

You won’t find the deleted item getting queued up.   I expected this, as “DeletePending=2” and “ContentSourceId=-1” prevent it from being re-crawled.
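If re-running the query by hand gets tedious, a small polling loop does the same thing. A sketch using System.Data.SqlClient; the server and database names are placeholders:

# Poll the MSSCrawlQueue table while the incremental crawl runs (names are placeholders)
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=mysqlserver;Database=FSACrawlDB;Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT COUNT(*) FROM dbo.MSSCrawlQueue"
for ($i = 0; $i -lt 60; $i++) {
    $count = [int]$cmd.ExecuteScalar()
    Write-Host ("{0:T}  MSSCrawlQueue rows: {1}" -f (Get-Date), $count)
    if ($count -gt 0) { break }       # queue populated; inspect it now
    Start-Sleep -Seconds 2
}
$conn.Close()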
22.   I have tested this using/against:
a.      Full crawls instead of incremental crawls, which behave the same
b.      No CUs installed
c.      The April 2011 CUs – same results


Conclusion:  As far as bugs go, this isn’t much of one to write home about.  On extremely large farms (200–300 million items) with a lot of content being deleted, such as from Exchange, in theory you could experience some sluggishness and an inflated storage requirement.  This could be rectified easily enough with a little testing and a SQL statement keyed on “ContentSourceId=-1” and “DeletePending=2”, as sketched below.
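For illustration only, such a cleanup might look like the sketch below. It touches the crawl database directly, which is unsupported by Microsoft, so treat it as hypothetical and try it only in a disposable environment (server and database names are placeholders):

# Hypothetical, unsupported cleanup of orphaned delete records in the FAST crawl DB.
# Back up the database and test in a non-production farm first.
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=mysqlserver;Database=FSACrawlDB;Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "DELETE FROM dbo.MSSCrawlURL WHERE ContentSourceID = -1 AND DeletePending = 2"
$deleted = $cmd.ExecuteNonQuery()   # number of rows removed
Write-Host "Removed $deleted orphaned records"
$conn.Close()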
For now we will leave it for what it is … an interesting lesson in how items move through the crawl database, with an eye on the extra storage requirements.
If anyone sees any different behavior I would be interested in hearing about it.

KORITFW

Tuesday, May 10, 2011

FS4SP and Configuration Error: An error occurred while installing Resource store

Topic: SharePoint 2010, FAST Search for SharePoint 2010 (FS4SP), Configuration Wizard, Resource store error
Subject:  Configuring FS4SP.
Problem: My FS4SP configuration wizard continually fails while configuring the resource store.  When I manually run the wizard in debug mode I can’t find the cause of the error.  My log shows errors like the following:
5/1/2011 2:57:01 PM Error InstallResourceStore - An error occurred while executing binary "F:\FASTSearch\bin\ResourceStoreInstaller.exe". Return code is not 0.
5/1/2011 2:57:01 PM Error Utility.WriteException - Exception -  : Exception - System.Management.Automation.RuntimeException: An error occurred while installing Resource store.
5/1/2011 2:57:01 PM Error Utility.WriteException - Exception Stack Trace -
5/1/2011 2:57:01 PM Error Utility.WriteException - Error Details -
5/1/2011 2:57:01 PM Error Utility.WriteException - Target Object -
5/1/2011 2:57:01 PM Error Utility.WriteException - Fully qualified error Id -
5/1/2011 2:57:01 PM Error Utility.WriteException - Exception -  : Exception - System.Management.Automation.RuntimeException: An error occurred while installing resource store.
5/1/2011 2:57:01 PM Error Utility.WriteException - Exception Stack Trace -
5/1/2011 2:57:01 PM Error Utility.WriteException - Error Details -
5/1/2011 2:57:01 PM Error Utility.WriteException - Target Object –

What is the problem?

Response: I have only run into this issue once and was not going to blog about it, as I doubt many people will hit it.  But last week I was in a meeting watching a presentation in which the last PowerPoint slide recommended 16–24 cores for a FAST server.  The problem turns out to be that the FS4SP configuration wizard will not succeed on a server with more than 16 cores.  FS4SP will run on more than 16 cores; you just can’t configure it on more than 16 cores.
Solution:
1.      Lower the number of cores on the server
2.      Run the Configuration Wizard
3.      Re-enable the cores.
Conclusion:
For VMs it is pretty easy to reconfigure the VM with fewer cores and restore them once the configuration wizard has run successfully.  For physical hardware you will need to reduce the cores through the BIOS and then re-enable them, or try the boot-configuration approach sketched below.
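On physical hardware, one possible alternative to a BIOS change is capping the boot-time processor count with bcdedit. I have not verified this against the wizard myself, so treat it as a hypothetical workaround:

# Check how many logical processors Windows currently sees
(Get-WmiObject Win32_ComputerSystem).NumberOfLogicalProcessors

# Cap Windows at 16 processors for the next boot, then run the wizard
bcdedit /set numproc 16
# ... reboot and run the Configuration Wizard ...
bcdedit /deletevalue numproc
# ... reboot again to restore all cores ...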

KORITFW

Thursday, May 5, 2011

Debugging the FS4SP Configuration Wizard

Topic: SharePoint 2010, FAST Search for SharePoint 2010 (FS4SP), Configuration Wizard, psconfig.ps1
Subject:  Manually running the FS4SP Configuration Wizard to configure and debug a FS4SP installation.
Problem: I am continually getting errors when running the FS4SP Configuration Wizard.  The log file produced does not give me enough information.  What can I do?
Response: The Microsoft FAST Search Server 2010 for SharePoint configuration wizard can be run manually with more verbose logging, which can help in identifying issues with an FS4SP installation. In the solution\example I will not cover all the details of an FS4SP installation, as TechNet does a good job of that; I will cover the highlights surrounding the configuration wizard.  The configuration wizard can be run manually on any FS4SP server in the farm, but I will focus on a single-server, stand-alone FS4SP farm.  The same principles can be applied for running the configuration on multi-node farms.
Solution\Example:
1.      It is very important to run installers “As Administrator” when installing software on Windows Server 2008

a.      Right-click on PrerequisiteInstaller and run as Administrator

b.      Install FS4SP
i.     Double-click fsserver.msi

ii.    Alternative to double-clicking fsserver.msi:
1.      Open a command prompt as Administrator
2.      Issue the command “fsserver.msi”
a.      This eliminates any questions regarding permissions if you believe the installation is failing due to permission issues on Windows Server 2008.

2.      Open the FAST Command Shell as Administrator

a.      Navigate to the <FAST Install Drive>\FASTSearch\install\scripts directory

b.      Use psconfig.ps1 to configure the FAST farm

c.      Side notes:
i.     Issue the following command, replacing the appropriate <> values
ii.    I split the command across lines for readability; it is a single-line command.

                      .\psconfig.ps1 -action i -roleName single 
                            -userName <yourdomain>\<fastuser>
                            -localMachineName <FAST2010server>.<yourdomain>
                            -databaseConnectionString <yoursqlserver>.<yourdomain>
                            -databaseName FASTAdminDB
                            -logFileName C:\FASTSearch\WizardLog.txt
                            -logLevelAsString Debug
                            -SharePointInstalledMode Basic
                            -SharePointServerName <SP2010server>.<yourdomain>
                            -SharePointUserIdentity <yourdomain>\<spuser>

                     Response to Command:

                     Password for user <yourdomain>\<fastuser> : *********
                     Self signed certificate password for FAST Search Server for SharePoint : *********
                     Please wait while Windows configures FAST Search Server for SharePoint.
                     Configuration may take several minutes...
                     WARNING: System cannot determine if the Firewall is on. Please make sure
                      it is turned on, in order to create IP Security rules.
                     ...........................You must now reboot to activate configuration

3.      Side notes:
a.      I did not provide any password options, so psconfig.ps1 prompted for them because they were not supplied.
b.      –SharePointInstalledMode [Basic/Advanced]: Basic is for a stand-alone SharePoint server, Advanced is for a farm

4.      I also changed the log level to Debug and provided a path to a log file

a.      When running the configuration wizard through the UI, the log level is Warning.

b.      Changing to the Debug level provides a full step-by-step record of all the configuration steps being performed

c.      With this level of debugging detail, not only can you see every step being performed, it is also much easier to find specific issues (a quick way to scan the log is sketched below).

d.      Once the issue is identified, you can open the psconfig.ps1 script and find the exact command being executed when the error occurred. Many times you can run that command individually, outside of the wizard. I have seen many debug sessions focused on non-issues; had the command been run individually, it would have been clear that it was not the piece of the puzzle that was broken.
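A quick way to pull the error lines and their surrounding context out of the debug log; the path matches the -logFileName used in the command above:

# Show each error in the wizard log with two lines of context on either side
Select-String -Path C:\FASTSearch\WizardLog.txt -Pattern "Error" -Context 2,2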

5.      After successfully debugging and configuring the FAST server, continue with the TechNet deployment instructions for deploying and securing the farm.

6.      psconfig.ps1 can be run on multi-node farms as well, for both admin and non-admin nodes.

a.      A deployment file will need to be created by hand, just as it would be through the wizard

b.      To obtain the usage definitions for psconfig.ps1, issue “psconfig.ps1” at the FAST command prompt without specifying any options.

Conclusions: Being able to run the configuration wizard manually with an increased log level greatly improves your chances of finding a configuration problem when the UI wizard fails.  As always, error messages aren’t necessarily clear, so being able to track the configuration at a detailed level can lead you to the exact point of failure.

KORITFW

Wednesday, May 4, 2011

Manipulating crawled properties in the FAST Search (FS4SP) pipeline

Topic: SharePoint 2010, FAST Search for SharePoint 2010 (FS4SP), processors.Basic Module, CustomerExtensibility, <load module="processors.Basic" class="AttributeCopy" />
Subject:  Manipulating crawled properties in the FS4SP pipeline.
Problem: I have a crawled property I want to manipulate.  What is the best way to do this?
Response: Any and all documentation will tell you that the “CustomerExtensibility” stage is the only way to do this. But what happens if the problem statement is: 
1.      The crawled property I need to manipulate is causing an issue in the pipeline before the “CustomerExtensibility” stage?

The answer is that yes, there are other methods to manipulate crawled properties, and you may be faced with an issue where the “CustomerExtensibility” stage will not work.  The alternative methods are not documented and are therefore deemed “Not Supported” by Microsoft.  If you have other issues later on, you will want to temporarily remove any “Not Supported” changes from your pipeline before calling support, or they may not help you, even though the technique relies on the exact modules Microsoft already uses in the pipeline.  
In the solution\example below I will use a real problem a customer was having, for which Microsoft didn’t have a solution.   I could have re-arranged the FS4SP pipeline to move the “CustomerExtensibility” stage up in the processing order and fixed the issue, but what would happen if I later needed “CustomerExtensibility” to perform crawled-property manipulation further down the pipeline, based on properties set by other processor stages?  I could probably duplicate the “CustomerExtensibility” stage and have two within my pipeline, but that seems like an awful lot of work to fix a simple issue. If you read my blogs on “Implementing the Windows 2008 TIFF IFilter and FAST Search for SharePoint 2010” (http://fs4sp.blogspot.com/2011/04/implementing-windows-2008-tiff-ifilter.html) and “FS4SP and User_Converter_Rules.xml” (http://fs4sp.blogspot.com/2011/04/fs4sp-and-userconverterrulesxml.html), you may run into this issue if you are using a custom crawler or protocol handler. The User_Converter_Rules.xml uses the crawled property “FileExtension”, which is set by the SharePoint crawler based on the access URL of the item being crawled. 

In this case the custom crawler is indexing a records-based system which contains attachments.  The attachment is a non-OCR’d TIFF.   The “User_Converter_Rules.xml” has been modified to use the Windows 2008 TIFF IFilter, which will OCR the TIFF file before indexing.
The custom crawler uses, as its access URL, a launch point on a custom ASPX page which displays the record and any associated attachments (in this case .tif).  As a result, our FileExtension crawled property is set to “ASPX”, and the User_Converter_Rules.xml will never match the true “TIFF” extension.
Solution/Example:
1.      The processors.Basic module functionality is used in several stages of the FS4SP pipeline.  It obviously works, so why not use it to resolve the issue.

2.      Edit the pipeline configuration
a.      In Windows Explorer, navigate to <FAST Search Install Drive>\FASTSearch\etc

b.      Edit pipelineconfig.xml

c.      Search for nodes in which “processors.Basic” is the load module (a quick way to search is sketched after the results)

Results:
<processor name="DocInit" type="general">
   <load module="processors.Basic" class="DocInit"/>
</processor>
<processor name="Sizer" type="general">
   <load module="processors.Basic" class="Sizer"/>
</processor>
….
<processor name="DocumentSecurityUnknown" type="general">
   <load module="processors.Basic" class="DefaultValue"/>
  
</processor>
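A quick way to find these nodes, assuming a default-style install path (adjust the drive letter to your install):

# List every line of the pipeline config that loads the processors.Basic module
Select-String -Path "D:\FASTSearch\etc\pipelineconfig.xml" -Pattern "processors.Basic"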

3.      The “processors.Basic” module is used in the processor stages “DocInit”, “Sizer”, and “DocumentSecurityUnknown”, via the classes “DocInit”, “Sizer”, and “DefaultValue” of the processors.Basic module.

4.      The “processors.Basic” module has 19 classes available.

5.      We will use <load module="processors.Basic" class="AttributeCopy" /> to solve this problem.

6.      Modify the FS4SP pipeline to implement a new processor stage
a.      In Windows Explorer, navigate to <FAST Search Install Drive>\FASTSearch\etc

b.      Edit pipelineconfig.xml

c.      In the <processors> node add:

<processor name="FixExtension" type="general">
   <load module="processors.Basic" class="AttributeCopy"/>
   <input>
      <attribute name="&lt;Input&gt;"/>
   </input>
   <output>
      <attribute name="&lt;Output&gt;"/>
   </output>
  <config>
        <param name="Input" value="2014D5E9-5DCB-43D0-BCC8-090D134A29F2:MYFILEEXTENSION:31" type="str"/>
        <param name="Output" value="0B63E343-9CCC-11D0-BCDB-00805FCCCE04:FileExtension:31" type="str"/>
        <param name="Attributes" value="" type="str"/>
  </config>
</processor>

d.      The “Input” value is the crawled property whose value replaces the “Output” crawled property’s value.  In this case “MYFILEEXTENSION” is a custom crawled property associated with my crawler, and “FileExtension” is the built-in crawled property populated by SharePoint and used by User_Converter_Rules.xml.

e.      Side note:  The FS4SP pipeline can be very particular about case.  If you are not 100% sure of your crawled-property attributes for the Input/Output parameters, run the crawl first with a Spy stage enabled and copy the values directly from your Spy trace into the Input/Output values.


7.      Add the new Stage to the Pipeline
a.      Navigate to the <pipeline name="Office14 (webcluster)" default="1">

b.      Modify the <!-- Document Conversion --> section of the pipeline to look like:
      <!-- Document Conversion -->
      <processor name="FixExtension"/>
      <processor name="AttachmentsHandler"/>
      <processor name="UTFDetectorConverter"/>
      <processor name="FastFormatDetector"/>
      <processor name="FormatDetector"/>
      <processor name="XMLMapper"/>
      <processor name="SimpleConverter"/>
      <processor name="PDFConverter"/>
      <processor name="IFilterConverter"/>
      <processor name="SearchExportConverter"/> 

c.      Optional: Add Spy stages around the new processor stage.
i.     I added them to show the results.
ii.    If you are unfamiliar with using the Spy stage for debugging, see my blog “Spy Stage in the FAST Search (FS4SP) pipeline” (http://fs4sp.blogspot.com/2011/04/spy-stage-in-fast-search-for-sharepoint_05.html)

     <processor name="Spy1"/>
     <processor name="FixExtension"/>
     <processor name="Spy2”/>
     <processor name="AttachmentsHandler"/>

d.      Save the changes and reset the pipeline (a backup-and-reset sketch appears after the results below)
i.     From the FAST Command Shell, as Administrator, issue:
1.      “psctrl reset”

e.      Crawl the content source

f.       Results from Spy1, which runs before the new “FixExtension” stage.  The “FileExtension” crawled property was set to the value “ASPX” based on the url property, which means the User_Converter_Rules.xml will not fire the custom TIFF IFilter converter.


#### ATTRIBUTE url <type 'str'>: http://myappserver/Record.aspx?ID=1
#### ATTRIBUTE 0B63E343-9CCC-11D0-BCDB-00805FCCCE04:FileExtension:31 <type 'str'>: ASPX
#### ATTRIBUTE 2014D5E9-5DCB-43D0-BCC8-090D134A29F2:MYFILEEXTENSION:31 <type 'str'>: tif

g.      Results from Spy2, which runs after the “FixExtension” stage. Note that the “FileExtension” crawled property now contains the value we need for the “User_Converter_Rules.xml” to identify the tif file.
#### ATTRIBUTE url <type 'str'>: http://myappserver/Record.aspx?ID=1
#### ATTRIBUTE 0B63E343-9CCC-11D0-BCDB-00805FCCCE04:FileExtension:31 <type 'str'>: tif
#### ATTRIBUTE 2014D5E9-5DCB-43D0-BCC8-090D134A29F2:MYFILEEXTENSION:31 <type 'str'>: tif
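
Because pipelineconfig.xml edits are easy to get wrong, it is worth backing up the file before editing and resetting the pipeline afterwards. A small sketch; the drive letter is a placeholder:

# Back up pipelineconfig.xml before editing (adjust the drive to your install)
$etc = "D:\FASTSearch\etc"
Copy-Item (Join-Path $etc "pipelineconfig.xml") (Join-Path $etc ("pipelineconfig.{0:yyyyMMddHHmmss}.bak" -f (Get-Date)))

# After saving the edits, tell the document processors to reload the configuration
psctrl reset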

Conclusion: Understanding the OOB pipeline processor stages and how they work can be very beneficial in solving problems. 

 
KORITFW