
Solution

Performance Metrics Sample

What is the best approach, and what are sample metrics, for estimating how long it will take to scan 1,000+ sites for broken links?

Factors affecting Job Performance

Please see:

Recommendations for Large Jobs

  • Use the recommendations in 'Factors affecting Job Performance' above.
  • Start with a small production site that has low visibility (not the Executive Office home page, for example (smile)) and copy it into a non-production environment to test. The 'test' sites should contain a good sample of the links you would find in other sites.
  • Get familiar with the Find & Replace behavior. Every Find & Replace URL 'find' string is compared against every link found, and every matching Link Replacement string is executed; the tool does not stop at the first match, it runs all matching replacement rules against EACH URL. This matters if you have two rules, such as replace http://oldserver and replace /oldpath/, where one or both may execute on a single URL. Contact us if you need support for this.
  • Keep a recent 'good' backup of your site before doing a find and replace.
  • Break jobs down into batches. For example, if you have 1,000 site collections with 60K subsites, fix links in batches, much as you would approach a large content migration job, e.g. 500 subsites per person per evening/weekend (or whenever you have scheduled to fix links).
  • We recommend no more than 600 scheduled jobs at the same time per machine/user. Running more than 600 jobs at the same point in time on the same machine may degrade performance due to conflicting XML file reads/writes (the tool queues the writes to the log files and to the job data). You can increase this limit by adding more machines or users to run the jobs.
  • NOTE: In our test case below, we created 600 site jobs.
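The Find & Replace behavior described above, where every replacement rule runs against each URL rather than stopping at the first match, can be sketched as follows. This is a hypothetical illustration, not the tool's actual code; the function and rule names are invented for the example.

```python
# Hypothetical sketch: every (find, replace) rule is applied to EACH URL,
# so more than one rule can rewrite the same link.
def apply_rules(url, rules):
    """Apply every matching replacement rule to the URL, in order."""
    for find, replace in rules:
        if find in url:
            url = url.replace(find, replace)
    return url

rules = [
    ("http://oldserver", "https://newserver"),  # rule 1: server rename
    ("/oldpath/", "/newpath/"),                 # rule 2: path rename
]

# Both rules match this single URL, and both execute:
print(apply_rules("http://oldserver/oldpath/page.aspx", rules))
# https://newserver/newpath/page.aspx
```

Because rules accumulate on each URL, order your find strings so an earlier replacement does not unintentionally create text that a later rule then matches.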

Our Test Environment Used (for large migration 'batches')

  • Hyper-V Windows 7 client machine, 8GB RAM, Intel quad-core CPU 2.6GHz
  • SharePoint Essentials Toolkit Enterprise Suite 2016 Edition v5.9.8.8
  • SharePoint 2013 Enterprise
  • 5 Site Collections with 600 subsites (total) tested (for large jobs, this would be a 'batch size' we could complete at one time on one machine)
  • Average of 500-800MB per site, 380GB total
  • 2 WFE & 1 APP server, 12GB RAM each
  • 1 SQL server, 32GB RAM
  • Number of files, items and links to parse:
      o Approx. 250,000 items (list items and files)
      o Approx. 100,000 URLs found in report
  • Style Library and other OOTB SharePoint lists included in job, non-customized

Test Results

For the batch of 600 sites/jobs (380GB) tested above, the job completed in 5.5 hours.
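Using the test result above, a rough throughput figure can be derived for sizing your own batches. This is a back-of-the-envelope sketch that assumes roughly linear scaling on comparable hardware; your environment may differ.

```python
# Rough throughput from the test above (assumption: roughly linear scaling).
sites = 600
hours = 5.5
sites_per_hour = sites / hours   # ~109 sites/hour
urls = 100_000
urls_per_hour = urls / hours     # ~18,000 URLs/hour

# Example: estimating one evening batch of 500 subsites at the same rate:
est_hours = 500 / sites_per_hour
print(round(sites_per_hour), round(est_hours, 1))  # 109 4.6
```

At this rate, a 500-subsite batch (as suggested in the recommendations above) would fit comfortably in an overnight window.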

NOTE

For large jobs, Sites and Lists can be scanned in parallel. You do not need to run them sequentially. Contact us or review the User Manual for more details on parallel, multi-threaded jobs.

Find Replace

This test did not replace any URLs. To factor in a Find/Replace of 20% of the links (with 4 replacement conditions), we would add roughly a 10% increase in time to complete, i.e. half the percentage of links found to fix.
