
Pushing the Limits of SharePoint Storage: Getting Beyond 200GB

October 18, 2016 by David Drever | Backup\Restore, Configuration, Configuration (SP 2016), SharePoint, SharePoint 2013, SharePoint 2016

Recently at my client site (I have a lot of posts that start this way) we have been getting more and more requests from groups that want to bring larger amounts of data into SharePoint.  These requests are really pushing the limits of SharePoint's storage thresholds, so I started looking into ways we can get around that.  Our thought was that since Microsoft recently announced support for 25TB of data in SharePoint Online site collections, we should easily be able to handle the 4TB ceiling in our on-prem environment.

Update: I wrote another blog post on this topic where I go into greater detail on how to test whether your environment can go beyond the 200GB threshold, along with the results of a test I ran.  You can view that information here.

SharePoint Database Size Limits

The limitations of SharePoint’s content databases are pretty well documented here: https://technet.microsoft.com/en-CA/library/cc262787.aspx#ContentDB . But in a nutshell, you want to keep your content databases below 200GB.  The same document actually suggests splitting out your site collections if the content DB reaches more than 100GB, to allow for growth within the sites.
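If you want to see where your own databases sit relative to those numbers, a quick sketch from the SharePoint Management Shell (standard cmdlets, nothing exotic) looks something like this:

```powershell
# Quick check of current content database sizes (run in the SharePoint Management Shell).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPContentDatabase |
    Select-Object Name,
        @{Name = "SizeGB";    Expression = { [math]::Round($_.DiskSizeRequired / 1GB, 1) }},
        @{Name = "SiteCount"; Expression = { $_.CurrentSiteCount }} |
    Sort-Object SizeGB -Descending
```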

But what if it’s a single site collection within that database?  This means you should consider branching the site collection off into multiple site collections.  For example, create an archive site collection to house data that is no longer actively updated or used.  This will likely cut down on your data usage a great deal.  You will have to migrate the data in order to do it, but it is a necessary evil to save on space.
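If you go that route, a minimal sketch of carving an archive site collection into its own content database might look like the following; the database name and URLs are hypothetical placeholders:

```powershell
# Sketch: give the archive site collection its own content database, then move it there
# so the original database can drop back under the guideline. Names and URLs are hypothetical.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

New-SPContentDatabase -Name "WSS_Content_Archive" -WebApplication "http://sharepoint" | Out-Null

# Move-SPSite copies the site collection data between databases inside SQL Server.
Move-SPSite -Identity "http://sharepoint/sites/archive" `
            -DestinationDatabase "WSS_Content_Archive" -Confirm:$false

# An IISReset on each web server is required before the moved site collection is usable.
```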

Why the Database Limits?

For the most part the limits are based around the internal tools of SharePoint.  Microsoft states that some site collection actions, like moving a site collection or backing up and restoring, risk full database locks, which affects the other site collections in the content DB.  The operation could even fail outright.

Along the same lines, patching times can increase drastically with huge DBs, especially if the patch requires a DB modification.  This doesn’t happen often, but it has been known to occur.  So if you are applying a CU to your environment, you may need to be prepared for very long processing times.  And remember: the longer something takes to run, the greater the chance of failure, as other processes within your environment are still running and could affect the work you are doing.

What about Remote Blob Storage?

So while I was looking into pushing the limits of SharePoint storage, I did a lot of investigation into the concept of remote blob storage (RBS) and what it does for us.  While researching RBS solutions and what they could do for our environments, I found the terminology and information provided by some of the providers indicated you could exceed the 200GB limitation using their tools.  They didn’t actually come out and state their tool could do it, but the way it was worded gave my colleagues and me that impression.  I can’t stress this enough: RBS solutions do NOT allow you to break the recommended storage ceiling.

What do they do?  Well, I am glad you asked.  Most often, because of the use and type of data, DB drives are put on faster, more expensive drive hardware.  What RBS allows you to do is take the bulk of that data and move it to storage that is fast enough, but not as expensive as the DB drives.  It doesn’t break the ceiling; it just provides another method of storing the same data more cheaply (though the cost may even out, since you are now paying for the RBS software and the support hours instead of just the hardware).  However, when backing up, restoring and\or patching the environment, the same issues are going to occur.  The process is still going to pull the data from RBS into your backup location, and you have the added complexity of a middleware layer in place while trying to patch.  RBS solutions also tend to really complicate things because the data is no longer located in one place.  That means plain SQL backups are not going to cut it: a SQL backup will capture the data still in the content DB and the metadata for the RBS locations, but it will NOT save the actual blob data being stored outside the content DB.  This means you need to use the vendor’s software to perform your backups.
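For reference, here is a minimal sketch that simply inspects whether RBS is installed and enabled on a content database; the database name is hypothetical, and actually installing a provider is a separate, vendor-specific exercise:

```powershell
# Sketch: inspect the RBS settings on a content database. This only reads the current
# configuration; installing and enabling an RBS provider is a separate task.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$cdb = Get-SPContentDatabase "WSS_Content"        # hypothetical database name
$rbs = $cdb.RemoteBlobStorageSettings

"RBS provider installed : $($rbs.Installed())"
"RBS enabled            : $($rbs.Enabled)"
"Active provider        : $($rbs.ActiveProviderName)"
```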

Pushing the Limits of SharePoint Storage

So what if you can’t split out your site collections, or the data in that single site collection can’t be archived or split out somewhere else?  This is the question that was tasked to me in our environment.  The first thing you need to do is speak with your middleware team, or whatever your data storage team is called.  Explain the need to attain at the very least 0.25 IOPS per GB, up to the recommended 2 IOPS per GB, within the disk system.  Determine whether the disk you are on, or the disk you could be on, supports that level of throughput.  In our case, our data center was believed to not only meet but exceed the recommendations, and this was with the cheap disk.  Because they didn’t have the space I needed to test on it (I asked for 10 TB), they gave me 5 TB on the faster, fibre channel disk instead.
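The arithmetic behind that conversation is simple enough to sketch; the 2,300 GB figure below is just an example:

```powershell
# Back-of-the-envelope IOPS targets for a projected content database size (example value).
$dbSizeGB        = 2300                  # projected content DB size in GB
$minimumIops     = $dbSizeGB * 0.25      # bare minimum: 0.25 IOPS per GB
$recommendedIops = $dbSizeGB * 2         # recommended:  2 IOPS per GB

"Minimum IOPS required:     $minimumIops"
"Recommended IOPS required: $recommendedIops"
```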

My initial test was uploading enough files into a single SharePoint library to push the content DB to about 750 GB.  At that point, interactions with the site were still normal, and you couldn’t tell (other than the list view threshold) that there was that much data in the site collection.  That was the case until I started a site collection backup via PowerShell.  The backup took 40 hours to complete.  This is because you are doing the backup through SharePoint, which was designed to pull the data out of the content DB, not to back up the DB itself, and that adds a ton of overhead.  It illustrates Microsoft’s concern around backups of this size: they take a long time, and a lot of things can go wrong in that time frame.  That was further supported when the backup failed three times, until I figured out our VM backups “stunned” the SharePoint VM very briefly when completing the image backup, which broke the network connection and failed the process.  A perfect example of what Microsoft was warning about with large environments.  However, once the VM was removed from the image backup process, the backup and restore worked just fine (if you consider a 40-hour run acceptable).  The restore took just about as long.
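For context, the operation I’m describing is the standard site collection backup and restore; a minimal sketch (the site URL and backup path are hypothetical) looks like this:

```powershell
# Sketch of the site collection backup and restore described above.
# Expect this to run for a very long time against large site collections.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Backup-SPSite -Identity "http://sharepoint/sites/bigdata" `
              -Path "\\backupserver\SPBackups\bigdata.bak"

# Restoring the same package; -Force overwrites an existing site collection at that URL.
Restore-SPSite -Identity "http://sharepoint/sites/bigdata" `
               -Path "\\backupserver\SPBackups\bigdata.bak" -Force -Confirm:$false
```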

My recommendation is to completely avoid site collection backups, site exports and list exports when dealing with large data sets (at least in your production environment).  Pulling a production system out of its backup rotation for that long is a bad idea.  If you are looking at 40+ hours for a backup at 750 GB, think what a backup at 2+ TB will be like.  That’s a long time for your production environment to go without a backup.

Instead, do your backups via SQL Server DB backups.  Here’s why:

After testing the 750 GB DB via SharePoint backups, my intention was to attempt a SQL Server DB backup and restore.  However, the day I was going to start, I found out they were taking back the space granted for testing within the next few days.  So over the weekend I dumped a great deal more data into SharePoint and grew the content DB to 2.3 TB.  Again, the environment appeared to be responding fine.  This time the SQL backup took only 6.5 hours, oodles faster than a SharePoint backup, and that was backing up to a network location, not a local drive.  Unfortunately, I lost the drive space before I could fully test the restore process.
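A minimal sketch of that SQL-side backup, assuming the SqlServer (or older SQLPS) PowerShell module is available; the server, database and path names are hypothetical, and a plain T-SQL BACKUP DATABASE … TO DISK statement does the same job:

```powershell
# Sketch of the SQL-side backup of a SharePoint content database.
# Server, database and UNC path names below are hypothetical.
Import-Module SqlServer

Backup-SqlDatabase -ServerInstance "SQLSERVER01" `
                   -Database "WSS_Content_BigData" `
                   -BackupFile "\\backupserver\SQLBackups\WSS_Content_BigData.bak" `
                   -CompressionOption On
```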

Further Testing Required

I had a number of tests remaining.  Perhaps someone else who has the space would like to take up where I left off:

  • Restore environment at 2+TB (Backup was successful)
  • Move DB usage to 4 TB and perform backup and restore testing
  • When at the max ceiling perform the following tests
    • Create a new sub site.  This appeared to take a long time at 750GB.  I think it had to do with moving data around within the DB.
    • Create new lists and libraries and put some data in them.
    • Using stress-testing software, pound the heck out of the system (impersonating multiple users) by performing the following (a rough upload sketch follows this list):
      • Add files
      • Edit files
      • Delete files
      • Open files
      • Update metadata
      • Run Workflows
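Here is the rough, single-user upload sketch mentioned above for the “Add files” portion; the site URL and library name are hypothetical, and a real stress test would drive this through proper load-testing tooling with multiple impersonated users:

```powershell
# Rough single-user sketch: bulk-load dummy documents into a library to grow the content DB.
# Run on a SharePoint server; site URL and library name are hypothetical.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web     = Get-SPWeb "http://sharepoint/sites/loadtest"
$library = $web.Lists["Documents"]
$payload = New-Object byte[] (10MB)          # 10 MB of zeroes per file

for ($i = 1; $i -le 1000; $i++) {
    $url = "$($library.RootFolder.Url)/LoadTest_$i.bin"
    $library.RootFolder.Files.Add($url, $payload, $true) | Out-Null   # $true = overwrite
    if ($i % 100 -eq 0) { Write-Host "Uploaded $i files" }
}

$web.Dispose()
```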

Conclusions I Have Reached So Far

It is obvious that you can reach extreme levels of content within your SharePoint environment, but certain controls and processes have to be put in place.  This is what Microsoft is talking about in the document I linked to at the beginning of this post.  You have to plan for long backups and restores.  You have to ensure you don’t have processes running that will kill these backups\restores mid-run.  You have to have a plan for how you are going to handle the data once it gets that large.  What about disasters?  Do you have offsite storage for these backups?  While my tests show that it is possible to reach the extremes of data storage, you really have to make sure that you have ALL your T’s crossed and I’s dotted, because if something happens and you can’t handle the data, you have just lost an unfathomable amount of information.

If I ever get the drive space back to perform more testing (I am certainly trying to), I will post a follow-up and make more concrete recommendations on moving forward.

 

Thanks for reading!


Comments


Manoj
April 10, 2017 at 11:41

Thanks for the details; it will definitely save time to see the results of all this wonderful testing.

murali
April 11, 2017 at 04:42

Hi,
Did you re-test with a higher amount of data?

David Drever
April 11, 2017 at 07:08

Not yet. We are planning on testing next month. Hopefully I can get the teams to go ahead with my plans.

Rachid
May 2, 2017 at 01:41

Thanks for sharing your experience!
We have an archiving site collection that has reached 400GB with one million PDF documents.
We thought that everything would go fine, but we experienced trouble with crawling.
A full crawl takes up to 20 hours, and the following incremental crawl then takes an indecent amount of time (a couple of hours) because it re-discovers several hundred thousand documents “missed” by the full crawl.
I see no testing around crawling in your article, which is why I’m sharing our experience.

David Drever
May 2, 2017 at 07:38

Hi Rachid,

This is great input! Thank you. It’s funny you mention this. I finally received the space and approval to test again. This was something I wanted to test during this phase. Which version of SharePoint are you on?

Dave

Rachid
May 2, 2017 at 07:42

Hi again,
We are on 2013. I cannot give you more details as I’m not a great technical expert on SharePoint.
SharePoint is heavily used in my company, so I believe we are up to date with the updates (except 2016, of course).

David Drever
May 2, 2017 at 07:45

I am also going to be testing with SP2013. I’ll let you know how I make out. Probably will be at least a month till I can get testing complete and update my blog.

Take care!

SharePoint 2016 Content DB size limit for general use

(TechNet forum thread: SharePoint > SharePoint 2013 – General Discussions and Questions)
Question – Monday, January 23, 2017 8:47 AM

Hi All,

My understanding is that the SharePoint 2016 content DB size limit for general use was increased to 1TB.

But on the Microsoft website, Software boundaries and limits for SharePoint Server 2016, the information shows the SharePoint 2016 content DB size limit for general use is still 200GB.

I want to confirm which value is correct, and whether any documentation supports it.

Thanks a lot

All replies

Reply – Romeo Donca, Orange Romania (MCSE, MCITP, CCNA) – Monday, January 23, 2017 9:20 AM (proposed as answer by croute1)

Hi, both 😉

200GB is the recommended value.

Up to 4TB is possible, based on local scenarios, topology and hardware resources.

Reply – Tuesday, January 24, 2017 1:13 AM

Thanks Romeo,

Do you have any information on why Microsoft announced 1TB at most of the official conferences, but kept 200GB when 2016 released?

Reply – Trevor Seward, Office Servers and Services MVP, author of Deploying SharePoint 2016 – Wednesday, January 25, 2017 4:49 PM

It’s an old recommendation that really isn’t applicable in today’s world with fast drives. There’s no reason to stick with 200GB.


 
