Wednesday, February 07, 2018

Sizing MSSQL Server DBs - Rules of Thumb


Microsoft Technet has published an article that includes some rules of thumb for sizing a SQL Server system.
NOTE: These are for SharePoint.
ANOTHER NOTE :) These are rules of thumb, not hard-and-fast recommendations.
Still, I often find myself called upon to make such recommendations in the absence of proper analysis, so I wanted to record some of these for future reference.

TechNet article - Rules of Thumb for MSSQL Server Sizing (SharePoint)

ESTIMATING MEMORY REQUIREMENTS

NOTE: SQL Server 2016 Standard can use up to 128 GB of RAM for the buffer pool. Because RAM is cheap, doesn't impact licensing, and vastly improves MSSQL Server performance, you will find nearly any source recommending you max out the RAM that is supported by your SQL Server edition. The numbers below are useful for estimating CPU cores, which are generally less important for OLTP systems and generally DO impact licensing.
Combined size of content databases → RAM recommended for the computer running SQL Server:

Minimum for small production deployments: 8 GB
Minimum for medium production deployments: 16 GB
Recommendation for up to 2 TB: 32 GB
Recommendation for 2 TB to 5 TB: 64 GB
Recommendation for more than 5 TB: additional RAM over 64 GB can improve SQL Server caching speed
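The table above boils down to a simple lookup. Here's a minimal sketch (a hypothetical helper, not from the TechNet article) that returns the recommended RAM for a given combined content database size:

```python
def recommended_ram_gb(content_db_tb):
    """RAM rule of thumb from the table above.

    content_db_tb: combined size of content databases, in terabytes.
    Returns recommended RAM in GB. Note the small/medium production
    minimums (8/16 GB) still apply as floors for tiny deployments.
    """
    if content_db_tb > 5:
        return 64  # beyond 5 TB, RAM over 64 GB mainly improves caching
    if content_db_tb > 2:
        return 64  # 2 TB to 5 TB range
    return 32      # up to 2 TB
```

For example, `recommended_ram_gb(1.5)` gives 32, while `recommended_ram_gb(3)` gives 64.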

CPU

We then figure 1 CPU core for every 4 GB RAM, so a 16 GB system would have 4 CPU cores. 32 GB = 8.
Of course, this varies by app! If your app does a lot of in-db processing, or you have scads of concurrent users, you'll want to increase that figure. Our app is a 3-tier system and does most processing (exception: sorts) in the middle tier, so CPU on the DB is very nearly never our bottleneck. If your app depends more on the DB, consider a 1 CPU core / 2 GB RAM ratio.
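The core-count rule above can be sketched as a one-liner (a hypothetical helper; the 4 GB-per-core default and the 2 GB-per-core DB-heavy variant are the ratios described in the text):

```python
def recommended_cores(ram_gb, gb_per_core=4):
    """1 CPU core per 4 GB RAM by default; pass gb_per_core=2
    for apps that lean heavily on the database tier."""
    return max(1, ram_gb // gb_per_core)
```

So a 16 GB box gets `recommended_cores(16)` → 4 cores, a 32 GB box gets 8, and a DB-heavy app on 32 GB would use `recommended_cores(32, gb_per_core=2)` → 16.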


DISK SIZE

Of course, there's not much I can say about sizing your disk, excepting some broad generalizations about the broad generalizations you need to make :-)
If you can determine the base things that have the most impact on your db size, you can extrapolate from them.
For instance, if you have an insurance Policy Administration System, you might look at a test environment, count the number of policies, and compute an MB-per-policy ratio.
Inventory items, sales transactions, or song records are all examples of the "main things" that will drive your db growth, with other tables (such as, say, a Users table for a smaller installation) being ancillary or lost in the noise.
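The extrapolation described above is just a ratio. A minimal sketch, using the insurance example (all names and numbers here are illustrative, not from any real system):

```python
def projected_db_size_mb(test_db_mb, test_item_count, projected_item_count):
    """Extrapolate production DB size from a test environment.

    test_db_mb:           measured size of the test database, in MB
    test_item_count:      count of the 'main thing' (e.g. policies) in test
    projected_item_count: expected count of that thing in production
    """
    mb_per_item = test_db_mb / test_item_count
    return mb_per_item * projected_item_count
```

E.g., a 5,000 MB test database holding 1,000 policies gives a 5 MB/policy ratio, so 100,000 production policies project to roughly 500,000 MB (~500 GB) - a SWAG, but a defensible one.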

CONCLUSION

So, we end up with a SWAG on the storage size, which leads us to a rule of thumb on memory, and another rule of thumb on CPU.

