How would you go about testing the scalability of a Database & Web Application?

I have built an application for a client using Microsoft ASP.NET and an MS SQL database, accessed through the NHibernate ORM, and the client's DevOps department has asked me the following question:

“Do you have any tests which tell us how fast a database we need to handle the load of 10/100/1,000/10,000/100,000 users, etc., so we can scale it accordingly?”

Has anyone attempted this, or does anyone have experience with it?

The solution I am considering is to write an integration-style test that simulates the set of queries a standard user would execute against the application, measure the performance using NHibernate Profiler, and then run that standard-user test 10/100/1,000 times while keeping track of the profiler results.
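To make that concrete, here's a minimal sketch of the idea in C#. The base URL, the endpoints, and the "standard user" session are all hypothetical placeholders; substitute the requests your typical user actually makes:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class LoadSimulation
{
    static readonly HttpClient Client = new HttpClient();

    // One "standard user" session: the endpoints below are placeholders.
    static async Task<double> SimulateStandardUser(string baseUrl)
    {
        var sw = Stopwatch.StartNew();
        await Client.GetAsync(baseUrl + "/api/products");        // hypothetical endpoint
        await Client.GetAsync(baseUrl + "/api/products/42");     // hypothetical endpoint
        await Client.PostAsync(baseUrl + "/api/orders",
            new StringContent("{}", Encoding.UTF8, "application/json"));
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds;
    }

    static async Task Main()
    {
        const string baseUrl = "https://myapp-staging.example.com";  // hypothetical URL
        foreach (var userCount in new[] { 10, 100, 1000 })
        {
            // Run userCount simulated users concurrently and collect latencies.
            var latencies = await Task.WhenAll(
                Enumerable.Range(0, userCount).Select(_ => SimulateStandardUser(baseUrl)));
            Console.WriteLine($"{userCount} users: avg {latencies.Average():F0} ms, " +
                              $"max {latencies.Max():F0} ms");
        }
    }
}
```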

1 Like

It’s not an integration test you’re writing per se, although you’ll be doing many of the same things; you’re not testing functionality, you’re testing performance. And it has to run as close to a production environment as possible: you shouldn’t do this locally, with tests inside the application, as the results won’t be anywhere near accurate. The load should be applied externally once you’ve deployed to a staging environment or similar. If you’ve got a VS Enterprise licence, you can use the load tester via your Team Services account; see https://www.visualstudio.com/en-us/docs/test/performance-testing/run-performance-tests-app-before-release. I can’t comment on how good it is, as I’ve only used Tsung or basic load testers I’ve built myself. The DevOps guy at my last job used Gatling pretty heavily and seemed to think it was good.

4 Likes

If it’s not possible to duplicate your production machine, you can still use load testing to optimize your code, your SQL queries, or your SQL database setup (additional indexes, etc.). Establish a "before" baseline, tweak your code or query, and retest. This only gives you relative numbers, such as a 20% performance increase after optimization.
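A rough sketch of that baseline-and-retest loop, assuming a hypothetical connection string, query, and pre-recorded baseline figure:

```csharp
using System;
using System.Diagnostics;
using System.Data.SqlClient;

class BaselineCheck
{
    // Average wall-clock time of one representative query over many runs.
    static double TimeQueryMs(string connStr, string sql, int iterations = 100)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
            {
                using (var cmd = new SqlCommand(sql, conn))
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read()) { /* drain the result set */ }
                }
            }
            sw.Stop();
            return sw.Elapsed.TotalMilliseconds / iterations;
        }
    }

    static void Main()
    {
        const double baselineMs = 125.0;  // hypothetical figure recorded before tweaking
        double currentMs = TimeQueryMs(
            "Server=.;Database=AppDb;Integrated Security=true",       // hypothetical
            "SELECT TOP 100 * FROM Orders ORDER BY CreatedAt DESC");  // hypothetical
        double improvementPct = (baselineMs - currentMs) / baselineMs * 100;
        Console.WriteLine($"{currentMs:F1} ms/query, {improvementPct:F1}% vs baseline");
    }
}
```

The absolute numbers from a dev box are meaningless on their own; only the before/after percentage carries over.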

But if they want solid numbers on how many users your app can support, then the testing has to be done on the same hardware setup, or on your live server. Lots of factors affect the result: network speed, installed memory, separate or combined IIS/SQL servers, whether the same page or set of database records is hit versus wildly random records, transaction logging on or off, RAID type, disk speeds, SSD or not, Windows Server version, SQL Server version, firewall, etc.

1 Like

Thanks a bunch for your answers :smiley: I’ve seen the VS load testing tool before and I think it’ll be my first attempt. I will place a test DB and a test instance of the site’s Web API on Azure, run the load test, and see if I can get a rough estimate of the number of users it would take before the DB needs some sort of horizontal scaling (a rough ramp harness for that is sketched below).

@DanCouper Gatling looks promising, I must admit.
@owel I assume that if both sites are running on Azure with the same server setup, then the environments should be close enough to get an estimate.
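In case it's useful, here's a sketch of that ramp idea: keep multiplying the simulated user count until the average latency crosses a threshold. The URL and the threshold are assumptions, and a dedicated tool (the VS load tester, Gatling, JMeter) handles ramp-up, think time, and connection management far better than a hand-rolled harness:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class RampTest
{
    static readonly HttpClient Client = new HttpClient();

    // Time a single request's wall-clock latency.
    static async Task<double> TimedGet(string url)
    {
        var sw = Stopwatch.StartNew();
        await Client.GetAsync(url);
        return sw.Elapsed.TotalMilliseconds;
    }

    static async Task Main()
    {
        const string url = "https://myapp-staging.azurewebsites.net/api/ping"; // hypothetical
        const double thresholdMs = 500;  // assumed acceptable average response time

        // Increase concurrency by 10x each step, as in the 10/100/1,000 plan above.
        for (int users = 10; users <= 10000; users *= 10)
        {
            var latencies = await Task.WhenAll(
                Enumerable.Range(0, users).Select(_ => TimedGet(url)));
            Console.WriteLine($"{users,6} users -> avg {latencies.Average():F0} ms");
            if (latencies.Average() > thresholdMs)
            {
                Console.WriteLine($"Threshold exceeded around {users} users; consider scaling here.");
                break;
            }
        }
    }
}
```

Bear in mind that thousands of simultaneous requests from a single test machine will eventually bottleneck on the client itself (thread pool, outbound connection limits), which is one reason the dedicated tools distribute load across multiple agents.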

What confuses me about his question is that he would like me to tell him how fast a database he needs, and that isn’t really something I know how to measure. Is it the speed of a query relative to the size of the data, or perhaps the instance type in Azure?

On the Azure site they list this:

| Service Tier (Performance Level) | Users | Database Size |
| --- | --- | --- |
| Basic | 5 | 720 MB |
| Standard (S0) | 10 | 1 GB |
| Standard (S1) | 20 | 2.1 GB |
| Standard (S2) | 50 | 7.1 GB |
| Premium (P1) | 100 | 14 GB |
| Premium (P2) | 200 | 28 GB |
| Premium (P6/P3) | 800 | 114 GB |

Here the Standard (S0) level, for example, supports 10 users (i.e. 10 transactions per second).
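If "how fast a database" ends up meaning transactions per second, as in that table, one crude way to measure your own workload is to run a representative transaction in a loop for a fixed window and count completions. This sketch uses plain ADO.NET rather than NHibernate for simplicity; the connection string and statement are placeholders, and a single-threaded loop on one connection will understate what a tier can do under concurrency:

```csharp
using System;
using System.Diagnostics;
using System.Data.SqlClient;

class ThroughputTest
{
    static void Main()
    {
        // Hypothetical Azure SQL connection string.
        const string connStr =
            "Server=tcp:myserver.database.windows.net;Database=AppDb;User ID=app;Password=secret";
        var window = TimeSpan.FromSeconds(30);

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            int completed = 0;
            var sw = Stopwatch.StartNew();
            while (sw.Elapsed < window)
            {
                // One representative transaction; replace with your real workload.
                using (var tx = conn.BeginTransaction())
                using (var cmd = new SqlCommand(
                    "UPDATE Orders SET LastTouched = GETUTCDATE() WHERE Id = 1", // hypothetical
                    conn, tx))
                {
                    cmd.ExecuteNonQuery();
                    tx.Commit();
                }
                completed++;
            }
            Console.WriteLine($"{completed / sw.Elapsed.TotalSeconds:F1} transactions/sec on one connection");
        }
    }
}
```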

1 Like

Maybe you should check out JMeter too?

1 Like