
I created a trial account on Azure, and I deployed my database from SmarterAsp.

When I ran a pivot query on SmarterAsp\MyDatabase, the results appeared in 2 seconds.

However, running the same query on Azure\MyDatabase took 94 seconds.

I use SQL Server 2014 Management Studio (trial) to connect to the servers and run the queries.

Is this difference in speed because my account is a trial account?

Some information related to my question:

The query is:

ALTER procedure [dbo].[Pivot_Per_Day]
@iyear int,
@imonth int,
@iddepartment int

as

declare @columnName Nvarchar(max) = ''
declare @sql Nvarchar(max) =''

select @columnName += quotename(iDay) + ','
from (
        Select day(idate) as iDay
        from kpivalues where year(idate)=@iyear and month(idate)=@imonth
        group by idate
        )x

set @columnName=left(@columnName,len(@columnName)-1)

set @sql ='
Select * from (
select kpiname, target, ivalues, convert(decimal(18,2),day(idate)) as iDay
from kpi
inner join kpivalues on kpivalues.idkpi=kpi.idkpi
inner join kpitarget on kpitarget.idkpi=kpi.idkpi
inner join departmentbscs on departmentbscs.idkpi=kpi.idkpi
where iddepartment='+convert(nvarchar(max),@iddepartment)+'
group by kpiname,target, ivalues,idate)x
pivot
(
     avg(ivalues)
    for iDay in (' + @columnName + ')
) p'

execute sp_executesql @sql

Running this query on 3 different servers gave me different results in terms of elapsed time until the pivot table appears on the screen:

  • Azure - Elapsed time = 100.165 sec
  • Smarterasp.net - Elapsed time = 2.449 sec
  • LocalServer - Elapsed time = 1.716 sec

Regarding my trial account on Azure, I created it with the main goal of checking whether I would get better speed than SmarterASP when running stored procedures like the one above. For my database I chose Service Tier: Basic, Performance Level: Basic (5 DTUs), and Max Size: 2 GB.

My database has 16 tables, one table has 145,284 rows, and the database size is 11 MB. It's a test database for my app.

My questions are:

  1. What can I do to optimize this query (stored procedure)?
  2. Is Azure recommended for small databases (100 MB-1 GB)? I mean performance vs. cost.

Conclusions based on your inputs:

  • I made the suggested changes to the query and the performance improved by more than 50% - thank you, Remus.
  • I tested my query on Azure S2 and the elapsed time for the updated query was 11 seconds.
  • I tested my query again on P1 and the elapsed time was 0.5 seconds :)
  • The same updated query on SmarterASP had an elapsed time of 0.8 seconds.

Now it's clear to me what the Azure tiers are and how important it is to have a well-written query (I even understood what an index is and its advantages/disadvantages).

Thank you all, Lucian

  • Isn't that a question you should ask the Azure support? How should we know the details of your account and their equipment? – arkascha Jun 27 '15 at 08:42
  • Are you talking about SQL Azure? The performance is very dependent on the tiers. Anything less than S2 is unlikely to give you acceptable performance. And no, it's not dependent on load, individual query performance is heavily affected by the tiers. – flytzen Jun 27 '15 at 08:52
  • @Lucian - as your question stands, it's completely unclear. You haven't shown any SQL query in your question, and you haven't mentioned a single thing about either your local server or your selected SQL Database tier. That said: If it's about server performance (or tier performance), that's a question for ServerFault, not StackOverflow. If it's a query issue, then it's appropriate for StackOverflow, but not as a performance difference between different service tiers / hosting companies. – David Makogon Jun 27 '15 at 16:57
  • In defence of Lucian's question, there are very clear differences between performance at the different SQL Azure database tiers. The price/performance ratio grows faster than linearly so understanding this is very important for application design, as decisions such as sharding and caching can often be driven by the way price and performance correlates. I have had this happen more than once recently. To be clear, I am a big fan of SQL Azure - I have more than 100 databases in production in Azure. I also think the price is very competitive - but understanding this is essential for design. – flytzen Jun 27 '15 at 17:43
  • @lucian; following your edit to clarify; it's because of the Basic perf level. As per my answer below, try with an S2 as a more realistic starting level. It's worth trying all the levels one at a time, the perf diff is huge. – flytzen Jun 28 '15 at 07:14
  • @Frans, wait a minute. You have over 100 databases in Azure? If you are using S2 or better, that means that you are spending over $7,500 a month for databases! Is that right!!? That comes out to $90,000 a year. How do you justify that? And please understand, I am being genuine. I love the idea of Azure, but I'm having a hard time justifying the cost. – N1njaB0b Sep 25 '15 at 14:49
  • @N1njaB0b; well, in the old tiers it was a lot cheaper. Elastic Pools is what has saved us; If I had to have gone to the new tiers with individual databases then, yes, I would be paying that kind of money. As it is, my total annual Azure bill at the moment runs to well over USD100,000 per year. You do get discounts when you get to bigger usage. However, I have >10 separate systems for different clients and >80 servers so compared to traditional hosting and associated devops, Azure is super cheap :) Just look at how much a single SQL Enterprise license will cost you... – flytzen Sep 25 '15 at 20:28

3 Answers


(Update: the original question has been changed to also ask how to optimise the query, which is a good question as well. The original question was why there is such a difference, which is what this answer is about.)

The performance of individual queries is heavily affected by the performance tiers. I know the documentation implies the tiers are about load, but that is not strictly true.

I would re-run your test with an S2 database as a starting point and go from there.

Being on a trial subscription does not in itself affect performance, but with the free account you are probably using a Basic (B) level, which isn't really usable for anything real - certainly not for a query that takes 2 seconds to run locally.

Even moving between, say, S1 and S2 will show a noticeable difference in performance of an individual query. If you want to experiment, do remember you are charged a day for "any part of a day", which is probably okay for S level but be careful when testing P level.
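If you want to script the tier changes instead of clicking through the portal, here is a minimal sketch, assuming you run it against your Azure SQL Database server (the database name is illustrative):

-- Scale the database to Standard S2 for the test run; the change happens
-- online and takes effect after a short while.
ALTER DATABASE [MyDatabase]
MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');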

For background: when Azure introduced the new tiers last year, they changed the hosting model for SQL. It used to be that many databases would run on a shared sqlserver.exe. In the new model, each database effectively gets its own sqlserver.exe that runs in a resource-constrained sandbox. That is how they control the "DTU usage", but it also affects general performance.

flytzen
  • Your assertion about a trial account having any type of performance difference is **completely false**. There is absolutely zero difference in service performance with trial accounts, and I don't even know where you received this information. Additionally, you are giving "guesses" about how the SQL Database service is implemented under the covers. – David Makogon Jun 27 '15 at 16:55
  • I've got over 100 SQL Azure databases in production and have confirmed this with extensive testing. I have also had it confirmed directly by the people inside Microsoft who actually work on this. If you read my answer in – flytzen Jun 27 '15 at 17:03
  • Regarding trial accounts: As a Microsoft employee who works with Azure daily, you're telling me something I've never heard. Ever. – David Makogon Jun 27 '15 at 17:04
  • ... I'm referring to the different database tiers and I state an assumption that he is using a B tier in his test. By all means carry out your own tests and you'll see what I mean. – flytzen Jun 27 '15 at 17:05
  • You're stating that databases allocated from trial subscriptions have lower performance than those under paid subscriptions. And this is untrue. – David Makogon Jun 27 '15 at 17:06
  • Correct, I wasn't clear. I have updated my answer to make it clearer. Thank you for your feedback. – flytzen Jun 27 '15 at 17:19
  • @DavidMakogon there is extensive discussion of this in the SQL Azure advisors group amongst other places. If you want to discuss further, contact me offline as most of the detail is under NDA. flytzen on Twitter. – flytzen Jun 27 '15 at 17:22
  • Thanks Frans! I just ran into the OP's exact issue running on Basic. I upgraded the db to the next tier up and bam; 22-second page loads from a heavily loaded dashboard of queries just improved by 19 seconds. – JReam Nov 03 '15 at 20:10
  • @Toolkit. Not at all - very far from it, in fact. I've got databases on tier B and databases on P4; it just depends on your needs. My suggestion about S2 was as a starting point for a test. S2 is a reasonably beefy setup and, IMO, at $75/month it is extremely cheap considering the level of resilience, backup and so on you get (3 running instances, backups, world-wide point-in-time restore, threat detection, audit, automatic optimisation etc). I know people who pay thousands per month for less :) What I *am* saying is that on Azure you pay for what you use, so always optimise your queries. – flytzen Jan 05 '17 at 13:42

This is first and foremost a question of performance. You are dealing with poorly performing code and you must identify the bottleneck and address it. I'm talking about the bad 2-second performance now. Follow the guidelines at How to analyse SQL Server performance. Once you get this query to execute locally in a time acceptable for a web app (less than 5 ms), then you can ask the question of porting it to Azure SQL DB. Right now your trial account is only highlighting the existing inefficiencies.
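As a starting point for that analysis, here is a minimal sketch (the parameter values are illustrative only): turn on timing and I/O statistics around the procedure call and compare the numbers between your local box, SmarterASP and Azure.

-- Report elapsed/CPU time and logical reads for the call in the Messages tab
SET STATISTICS TIME ON;
SET STATISTICS IO ON;

EXEC [dbo].[Pivot_Per_Day] @iyear = 2015, @imonth = 6, @iddepartment = 1;

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;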

After update

...
@iddepartment int
...
iddepartment='+convert(nvarchar(max),@iddepartment)+'
...

So what is it? Is the iddepartment column an int or an nvarchar? And why use (max)?

Here is what you should do:

  • parameterize @iddepartment in the inner dynamic SQL
  • stop doing the nvarchar(max) conversion; make the iddepartment and @iddepartment types match
  • ensure indexes on iddepartment and all the idkpi columns

Here is how to parameterize the inner SQL:

set @sql =N'
Select * from (
select kpiname, target, ivalues, convert(decimal(18,2),day(idate)) as iDay   
from kpi
inner join kpivalues on kpivalues.idkpi=kpi.idkpi
inner join kpitarget on kpitarget.idkpi=kpi.idkpi
inner join departmentbscs on departmentbscs.idkpi=kpi.idkpi
where iddepartment=@iddepartment
group by kpiname,target, ivalues,idate)x
pivot
(
     avg(ivalues)
    for iDay in (' +@columnName + N')
) p'

execute sp_executesql @sql, N'@iddepartment INT', @iddepartment;

The covering indexes are, by far, the most important fix. That obviously requires more info than is present here. Read Designing Indexes, including all sub-chapters.
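Purely as an illustration, and without knowing the real schema, the indexes could look roughly like this; the key and included columns are guesses based on the query above and must be adjusted to the actual tables:

-- Guessed supporting indexes for the joins and the iddepartment filter
CREATE INDEX IX_kpivalues_idkpi
    ON dbo.kpivalues (idkpi) INCLUDE (idate, ivalues);
CREATE INDEX IX_kpitarget_idkpi
    ON dbo.kpitarget (idkpi) INCLUDE (target);
CREATE INDEX IX_departmentbscs_iddepartment
    ON dbo.departmentbscs (iddepartment, idkpi);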

As a more general comment: this sort of query befits columnstores more than rowstores, although I reckon the data size is, basically, tiny. Azure SQL DB supports updatable clustered columnstore indexes, so you can experiment with them in anticipation of serious data sizes. They do require Enterprise/Developer edition on the local box, true.
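If you do want to try that route later, a minimal sketch looks like this (the table name is taken from the query above; only worth considering once the data volume actually grows):

-- Convert the fact-like table to a clustered columnstore index
CREATE CLUSTERED COLUMNSTORE INDEX CCI_kpivalues ON dbo.kpivalues;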

Remus Rusanu
  • Yes, my query can be optimized for sure; I am a beginner and all I did was search the internet for "How to...?" :) – Lucian Bumb Jun 27 '15 at 11:18
  • @remus - This is a great comment to make, but it's not an answer; it's a suggestion for doing some analysis and posting a better question. – David Makogon Jun 27 '15 at 16:59
  • @DavidMakogon I really consider this an answer, not a comment. Lucian really needs to focus first on getting the app performing before he can consider the matter of choosing the Azure SLO. Answering *why* whatever SLO he now chooses is so much slower than an unconstrained standalone instance will not help, because I'm pretty sure he cannot simply choose a PXX and leave the app as-is. – Remus Rusanu Jun 27 '15 at 19:13

It has nothing to do with the fact that your account is a trial; it's due to the low performance level you have selected.

With the other service (SmarterASP) and your locally running instance, you probably do not have performance restrictions, only size restrictions.

At this point it's hard to say what a DTU actually means, or what sort of DTU number would correspond to a SQL Server installed on your local machine or at any other hosting provider.

However, there is some good analysis of this (https://cbailiss.wordpress.com/2014/09/16/performance-in-new-azure-sql-database-performance-tiers/), though nothing official.
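What you can do on Azure SQL Database itself is watch how close the query pushes the database to its tier limits, using the resource stats DMV (one row roughly every 15 seconds, retained for about an hour):

-- Recent CPU, data I/O and log I/O as a percentage of the current tier's limits
SELECT TOP (20)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;

If those values sit near 100% while the pivot runs on the Basic tier, it is the DTU cap, not the query alone, that stretches 2 seconds into 90+.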

Low Flying Pelican