SQL Centre of Excellence

If you are looking at running SQL Server in the cloud, a key question you will ask is “should I rent or buy SQL licences?” That is, should you use the stock images and pay by the hour, or should you bring your own licences?

Some facts that will help you decide:

  • SQL Server Enterprise Edition can vary in price from roughly 16,000 euro to 24,000 euro per four cores, depending on your licensing scheme and negotiating ability.
  • You MUST pay for Software Assurance (SA) at 20% per year to be able to transfer licences to the cloud (Azure or Amazon).
  • Azure charges about 1,164 euro per month for Enterprise Edition, per four cores.
  • Buying licences with SA gives you the right to run a “DR” instance without paying for its licence. When renting, you pay for DR whenever it is turned on (not good for, say, a passive mirror).

So, assuming full retail prices, we can say the following (a rough break-even sketch follows the list):

  • If you are running for more than two years at 100% uptime, you should buy.
  • If you are powering down images, factor this in: at 50% powered down, only buy if you will be sweating the licences for more than four years.
  • If you are running “warm” DR servers, buy licences if they will run for more than a year; effectively, buying licences becomes twice as appealing.
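
A minimal back-of-the-envelope sketch of that break-even, assuming full retail prices, Software Assurance accruing at 20% a year on the bought licence, and ignoring discounts and the time value of money:

```python
# Rough break-even model, per four cores: buy (licence up front plus 20% SA
# per year) versus rent by the hour. Figures are the full-retail assumptions
# from the bullets above; real-world discounts will move the numbers.

RENT_PER_MONTH = 1_164.0  # Azure Enterprise Edition, per four cores (EUR)

def break_even_years(licence_cost: float, uptime: float = 1.0) -> float:
    """Years of use after which buying beats renting."""
    sa_per_year = 0.20 * licence_cost             # Software Assurance
    rent_per_year = RENT_PER_MONTH * 12 * uptime  # rent only accrues while running
    return licence_cost / (rent_per_year - sa_per_year)

for licence in (16_000, 24_000):
    print(f"Licence {licence:>6} EUR: "
          f"100% uptime -> {break_even_years(licence, 1.0):.1f} years, "
          f"50% uptime -> {break_even_years(licence, 0.5):.1f} years")
```

With these inputs the break-even lands roughly where the bullets above suggest: around two years at full utilisation, and considerably longer once images spend half their time powered down.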

One thing I am looking at at the moment is a scale-out architecture where two nodes are up 100% of the time and additional nodes are scaled on demand. In this hybrid case it makes sense to buy licences for the permanent nodes and “rent” on the elastic nodes.

Obviously this doesn’t take into account that renting gives you more flexibility, so it is certainly the mode of choice for temporary environments.

Hot off the press is the announcement of MS Azure support for local SSD, up to 800 GB, which should be enough for “most” TempDBs.
http://azure.microsoft.com/blog/2014/09/22/new-d-series-virtual-machine-sizes/

Now, this doesn’t have me too excited, as it’s volatile and core storage is still a bit challenging to get IOPS on, especially if you need “burst” capability. What has me super excited is the introduction of the “memory” intensive images, the D series, with very competitive pricing.

[Screenshot: D-series price on the provisioning page, €768.48 per month]

Annoyingly, the pricing is a bit misleading: the provisioning page shows the price above (€768.48 per month), but the pricing page shows a different price:

http://azure.microsoft.com/en-us/pricing/details/virtual-machines/

[Screenshot: the D-series price on the Azure pricing page]


If we look at 16-core server options with close to 128 GB of RAM, we can compare the old and new offerings against Amazon. The only way to get 16 cores before was via the A9 image, and it was really pricey; now we get local SSD thrown in for a 51% price drop (OK, with no InfiniBand, but I'm good with that trade).

             Azure (old pricing)    Azure (new)         Amazon EC2
Name         A9                     STANDARD_D14        r3.4xlarge
Cost/Month   €2,715                 €1,314              €1,092
Spec         16 cores               16 cores            16 cores
             112 GB RAM             112 GB RAM          122 GB RAM
             40 Gbit InfiniBand     800 GB local SSD    2x 320 GB local SSD

So, while the price for 16-core images has come down a lot, it's still not as competitive as the Amazon offering.

What is Azure missing to be a technically superior solution to Amazon for running SQL Server? A few things, IMO:

  • Provisioned IOPS
  • Wholesale replacement of old-fashioned magnetic media with SSD
  • Burst capability on IOPS

Amazon still holds these trump cards, and while it does, it will keep the technical edge for running SQL Server at the high end on demanding, IO-heavy workloads.

However, today's announcement is a huge leap forward. If you can leverage the local SSD and/or don't need massive IOPS, then you can pocket the savings as extra business value.

Why Pay for Professionals?

by Bob Duffy 9. September 2014 09:23

Often, especially when working for larger organisations that need to go through a procurement process, I get asked why our rates seem so high. They can hire SQL people from the global outsourcing companies at a third to a quarter of our rates, and even go to general local consultancy practices for half of them.

The preferred solution to a lot of this trouble is to move away from charging by the hour and to charge for results. That's super hard to do a lot of the time, when people are so used to watching per-hour costs rather than quality and overall business value.

Here are two of my favourite memes doing the rounds.

“There is Always Someone who will do it Cheaper”


Another favourite line floating around the last while is

“If you think hiring a professional is expensive, wait until you hire an amateur”

Load Testing SSAS in Oslo

by Bob Duffy 30. August 2014 09:47

Thanks to everyone who made it to my SQL Saturday session in Oslo today on load testing Analysis Services.

If you are interested in getting started with building a load test harness in Visual Studio, I have attached the source code I used for the demos and the AdventureWorks sample database.


Happy Load Testing!

When you need to do load testing for Analysis Services, it's common to need to capture and “replay” MDX queries. I'll go into replay tools later; the first step is to capture the queries. You will need:

  • An XMLA trace definition file capturing, at a minimum, the QueryBegin and QueryEnd events (events 9 and 10)
  • An XMLA command to end the trace
  • The ASCMD.EXE utility to automate the starting/stopping of the trace (a rough automation sketch follows this list). The utility is available here:
    http://msftdbprodsamples.codeplex.com/downloads/get/384553
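
As a rough sketch of wiring the capture together (the server name, file paths and the -S/-i switch usage are my assumptions; check the readme that ships with ascmd for your version):

```python
# Minimal sketch: start a server-side SSAS trace, leave it running for the
# capture window, then stop it, by pushing the two XMLA files through ascmd.
import subprocess
import time

ASCMD = r"C:\Tools\ascmd.exe"  # path to the downloaded utility (assumption)
SERVER = "localhost"           # SSAS instance to trace

def run_xmla(xmla_file: str) -> None:
    # -S = server, -i = input XMLA file (switches as per the ascmd readme)
    subprocess.run([ASCMD, "-S", SERVER, "-i", xmla_file], check=True)

run_xmla(r"C:\Traces\StartQueryTrace.xmla")  # trace definition (events 9 and 10)
time.sleep(60 * 60)                          # capture window, e.g. one hour
run_xmla(r"C:\Traces\StopQueryTrace.xmla")   # the XMLA command to end the trace
```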

Here’s a quick video demonstrating the process of capturing a trace and importing the results into a data table.

More details on managing server-side traces are provided at this link:

http://blogs.msdn.com/b/pfebi/archive/2013/02/12/managing-server-side-analysis-services-profiler-trace.aspx

It's hard to do capacity planning for MOLAP cubes, and a common question is whether we have enough memory, or why the server is showing zero bytes of free memory.

While the msmdsrv.exe process can consume a lot of memory between the storage engine cache and the formula cache, we need to be super careful to ALSO leave space for the FileSystemCache, which appears as the “Standby” area within Resource Monitor.

It's perfectly possible that SSAS has enough memory but the FileSystemCache is starved, so any large queries will go to physical disk rather than to the cache, which is considerably slower.

The screenshot below shows a server with lots of memory after a load test of 100,000 queries. Note that the FileSystemCache is larger than the msmdsrv.exe memory!

[Screenshot: Resource Monitor after the load test, with the Standby (FileSystemCache) memory exceeding the msmdsrv.exe memory]

You may find that SSAS has plenty of memory free but your server still has zero bytes free due to a massive “Standby” cache. You have two options:

a) Add more memory

b) Limit the FileSystemCache by using the LimitSystemFileCacheSizeMB setting in the ini file.

If you limit the cache, be aware that reads served from the FileSystemCache are super fast compared to disk, so you may take a performance hit.
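
If you want to eyeball those numbers without opening Resource Monitor, something like the following works; the counter paths are assumptions based on the standard Windows Memory and Process objects, so adjust to your environment:

```python
# Quick check: compare msmdsrv private bytes with the standby (file system
# cache) memory, using the built-in typeperf tool on Windows.
import subprocess

COUNTERS = [
    r"\Process(msmdsrv)\Private Bytes",
    r"\Memory\Standby Cache Normal Priority Bytes",
    r"\Memory\Available MBytes",
]

# -sc 1 = collect a single sample; typeperf prints CSV to stdout
result = subprocess.run(["typeperf", "-sc", "1", *COUNTERS],
                        capture_output=True, text=True, check=True)
print(result.stdout)
```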


Two lessons for me

1. Don't just use the commit size of msmdsrv.exe when looking at memory usage for Analysis Services.

2. Maybe for cache warming we should look to warm up the FileSystemCache first. This would be considerably faster if we could leverage large block sizes and sequential IO rather than the small random IO that the SSAS storage engine does. Hmmm, going to think about how to do this now… (a rough sketch of the idea is below).
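
A very rough sketch of that idea: pull the cube's data files into the Windows file system cache with large sequential reads before the queries arrive. The data folder path is hypothetical, so point it at your instance's Data directory:

```python
# Warm the FileSystemCache by reading the cube's files sequentially in large
# chunks. Ordinary buffered reads populate the Windows standby cache.
import os

DATA_DIR = r"D:\OLAP\Data\AdventureWorks.0.db"  # hypothetical cube data folder
CHUNK = 4 * 1024 * 1024                         # 4 MB sequential reads

warmed = 0
for root, _dirs, files in os.walk(DATA_DIR):
    for name in files:
        path = os.path.join(root, name)
        with open(path, "rb") as f:
            while f.read(CHUNK):  # discard the data; the read itself
                pass              # pulls the pages into the standby cache
        warmed += os.path.getsize(path)

print(f"Touched {warmed / 2**30:.1f} GB of cube files")
```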

Thanks to everyone who made it to my session at SqlBits. I'm hanging around the event, so feel free to ping me on Twitter (@bob_duffy) if you want to chat about cube processing over a beer. I love discussing the challenges of cube processing and swapping notes on how to optimise!

Attached is a copy of my slides

A few people wanted a copy of the test harness too. The test harness does the following (a rough sketch of the sequence follows the list):

  • Runs an XMLA command to start a trace
  • Runs logman to start performance counters
  • Runs XMLA to process the cube
  • Stops the performance counters
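
The attached harness is the real thing; purely to illustrate the shape of that sequence, a minimal sketch might look like this (the ascmd switches, file names and the "SSAS_Processing" data collector set are placeholders, not the attached code):

```python
# Sketch of the processing test run: start an SSAS trace, start perf counters,
# process the cube, then stop both. Assumes a data collector set called
# "SSAS_Processing" was created with logman beforehand.
import subprocess

def ascmd(xmla_file: str) -> None:
    subprocess.run(["ascmd.exe", "-S", "localhost", "-i", xmla_file], check=True)

ascmd(r"C:\Harness\StartProcessingTrace.xmla")                         # start trace
subprocess.run(["logman", "start", "SSAS_Processing"], check=True)     # counters on
try:
    ascmd(r"C:\Harness\ProcessFullCube.xmla")                          # process cube
finally:
    subprocess.run(["logman", "stop", "SSAS_Processing"], check=True)  # counters off
    ascmd(r"C:\Harness\StopProcessingTrace.xmla")                      # stop trace
```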

I have a few speaking events lined up for the rest of the year.

www.SqlBits.com – 17th–19th July

Sql Saturday Oslo, 30th August 2014

SQL Saturday Dublin, 19th/20th Sept 2014

  • World Jenga Championships Ireland v Denmark.
  • Lightning Talk – 5 minutes “call my SQL bluff”

PASS Summit 2014, Nov 4th/7th 2014

  • Load Testing Analysis Services

SQL Server Days (Belgium), 30th Sept/1st Oct 2014

Hope to see you at one of the events!

If you are coming in from Dublin Airport, there are a few options for getting to the Hilton Hotel, Charlemont, for SQL Saturday Dublin 2014:

http://www.sqlsaturday.com/310/

By Taxi

There is a taxi rank to your right as you exit the terminal at Dublin Airport. It's well managed by Dublin Airport staff and any queues move pretty quickly. A taxi to Charlemont Place will cost you around €30. The hotel can help you book a taxi back.

If you are travelling around Dublin and want to book a taxi, there is an excellent app for mobile phones to book and track taxis. You won't need this at the airport as there is a 24-hour taxi rank.

https://www.hailocab.com/ireland

By Aircoach and Luas

There is a dedicated coach between the airport and the city centre that is very comfortable and has WIFI. It also runs 24 hours a day and has GPS tracking on its website so you can see where the coaches are!

From the city centre there is a very good light rail system, and the hotel is only two stops south of the city centre (St. Stephen's Green).

Take the Aircoach Leopardstown service to Kildare Street and then walk around the corner to the LUAS stop at St. Stephen's Green (just at the top of Grafton Street). It's about a 5 minute walk. From there you can take the LUAS two stops to Charlemont, which is right by the hotel.

The Aircoach as far as Kildare Street costs €8 single per adult and €14 return. The LUAS costs €1.50 each way.

Just walk outside the airport and you'll see blue signs for the Aircoach to Leopardstown.

www.luas.ie

www.aircoach.ie

By Bike

OK, you can't actually cycle from the airport to the hotel, but if you are sightseeing there is an excellent bike rental scheme in Dublin where you can rent bikes and drop them off at set points.

The site below shows you where the pickup points are. There is one bike station right outside the hotel.

http://www.dublinbikes.ie/

The speaker line-up for #SqlSatDublin has been announced, with some 29 speakers including:

  • 7 Speakers from MSFT including a principal architect on the SQL team.
  • 15 MVPs from around the globe.
  • 3 SQL MCMs.
  • 2 Instructors from the SSAS Maestro course.



To see the schedule and register, follow this link:

http://www.sqlsaturday.com/310/eventhome.aspx


See you there!

Bob

Update: 23/06/2014. Sorry we missed the MVP flag for a few speakers! That's now updated to 15 MVPs speaking at #SqlSatDublin.
