Hi all,
I have a stored procedure that receives XML as input and runs many checks (SELECTs) against many tables. At some point in the process it performs the required SELECTs involving a few tables (never more than 10, but some of them may have many thousands of records) and then performs inserts or updates.
My concern is concurrency: I may be dealing with a single insert, but I may also be dealing with 100 or 1000 at the same time.
Since SQL Server applies the READ COMMITTED isolation level by default, deadlocks began to occur.
Some SQL enhancements and tuning were done. Looking at the execution plan helped me understand some performance loss when reading the XML data, but the deadlocks kept happening. Recently I analyzed the SNAPSHOT isolation level and applied it successfully (no more deadlocks, since the SELECTs take no locks on the destination tables). This works like a charm for something like 150 simultaneous requests.
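For reference, this is roughly what I did (MyDatabase is a placeholder name, and the transaction body stands in for the real checks and writes):

-- Enable snapshot isolation once at the database level.
ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;
GO

-- Inside the stored procedure, before the reads and writes:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

BEGIN TRANSACTION;
    -- ... the checks (SELECTs) and the INSERTs/UPDATEs go here ...
COMMIT TRANSACTION;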
My question is how to make this process fast yet reliable. If I deal with 100 requests I can satisfy them in an acceptable time, but I'm sure that with 1000 it will not go well.
Is it possible to use Service Broker or some other queuing mechanism that would let me execute X requests at a time, and only start X more once those are done (for example, 50 requests at a time)? A rough sketch of what I have in mind is below.
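Something like this is what I'm imagining; it is only a sketch I haven't tested, and all the names (//MyApp/..., dbo.usp_ProcessRequest, dbo.usp_MyExistingProc) are placeholders, with dbo.usp_MyExistingProc standing in for the stored procedure described above:

CREATE MESSAGE TYPE [//MyApp/Request] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//MyApp/Contract] ([//MyApp/Request] SENT BY INITIATOR);
CREATE QUEUE dbo.RequestQueue;
CREATE SERVICE [//MyApp/RequestService] ON QUEUE dbo.RequestQueue ([//MyApp/Contract]);
GO

-- The activated procedure pulls one message off the queue and hands
-- the XML payload to the existing stored procedure.
CREATE PROCEDURE dbo.usp_ProcessRequest
AS
BEGIN
    DECLARE @handle UNIQUEIDENTIFIER, @xml XML;

    WAITFOR (
        RECEIVE TOP (1)
            @handle = conversation_handle,
            @xml    = CAST(message_body AS XML)
        FROM dbo.RequestQueue
    ), TIMEOUT 1000;

    IF @handle IS NOT NULL
    BEGIN
        EXEC dbo.usp_MyExistingProc @xml;  -- the existing procedure
        END CONVERSATION @handle;
    END
END;
GO

-- MAX_QUEUE_READERS caps how many activated readers run at once,
-- which is what would throttle the process to ~50 concurrent requests.
ALTER QUEUE dbo.RequestQueue WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.usp_ProcessRequest,
    MAX_QUEUE_READERS = 50,
    EXECUTE AS OWNER
);

The idea is that callers would SEND their XML to [//MyApp/RequestService] instead of calling the procedure directly, and Service Broker would never run more than 50 readers at a time. Is this the right approach?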
An important note is that the same tables involved in this process are also involved in other processes that may run at the same time (or not), and they are also used by an application that may be accessing them concurrently.
Thank you for your opinions.