Wednesday, March 21, 2012

How to reduce timeouts in SQL Server?

Hi,
We have designed a system that processes millions of transactions a
year. A couple of tables in the system are heavily used: on average,
about 30,000 rows are inserted into these two tables every day. The
inserts are done through stored procedures after performing some
complex logic. A few scheduled jobs that run every night also access
these two tables, and a few reports in the web system hit them for
ad-hoc reporting. In the last few months the number of transactions
has increased quite a bit, and the system has started giving us
problems: a lot of timeouts and a few deadlocks every day.
I am thinking that if I can somehow replicate these tables every 6
hours or so and then use the new tables for all reporting purposes, my
timeouts might decrease. Could you suggest how I could replicate these
two tables efficiently, and whether this would help my situation?
Any other suggestions on how I could improve my system would be
appreciated.
Thanks,
Karan
Ouch... tough question without being able to see your database.
Here are some thoughts:
1. Make sure indexes are optimized/maintained on a regular basis (read
up on DBCC SHOWCONTIG and DBCC INDEXDEFRAG).
2. Review your critical transactions and make sure that you are not causing
excessive blocking and deadlocks (this is really hard; you'll have to read
up on troubleshooting deadlocks).
3. Yes, your ad-hoc reporting should be moved out to another database. You can
use DTS, replication, or log shipping for this. We are using log shipping to
satisfy this need.
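For point #2, a quick way to see who is blocking whom at a given moment on a SQL 2000-era server is to query sysprocesses. This is only a sketch of a blocking check, not a full deadlock analysis:

```sql
-- Sketch: list sessions that are currently blocked and which spid
-- is blocking them. sysprocesses is the SQL 2000-era view; SQL 2005+
-- also exposes sys.dm_exec_requests with blocking_session_id.
SELECT spid,          -- the blocked session
       blocked,       -- the spid doing the blocking
       waittime,      -- milliseconds spent waiting so far
       lastwaittype,  -- e.g. LCK_M_X for an exclusive-lock wait
       cmd            -- what the blocked session is executing
FROM master..sysprocesses
WHERE blocked <> 0;
```

Run it while the timeouts are happening; if the same spid keeps showing up in the `blocked` column, that session is the blocker worth tracing.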
You're likely going to need to spend some serious hours pinpointing the
problem here. My guess is that option #1 above will buy you some serious
performance gains.
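The index-maintenance commands behind option #1 look roughly like this; the table name is a placeholder, not taken from Karan's actual schema:

```sql
-- Report fragmentation for every index on the table.
DBCC SHOWCONTIG ('claims') WITH ALL_INDEXES;

-- Defragment the clustered index (index ID 1) in the current
-- database (0). INDEXDEFRAG runs online but only reorders leaf
-- pages; DBCC DBREINDEX is more thorough but takes locks.
DBCC INDEXDEFRAG (0, 'claims', 1);

-- Refresh optimizer statistics after heavy insert activity.
UPDATE STATISTICS claims;
```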
Cheers,
Greg Jackson
PDX, Oregon
|||Articles for reference:
http://www.microsoft.com/technet/com...l/sql1023.mspx
http://vyaskn.tripod.com/watch_your_timeouts.htm
"Karan" wrote:

> Hi,
> We have designed a system which processes millions of transactions a
> year. There are a couple of tables in the system which are heavily
> used. These 2 tables gets about 30,000 rows inserted into them
> everyday at an average. The inserts are done through stored procedures
> after perfomring some complex logic. There are also a few scheduled
> jobs that run every night which accesses these 2 tables. In addition,
> there are a few reports that run in the web system which acces these
> table for ad-hoc reporting. In the last few months, the number of
> transactions have increased quite a bit and system started giving us
> problems such as a lot of timeouts and a few deadlockks everyday.
> I am thinking that if I can somehow replicate these tables every 6
> hours or so and then use the new tables for all reporting purposes, my
> timeouts might decrease. Could you suggest on how I could go about
> replicating these 2 tables in an efficient way? and whether this would
> help my situation?
> Any other suggestions on how I could improve my system would be
> appreciated.
> Thanks,
> Karan
>
|||I performed DBCC SHOWCONTIG on the affected tables and the stats
look OK to me. I am pasting the stats below:
DBCC SHOWCONTIG scanning 'claims' table...
Table: 'table_name' (322100188); index ID: 1, database ID: 7
TABLE level scan performed.
- Pages Scanned........................: 202265
- Extents Scanned.......................: 25349
- Extent Switches.......................: 27598
- Avg. Pages per Extent..................: 8.0
- Scan Density [Best Count:Actual Count]......: 91.61% [25284:27599]
- Logical Scan Fragmentation ..............: 0.64%
- Extent Scan Fragmentation ...............: 10.38%
- Avg. Bytes Free per Page................: 672.6
- Avg. Page Density (full)................: 91.69%
DBCC execution completed. If DBCC printed error messages, contact your
system administrator.
Please let me know if you see anything I could improve with these
stats. All other indexes on these tables have similar stats as well.
Thanks for your help.
Karan
|||Looks good to me.
The red flag would be a scan density less than 80%; you look fine in that
regard.
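Since fragmentation looks fine, blocking between the insert procedures and the reports is the more likely source of the timeouts. One common stopgap while the reporting workload still runs in the same database, assuming the reports can tolerate dirty reads, is the NOLOCK hint (table and column names below are placeholders):

```sql
-- Sketch: a report query that takes no shared locks, so it will
-- not queue behind the insert transactions.
-- Trade-off: it can read uncommitted (possibly rolled-back) rows.
SELECT claim_id, claim_date, amount
FROM claims WITH (NOLOCK)
WHERE claim_date >= '20120101';
```

Moving reporting to a log-shipped or replicated copy (option #3) remains the cleaner long-term fix.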
Greg Jackson
PDX, Oregon
