Data Compression before or after loading operations? [SQL Server]

We have a process that collects and archives anywhere from 30 to 100 million rows a day. I asked one of my team members to research and come up with a suggestion for us to load the data and keep it partitioned by day. Then apply a sliding window to the oldest partitions.
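For context, a daily sliding-window design rests on a date-based partition function and scheme. The scripts below reference a `ps_history` partition scheme; a minimal sketch of how such a scheme might be defined (the function name, boundary dates, and filegroup here are illustrative, not our actual objects):

```sql
-- Hypothetical daily partition function; RANGE RIGHT puts each
-- boundary date at the start of its own partition.
CREATE PARTITION FUNCTION [pf_history] (datetime)
AS RANGE RIGHT FOR VALUES ('2010-10-01', '2010-10-02', '2010-10-03');

-- Map every partition to the same filegroup for simplicity;
-- production systems often spread partitions across filegroups.
CREATE PARTITION SCHEME [ps_history]
AS PARTITION [pf_history] ALL TO ([PRIMARY]);
```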

Following the approach in Books Online (BOL), we tested inserting into a compressed heap and then creating a clustered index on that table:

-- Insert into Partitioned Heap
BEGIN TRAN
INSERT INTO [History_201010].[dbo].[Buffer_History_Stage] ([column1], [column2], [column3], [column4])
SELECT [column1], [column2], [column3], [column4]
FROM [History].[dbo].[Buffer_History_Archive]
IF @@ERROR=0
COMMIT TRAN
ELSE
ROLLBACK TRAN

-- Results
-- (28034332 row(s) affected)
-- Time to Complete: 00:09:38

-- Create compressed, partitioned clustered index (sort in tempdb)
BEGIN TRAN
ALTER TABLE [History_201010].[dbo].[Buffer_History_Stage] ADD CONSTRAINT [PK_Buffer_History_Stage] PRIMARY KEY CLUSTERED
(
[column1] ASC,
[column2] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = ON, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, DATA_COMPRESSION=PAGE) ON [ps_history] ([datecreated])
IF @@ERROR=0
COMMIT TRAN
ELSE
ROLLBACK TRAN

-- Time to Complete: 00:07:19

-- Switch Partition to Permanent Table (should be instantaneous)
BEGIN TRAN
ALTER TABLE [History_201010].[dbo].[Buffer_History_Stage] SWITCH PARTITION 1 TO [History_201010].dbo.[Buffer_History] PARTITION 1;
IF @@ERROR=0
COMMIT TRAN
ELSE
ROLLBACK TRAN

-- Time to Complete: 00:00:00
-- Total Time to Complete: 00:16:57
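The sliding window mentioned at the top works the same way in reverse: switch the oldest partition out to an empty staging table, then merge away its boundary. A sketch under assumed names (the purge table and boundary date are hypothetical):

```sql
-- Hypothetical sliding-window purge of the oldest partition.
-- SWITCH is a metadata-only operation against an empty, identically
-- structured staging table on the same filegroup.
ALTER TABLE [dbo].[Buffer_History]
SWITCH PARTITION 1 TO [dbo].[Buffer_History_Purge];

-- Remove the now-empty boundary from the partition function.
ALTER PARTITION FUNCTION [pf_history]() MERGE RANGE ('2010-10-01');

-- Truncate or drop the staging table to reclaim the space.
TRUNCATE TABLE [dbo].[Buffer_History_Purge];
```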

In comparison, we attempted to just directly insert into a permanent table:

--Insert into Compressed, Partitioned Clustered Index directly
BEGIN TRAN
INSERT INTO [History_201010].[dbo].[Buffer_History] ([column1], [column2], [column3], [column4])
SELECT [column1], [column2], [column3], [column4]
FROM [History].[dbo].[Buffer_History_Archive]
IF @@ERROR=0
COMMIT TRAN
ELSE
ROLLBACK TRAN

-- Results
-- (28034332 row(s) affected)
-- Total Time to Complete: 00:12:43

We ran this several times on an idle test box, measuring only elapsed time; we did not examine the I/O statistics.

Based on what we experienced, it appears more efficient to insert directly into a compressed table with a partitioned clustered index.

One difference between our test and what Microsoft suggested is that they performed a bulk insert, whereas we used data already stored in another database on the same instance. Our tests were run with 28 million rows.
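As a side note, on SQL Server 2005 and later the `@@ERROR` checks in the scripts above can be replaced with TRY/CATCH, which also traps errors that slip past a single `@@ERROR` test. A sketch using the same table names:

```sql
BEGIN TRY
    BEGIN TRAN;
    INSERT INTO [History_201010].[dbo].[Buffer_History] ([column1], [column2], [column3], [column4])
    SELECT [column1], [column2], [column3], [column4]
    FROM [History].[dbo].[Buffer_History_Archive];
    COMMIT TRAN;
END TRY
BEGIN CATCH
    -- Roll back whatever is still open, then re-raise the error text.
    IF @@TRANCOUNT > 0 ROLLBACK TRAN;
    DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();
    RAISERROR(@msg, 16, 1);
END CATCH;
```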

 

Who needs DBA Skills? [#TSQL2sDay]

T-SQL Tuesday for November 2010


Databases are doing just fine. If an issue is encountered, add more hardware to the mix. If that does not work, add more servers and move databases to them. No DBAs on site, just developers and a business owner who wants to save money.

What? From that first paragraph alone I can pinpoint at least three reasons why DBA skills are required:

  1. Adding more hardware is a tactical fix, never a strategic one; the new hardware will hit the same issue the old hardware did within a matter of months.
  2. Adding more servers gets complex once databases reside on different boxes. Messing around with linked servers and executing joins across boxes is not the best approach.
  3. Licensing costs go through the roof! Yes, I have dealt with a solution where the Lead Developer decided to add 8 servers, each running SQL Server 2005 Enterprise Edition. The tab ran over $300K in licensing costs alone.

Ok, so what else will a good DBA perform to add value to any company? These come to mind:

  1. To plan a long-term solution properly: architect a database structure that is flexible to extension, following third and fourth normal forms, with proper relationships, keys, and indexes.
  2. To understand the business requirements and design a database layout that addresses them. Which tables will be transactional? Which will be historical? Which will receive bulk insert operations?
  3. To implement proper security.
  4. If data will grow too large, to prepare the underlying tables to support sliding windows that swap out old data.
  5. To make the business owners aware that a separate instance is beneficial when reporting is crucial. Nothing beats replicating data to a reporting instance and letting it be queried all day long.
  6. To make developers understand that business logic should not run in the database tier. Data validation, cleansing, and any process that requires scalar UDFs are bad candidates. If there is no way to add an app tier, CLR routines can help, but such solutions need proper justification.
  7. To see if filtered indexes and/or included columns can benefit a solution.
  8. To justify whether data compression is a good trade-off against the extra CPU cycles it costs.
  9. To suggest when mirroring, log shipping, replication, and/or clustering will be beneficial.
  10. To understand if I/O will require additional spindles and how to prove it.
  11. To understand if an edition downgrade, release upgrade, virtualization or X64 implementation will add value to any solution.
  12. To have an up-to-date checklist for production database implementations.
  13. To follow a good process: Unit Test, QA, UAT, Production.
  14. To learn from the best and to share that knowledge with the team.
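On point 8, SQL Server ships an estimator that makes the compression trade-off concrete before anything is rebuilt. A minimal example, assuming a hypothetical `dbo.Buffer_History` table:

```sql
-- Estimate row- and page-compression savings without rebuilding.
-- Schema and table names here are illustrative.
EXEC sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'Buffer_History',
    @index_id         = NULL,   -- all indexes
    @partition_number = NULL,   -- all partitions
    @data_compression = 'PAGE';
```

The resultset compares current size against the estimated compressed size per partition, which is a far better basis for the CPU-versus-storage argument than guessing.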

DBA skills are not just a nice-to-have. They are a requirement. Do it right the first time and avoid the hassle of fixing it later.

Once a developer approached me and literally told me:

Every night you should have a job that truncates the logs. Also, you need to shrink the databases to make them faster.

I am not making that up! It is a true story. Imagine if that developer were taking care of the databases. I believe that most developers have no business in the DB world.

My SQLPass Schedule [Training]

This is my SQL PASS Summit schedule. I am double-booked in some slots, as I have not yet decided which session to attend. Other than that, I just need to figure out which parties to make it to 🙂

Schedule for Oscar Zamora

Sunday, November 7

6:00 PM

PASS Booth Open

Registration

Monday, November 8

7:15 AM

Internet Pavilion

Registration

7:30 AM

Continental Breakfast

8:00 AM

Microsoft Labs

12:00 PM

Pre-Conference Lunch

Tuesday, November 9

6:30 AM

Internet Pavilion

Registration

7:00 AM

Continental Breakfast

7:15 AM

Speaker Ready Room

8:00 AM

Microsoft Labs

8:15 AM

Keynote Speaker Day 1 – Ted Kummert

10:00 AM

SQL Server Clinic

10:15 AM

Consulting – Should I Consider It?

SQLCAT: HA DR Customer Panel — Architectures & Lessons Learned (90 mins)

11:00 AM

Community Learning Center

Exhibit Hall

11:15 AM

SQLCAT: Customer Learning of using FileStream and Remote Blob Store (RBS) for a Large Scale Deployment

11:30 AM

Lunch – Birds of a Feather with MVPs

12:00 PM

Energizing the Next Generation: Encouraging and Inspiring Young Women to Choose Tech Careers

12:30 PM

PASS Summit Book Signing

1:00 PM

PASS Summit Book Signing

1:30 PM

Advanced SQL Server Deployment Techniques for Virtualization (DBA-313)

3:00 PM

Demystifying MDX in Reporting Services (BID-218)

Zero to Cube: Fast Track to Analysis Services Development (BIA-202)

4:30 PM

Inside DAX

6:15 PM

Introduction to the Entity Framework

7:00 PM

Speaking at PASS – How to Write An Abstract

Wednesday, November 10

6:45 AM

Sponsor-Hosted Breakfast (Microsoft)

7:00 AM

Continental Breakfast

Sponsor-Hosted Breakfast (VMware)

7:15 AM

Internet Pavilion

Registration

Speaker Ready Room

8:00 AM

Microsoft Labs

8:15 AM

Keynote Speaker Day 2 – Quentin Clark

10:00 AM

SQL Server Clinic

10:15 AM

Business Intelligence and the Cloud (90 mins) (90R1)

Don’t Wait Consolidate (DBA-245)

11:00 AM

Community Learning Center

Exhibit Hall

11:30 AM

Lunch

Women in Technology Luncheon (Sponsored by GoDaddy.com)

1:30 PM

50 surprising features of SQL Server Business Intelligence

Building a Comprehensive Professional Development Plan (PD-468)

3:00 PM

Building NextGen Scalable & Highly Available Architectures with Microsoft SQL Server 2008 R2 (DBA-461)

Cooking with SSRS: Advanced Report Design Recipes (BID-234)

4:30 PM

Managing system performance with the Data Collector and Management Data Warehouse with Reporting Services

Master Data Services: Fixing Data before ETL (BIA-204)

Thursday, November 11

6:45 AM

Registration

Sponsor-Hosted Breakfast (Quest Software)

7:00 AM

Continental Breakfast

7:15 AM

Internet Pavilion

Speaker Ready Room

8:00 AM

Microsoft Labs

8:15 AM

Keynote Speaker Day 3 – David Dewitt

10:00 AM

SQL Server Clinic

10:15 AM

Hardware 201: Selecting and Sizing Database Hardware for OLTP Performance (DBA-242)

SQL Azure Data Sync – Integrating On-Premises Data with the Cloud

11:30 AM

Boxed Lunch

12:00 PM

Chapters Lunch

PASS Summit Book Signing

12:30 PM

PASS Summit Book Signing

1:00 PM

Real World Analysis Services Stored Procedures (BIA-206)

2:30 PM

Automating SQL Buildouts with Hyper-V and SQL Server 2008 R2 (DBA-249)

Monitor Your Business with PerformancePoint Services Monitoring and Analytics (BID-217)

4:00 PM

Optimizing Large-Scale OLTP Workloads

Friday, November 12

7:00 AM

Internet Pavilion

7:15 AM

Registration

7:30 AM

Continental Breakfast

8:30 AM

SharePoint for the DBA

12:00 PM

Post-Conference Lunch

Incorrect BIOS Settings Can Reduce Database Performance [SQL Server]

Sam Saffron commented on my blog post, asking if I could share my thoughts on an issue he was experiencing with a newly migrated SQL Server 2008 R2 database solution running on Nehalem-based CPUs. I went ahead and shared what I had experienced in my own environment.

It turned out that the issue was not Hyper-Threading related, but a BIOS misconfiguration. This is an excerpt from his blog post:

In the process of asking the question and researching it ourselves, we discovered the answer. These Dell servers were inexplicably shipped with BIOS settings that …

  • did not allow the host operating system to control the CPU power settings
  • did not set the machine to high performance mode
  • did not scale CPU speed under load properly

… kind of the worst of all worlds. But Kyle quickly flipped a few BIOS settings so that the machine was set to “hyperspeed mode”, and performance suddenly got a lot better. How much better?

The conclusion from this experience: check your BIOS configuration. Dell should tell its customers which settings are beneficial for specific workloads, such as SQL Server.