question

RishabhPatil asked:

Cost effective way to archive history data from SQL large tables

Hi all,

I am researching how to proceed with archiving my large SQL Server tables.

Currently, we archive into a separate archive database on the same instance using an insert-and-purge approach. The archived tables have become very large, with more than one billion records each. We have found that the 1 TB hard drive is now fully occupied by these archive heap tables, and we would need to upgrade it further.

Now the question: should I consider other options for this archiving, or is adding disk storage the only way?

We mainly have three tables that are linked to each other.

My requirements are:
1] Archive all records older than 6 months from these tables.
2] Incremental archiving.
3] Run the archive every day, perhaps.


Note: We don't make much use of the archived data.

I am looking for a cost-effective solution to tackle this once and for all.

Any help would be appreciated.
Thanks

@ErlandSommarskog

sql-server-general

ErlandSommarskog answered:

Obviously, you will need disk space for that data.

But there are compression techniques, for instance columnstore indexes, and there is a special COLUMNSTORE_ARCHIVE option which gives even better compression.
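As an illustration, archive compression can be requested when creating a clustered columnstore index, or applied to an existing one by rebuilding it. The table and index names here are assumptions, not from the thread:

```sql
-- Build the archive table as a clustered columnstore index with the
-- most aggressive compression setting (names are hypothetical).
CREATE CLUSTERED COLUMNSTORE INDEX CCI_OrdersArchive
    ON ArchiveDB.dbo.Orders
    WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);

-- An existing columnstore index can also be rebuilt into archive compression:
ALTER INDEX CCI_OrdersArchive ON ArchiveDB.dbo.Orders
    REBUILD WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);
```

Archive compression trades some query CPU for a smaller on-disk footprint, which fits rarely-read archive data well.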

It sounds to me like a good thing that you have separate databases for this archive data, because then you can easily move those databases around without affecting the main database.

Whether you want to run the archiving daily, weekly, or at some other frequency depends both on the archiving itself and on how the archiving affects the main database, and maybe more on the latter.
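To keep the impact on the main database small, the daily run is usually done in batches rather than as one big transaction. A minimal sketch, assuming hypothetical tables dbo.Orders (live) and ArchiveDB.dbo.Orders (archive) with a CreatedDate column; names and batch size must be adapted to the real schema:

```sql
-- Batched incremental archiving of rows older than 6 months.
-- All table and column names are assumptions.
DECLARE @cutoff datetime2 = DATEADD(MONTH, -6, SYSDATETIME());
DECLARE @rows int = 1;

-- Empty copy of the live table, used to stage each batch.
SELECT TOP (0) * INTO #Moved FROM dbo.Orders;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Delete one small batch of expired rows and capture them...
    DELETE TOP (5000)
    FROM dbo.Orders
    OUTPUT deleted.* INTO #Moved
    WHERE CreatedDate < @cutoff;

    SET @rows = @@ROWCOUNT;

    -- ...then append the batch to the archive database.
    INSERT INTO ArchiveDB.dbo.Orders
    SELECT * FROM #Moved;

    TRUNCATE TABLE #Moved;

    COMMIT;
END;

DROP TABLE #Moved;
```

Small batches keep the transaction log and blocking on the live table under control, which matters more than the total runtime of the archive job.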


RishabhPatil answered:

Could someone please help with this?
I am not sure whether this has reached the TechNet forum.


RishabhPatil answered:

@ErlandSommarskog

Thanks, the compression-based archival sounds useful.
I will implement it and see how much space I can save. Cheers!


RishabhPatil answered:

@ErlandSommarskog

Hi, one table has a LongDescription column of type varchar(MAX), and I am getting the error below.

Msg 35343, Level 16, State 1, Line 3
The statement failed. Column 'LongDescription' has a data type that cannot participate in a columnstore index. Omit column 'LongDescription'.

How should I deal with this?
Thanks

ErlandSommarskog commented:

I guess you could refactor the table and move the LOB column to a separate table. If it turns out that many rows have the same value in this column, you could save some space by replacing the text with an id and storing each repeated value only once. Of course, it is not unlikely that all non-NULL values are different, but it can be worth investigating.

If the LOB values are largely unique, it makes more sense to key the table holding the LOB column on the same key as the main table.
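A sketch of that refactoring, under the assumption of a hypothetical archive table ArchiveDB.dbo.Orders keyed on OrderId (both names invented for illustration):

```sql
-- Side table for the LOB column, keyed the same way as the main table.
-- All names are hypothetical.
CREATE TABLE ArchiveDB.dbo.OrderDescriptions
(
    OrderId         bigint       NOT NULL PRIMARY KEY,
    LongDescription varchar(MAX) NOT NULL
);

-- Copy the non-NULL values over...
INSERT INTO ArchiveDB.dbo.OrderDescriptions (OrderId, LongDescription)
SELECT OrderId, LongDescription
FROM ArchiveDB.dbo.Orders
WHERE LongDescription IS NOT NULL;

-- ...then drop the LOB column so the columnstore index can be created
-- on the main archive table without hitting error 35343.
ALTER TABLE ArchiveDB.dbo.Orders DROP COLUMN LongDescription;
```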

You can also save space on the LOB column by applying the COMPRESS function. If you do this, you need to make the column a varbinary(MAX), and I recommend that you add a computed column:

actualtext AS convert(varchar(MAX), decompress(compressedcol))

Otherwise people may not know how to retrieve the data. (And beware, there is both decompress and uncompress, but only the former works!)
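Putting the pieces together, a minimal sketch of this pattern (table and column names are hypothetical; COMPRESS/DECOMPRESS use GZIP internally):

```sql
-- Store the LOB text compressed, but expose it as plain text
-- through a computed column. All names are hypothetical.
CREATE TABLE ArchiveDB.dbo.OrderDescriptions
(
    OrderId       bigint         NOT NULL PRIMARY KEY,
    compressedcol varbinary(MAX) NULL,
    -- Readers query actualtext and never need to know about the compression:
    actualtext AS convert(varchar(MAX), decompress(compressedcol))
);

-- Writers go through COMPRESS when inserting:
INSERT INTO ArchiveDB.dbo.OrderDescriptions (OrderId, compressedcol)
VALUES (1, COMPRESS('some long description text'));
```

How much this saves depends on how compressible the text is; long, repetitive descriptions compress well, short ones may not.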
