Saturday, January 4, 2014

Forgot to regularly purge MSDB?! What should you consider before you start?

In the last couple of days I had to deal with the numbers below for an MSDB database that hadn't been purged for 3 years!!!! Yeah, 3 years, so I had to perform many tasks beyond the normal routine.

Let me give you a brief look at the figures:
  1. backupfile --- 10 million rows
  2. backupfilegroup --- 6 million rows
  3. backupmediafamily --- 4 million rows
  4. backupmediaset --- 4 million rows
  5. backupset --- 4 million rows
The total size of MSDB was 11 GB.

The fact is, if you simply start purging the historical backup data, you will not be able to perform any type of backup operation for hours, if not days, in addition to the growth of the MSDB log file. So here is the practical approach I used, which I would like to share, to do this smoothly:

  1. Extract all the data in the tables listed above to another SQL instance, plus these tables:
  • restorefile
  • restorefilegroup
  • restorehistory
2. Apply the missing indexes:

 Create index IX_backupmediaset_media_set_id on backupmediaset(media_set_id)
go
 
Create index IX_backupmediafamily_media_set_id on backupmediafamily(media_set_id)
go
 
CREATE NONCLUSTERED INDEX [IX_pro_Del]
ON [dbo].[backupset] ([database_name])
INCLUDE ([backup_set_id])
GO
 
CREATE NONCLUSTERED INDEX [IX_backupset_database_name_media_set_id]
ON [dbo].[backupset] ([database_name])
INCLUDE ([media_set_id])
GO
 
3. Last step: start deleting per database:
EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = 'YourDatabaseName'
GO
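
If you have many databases to clean up, a simple loop can drive the per-database purge; this is only a rough sketch (the cursor and variable names are mine), and you may want to start with the databases that have the fewest history rows:

-- Purge backup history one database at a time so each delete stays a small transaction
DECLARE @db sysname;

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT database_name
    FROM msdb.dbo.backupset;

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = @db;
    FETCH NEXT FROM db_cursor INTO @db;
END;

CLOSE db_cursor;
DEALLOCATE db_cursor;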
Once you are down to a reasonable number of rows in the tables listed above, you can get back to normal operations and delete by date instead.
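
For the ongoing by-date cleanup, something like the sketch below works; the starting cutoff, the 90-day retention, and the one-month step are assumptions you should adjust to your environment, and moving the date forward in small steps keeps each call a short transaction:

-- Purge history older than a moving cutoff, one month at a time
DECLARE @oldest datetime = '20110101';           -- assumed starting cutoff

WHILE @oldest <= DATEADD(DAY, -90, GETDATE())    -- keep roughly the last 90 days
BEGIN
    EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest;
    SET @oldest = DATEADD(MONTH, 1, @oldest);    -- advance in one-month steps
END;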

What you have done on the test instance can then be applied to your production server without any interruption to your operations (backups or growth of the MSDB log file).

Let me know if anything goes wrong ... Enjoy :)
 

Thursday, November 19, 2009

SQL Server 2008 R2 Pricing and Feature Changes

Standard Edition: Now with Backup Compression
SQL Server 2008 introduced backup compression, but it was only available in Enterprise Edition. At the time, Enterprise Edition cost around $20,000 more per processor than Standard Edition, so companies couldn’t justify upgrading to Enterprise Edition just to get backup compression. Companies had to need Enterprise for multiple features in order to stomach the price. If all a DBA needed was compression, they could buy backup compression software much cheaper than the price of Enterprise Edition.
In SQL 2008 R2, even Standard Edition gets backup compression. That’s a game-changer, and I’d expect to see smaller companies that do backup compression – and nothing else – start falling by the wayside.
In addition, Standard can now be a managed instance – it can be managed by some of the slick multi-server-management tools coming down the pike like the Utility Control Point (read my SQL 2008 R2 Utility review). It can’t be the management server itself – it can’t be a Utility Control Point – but at least we can manage Standard. It’s good to see that Microsoft recognizes all servers need to be managed, not just the expensive ones. Big thumbs up there.
Enterprise Edition: CPU Limits
In Enterprise, Microsoft giveth and Microsoft taketh away. SQL 2008 R2’s BI tools include a new Master Data Services tool. It’s targeted at enterprises with data warehouses that need to manage incoming data from lots of different sources, and that data isn’t always clean or correct. MDS helps make sure data follows business rules. This isn’t a common need for OLTP systems, so it’s only included in Enterprise, not Standard. Makes sense.
A little less easy to stomach, however, is a new set of caps on Enterprise Edition. The current SQL 2008 comparison page shows that Enterprise has no licensing limit on memory or the number of CPU sockets. SQL 2008 R2 Enterprise Edition is capped at 8 CPU sockets, and there’s a memory cap as well, but I haven’t been able to track down a public page showing the cap. The only hint is the SQL 2008 R2 edition comparison page, which notes that Datacenter Edition (more on that in a second) is licensed for “memory limits up to OS maximum.” If that weren’t a unique selling point, it wouldn’t be included in the feature list.
The more expensive Enterprise can act as the management server (Utility Control Point) for up to 25 instances. However, that doesn’t mean you need to buy one Enterprise per 24-25 Standard servers, and then manage them in pools – there’s an app edition for that.
Datacenter Edition: For, Well, Datacenters
The new Datacenter Edition picks up where Enterprise now runs out of gas. It supports more than 8 sockets, up to 256 cores, and all the memory you can afford. Or can’t afford, for that matter.
If you’re going to manage over 25 instances with the Utility Control Point stuff, Datacenter Edition can manage “more than 25 instances” according to Microsoft’s edition comparison page. I like how they worded that – they didn’t say “unlimited instances,” because there will be performance impacts associated with using Utility Control Points. The performance data collections gather a lot of data, and storing it for hundreds of instances will take some pretty high performance hardware.

Parallel Data Warehouse Edition: Sold with Hardware Only
The big new fella in town getting all the press is the artist formerly known as Project Madison, formerly known as DATAllegro. It’s a scale-out data warehouse appliance, but you won’t find this appliance at Home Depot. This version of SQL Server is sold in reference architecture hardware packages from Bull, Dell, HP, EMC, and IBM. Write one check, and you get a complete soup-to-nuts data warehouse storage engine that includes everything from the servers, SAN, configuration, and training.
I had the chance to talk with Microsoft’s Val Fontama, and I’ll post more details of that interview next week, but I have to share one quick snippet. I asked what happens when a Parallel Data Warehouse system starts to have performance issues, and he explained that the DBA will need to call in specialized Parallel engineers. You won’t be popping open this rack and installing another drawer of hard drives yourself or adding additional commodity hardware boxes to scale out your datacenter. It’s more of a sealed solution than something you have to build yourself.
I have mixed feelings about this – as a guy who loves hardware, I want to dive under the hood. However, as a guy who’s managed data warehouses, I know it’s one heck of an ugly skillset to learn on the job, and when data gets into the 5-10 terabyte range, you can’t afford to make configuration mistakes.

How Much Would You Pay For All This?
It slices. It dices. And if you call now, you can get all this for the low, low sticker price of:
• Standard Edition – $7,499 per processor (socket)
• Enterprise Edition – $28,749 per processor
• Datacenter Edition – $57,498 per processor
• Parallel Data Warehouse Edition – $57,498 per processor (but you’ll be buying this in combination with hardware anyway)
Eagle-eyed readers will note it’s about a 20% price increase from SQL Server 2008. That’s probably easy to justify on Standard Edition because Microsoft can say they’re throwing in backup compression, a feature that normally would have cost extra from third party vendors.
SQL 2008 R2 Enterprise Edition, however, won’t have quite as easy a time justifying its price increase, given that it now has CPU caps and already had backup compression anyway.

Wednesday, July 1, 2009

MS SQL Server System Databases

The Resource Database
SQL Server 2005 added the Resource database. This database contains all the read-only critical system tables, metadata, and stored procedures that SQL Server needs to run. It does not contain any information about your instance or your databases, because it is only written to during an installation of a new service pack.

The Resource database contains all the physical tables and stored procedures referenced logically by other databases. The database can be found by default in C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Binn (the mssqlsystemresource .mdf and .ldf files), and there is only one Resource database per instance. The use of drive C: in the path assumes a standard setup. If your machine is set up differently, you may need to change the path to match your setup. Additionally, MSSQLSERVER in the path is the instance name. If your instance name is different, use your instance name in the path.
In SQL Server 2000, when you upgraded to a new service pack, you would need to run many long scripts to drop and recreate system objects. This process took a long time to run and created an environment that couldn’t be rolled back to the previous release after the service pack. In SQL Server 2008, when you upgrade to a new service pack or quick fix, a copy of the Resource database overwrites the old database. This enables you to both quickly upgrade your SQL Server catalog and roll back a release.
The Resource database cannot be seen through Management Studio and should never be altered unless you’re under instruction to do so by Microsoft Product Support Services (PSS). You can connect to the database under certain single-user mode conditions by typing the command USE MSSQLSystemResource.
Typically, a DBA runs simple queries against it while connected to any database, instead of having to connect to the resource database directly. Microsoft provides some functions which allow this access. For example, if you were to run this query while connected to any database, it would return your Resource database’s version and the last time it was upgraded:
SELECT serverproperty('resourceversion') ResourceDBVersion,
       serverproperty('resourcelastupdatedatetime') LastUpdateDate

Do not place the Resource database on an encrypted or compressed drive. Doing this may cause upgrade or performance issues.



The master Database
The master database contains the metadata about your databases (database configuration and file location), logins, and configuration information about the instance. You can see some of the metadata stored in master by running the following query, which returns information about the databases that exist on the server:

SELECT * FROM sys.databases

The main difference between the Resource and master databases is that the master database holds data specific to your instance, whereas the Resource database just holds the schema and stored procedures needed to run your instance, but does not contain any data specific to your instance. You should always back up the master database after creating a new database, adding a login, or changing the configuration of the server.
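
As a quick illustration of that last point, a full backup of master is a one-liner; the backup path here is only a placeholder, so point it at your own backup location:

-- Full backup of master (the folder is an assumption; it must already exist)
BACKUP DATABASE master
TO DISK = N'C:\SQLBackups\master_full.bak'
WITH INIT, CHECKSUM;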


tempdb Database
The tempdb database is similar to the operating system paging file. It’s used to hold temporary objects created by users, temporary objects needed by the database engine, and row-version information. The tempdb database is re-created each time you restart SQL Server, and it returns to its originally configured size on each restart. Because the database is re-created each time, there is no reason to back it up. Data changes made to objects in the tempdb database benefit from reduced logging.
It is important to have enough space allocated to your tempdb database, because many operations that you will use in your database applications use the tempdb. Generally speaking, you should set tempdb to autogrow as it needs space. If there is not enough space, the user may receive one of the following errors:
❑ 1101 or 1105: The session connecting to SQL Server must allocate space in tempdb.
❑ 3959: The version store is full.
❑ 3967: The version store must shrink because tempdb is full.
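
To keep an eye on that space, a query such as the following (not from the text above, just a common check against the sys.dm_db_file_space_usage DMV) shows how tempdb pages are currently being used:

-- How tempdb space is being used right now (values converted from 8 KB pages to MB)
USE tempdb;
SELECT
    SUM(unallocated_extent_page_count)       * 8 / 1024 AS free_space_mb,
    SUM(version_store_reserved_page_count)   * 8 / 1024 AS version_store_mb,
    SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
    SUM(user_object_reserved_page_count)     * 8 / 1024 AS user_objects_mb
FROM sys.dm_db_file_space_usage;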


model Database
model is a system database that serves as a template when SQL Server creates a new database. As each database is created, SQL Server copies the model database as the new database. The only time this does not apply is when you restore or attach a database from a different server.
If a table, stored procedure, or database option should be included in each new database that you create on a server, you may simplify the process by creating the object in model. When the new database is created, model is copied as the new database, including the special objects or database settings you have added to the model database. If you add your own objects to model, model should be included in your backups, or you should maintain a script which includes the changes.
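
For example (the table and database names below are made up for illustration), adding an object to model means every database created afterwards starts out with a copy of it:

-- Add a standard table to model; this is only an illustrative sketch
USE model;
GO
CREATE TABLE dbo.DatabaseChangeLog
(
    LogId      int IDENTITY(1,1) PRIMARY KEY,
    ChangeDate datetime NOT NULL DEFAULT (GETDATE()),
    ChangedBy  sysname  NOT NULL DEFAULT (SUSER_SNAME()),
    Note       nvarchar(400) NULL
);
GO

-- Any database created from this point on inherits dbo.DatabaseChangeLog
CREATE DATABASE SampleDb;
GO
USE SampleDb;
SELECT name FROM sys.tables;   -- returns DatabaseChangeLog
GO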

msdb Database
msdb is a system database that contains information used by SQL Server Agent, log shipping, SSIS, and the backup and restore system for the relational database engine. The database stores all the information about jobs, operators, alerts, and job history. Because it contains this important system-level data, you should back up this database regularly.
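
As a quick illustration of the kind of data msdb holds, a query along these lines (just a sketch; the column choices and the 'D' filter for full backups are my own) lists the most recent full backup recorded for each database:

-- Most recent full backup per database, according to msdb's history tables
SELECT  bs.database_name,
        MAX(bs.backup_finish_date)     AS last_full_backup,
        MAX(bmf.physical_device_name)  AS sample_backup_file
FROM    msdb.dbo.backupset AS bs
JOIN    msdb.dbo.backupmediafamily AS bmf
        ON bmf.media_set_id = bs.media_set_id
WHERE   bs.type = 'D'                  -- 'D' = full database backup
GROUP BY bs.database_name
ORDER BY last_full_backup DESC;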

Data Governance Learning Plan

Seven courses to build up the knowledge of a Data Governance Program (planning & establishment) by Kelle O'Neal, with a 100-MCQ exam. 1....