SQL Server security best practice

Security! This is the word that comes to the mind of every concerned person when it come...

Change the Collation Settings in MS SQL Server

This post will show you how to change the collation settings in MS SQL Server for specific database...

Resolve collation conflict

In this post I will show you how you can resolve collation conflict error...

Book: SQL Server 2008 High Availability

In this book I have tried to cover every single piece of information that might be required for installing and configuring SQL Server HA options like Clustering, Replication, Log Shipping and Database Mirroring...

Why to recompile Stored Procedure

Generally, we create views and stored procedures (proc here after) ...

25 July 2016

Learn About SQL Server Master Data Services 2016

SQL Server 2016 comes with a new release of Master Data Services (MDS). Several new features have been added to improve its performance and to help database administrators manage master data properly. In the following discussion, we will cover the changes that have taken place in SQL Server Master Data Services 2016.

In-Depth Study of Latest Features in Master Data Service

The various changes that have taken place in MDS in SQL Server 2016 are discussed below.

Configuration control

MDS 2016 gives control over logging configuration for models. Models enclose the entities, and the entities contain data; each has log tables that keep growing with transactional activity, so controlling retention is a necessary feature. In MDS 2016, the user can control it with the Log Retention Days setting under Model Configuration. The default value is -1, which means the log tables are never cleaned. If the value is 0, the log tables preserve only the current day's work and the logs of previous days are truncated. If the value is greater than 0, the log is retained for the number of days specified by the value. The user must take care while changing this setting, as these tables contain the history of transactions, validation, and staging batches.

Data Compression

Data compression at the row level can now be applied to the entity data and its indexes, which minimizes I/O operations. It is helpful when the data values are not very wide. However, compression increases CPU usage, which needs to be considered when there are limits on CPU capacity.

Explicit Hierarchies

SQL Server Master Data Services has two kinds of hierarchies: explicit and derived. Explicit hierarchies are built for specific purposes where the number of levels is dynamic, while derived hierarchies model one-to-many relationships between entities. In SQL Server 2016, explicit hierarchies are deprecated.

Entity Dependencies

In the previous edition of MDS, a relationship between entities could be created in the Excel Add-in, but there was no way to view it on the web. The new edition of MDS has a tab in the MDS Explorer that makes entity dependencies visible on the web.

Functioning of Security

There is a new permission in SQL Server 2016, the Super User function, which works like a server administrator. In the previous edition of MDS, server admin access was granted to the user who configured MDS and could not be changed; the new Super User permission in Master Data Services 2016 resolves this limitation.

Detailed-level Security

In the previous edition of MDS, there were three permission modes (Update, Read-Only, and Deny), which made it impossible to grant create-only or update-only access. In the new version, six modes are available: Deny, Read, Create, Update, Delete, and Admin.

Display Attributes Name

In the latest edition, the user can change the display name of the Code and Name columns and set any desired name. These names were unchangeable in prior editions of MDS.

Conclusion

The above discussion describes the latest features in SQL Server Master Data Services 2016 and should give users a deeper understanding of MDS 2016 so they can put it to proper use.

24 June 2016

Problems When Running DBCC SHRINKFILE On SQL Server Database

Overview of the Situation

While working with data files in SQL Server, many issues may occur that lead to frustrating situations. One such situation occurs when running DBCC SHRINKFILE on a SQL Server data file. In the discussion below, you will learn the causes of this issue and the relevant solutions to overcome it.

Reason for Occurrence of Issue

Fragmentation plays a major role in the issues seen when running DBCC SHRINKFILE on a SQL Server data file. It occurs in two ways:

  • First is file fragmentation. Because the default growth increment is only 1 MB, the file commonly has to grow many times. Although file fragmentation is not directly linked to the fragmentation within a SQL table, it causes extra I/O operations.
  • The other form is index fragmentation. It arises when SQL Server splits a page to make room for the insertion of a new row. Index fragmentation can be measured with sys.dm_db_index_physical_stats, as shown below.
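For reference, a query along these lines (the database name MyDatabase is an assumption; run it in the context of that database) can be used to check index fragmentation:

-- Measure fragmentation for every index in the database (LIMITED is the cheapest scan mode)
SELECT  OBJECT_NAME(ips.object_id)          AS table_name,
        i.name                              AS index_name,
        ips.index_type_desc,
        ips.avg_fragmentation_in_percent,
        ips.page_count
FROM    sys.dm_db_index_physical_stats(DB_ID('MyDatabase'), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN    sys.indexes AS i
        ON i.object_id = ips.object_id AND i.index_id = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;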

Workaround with the Situation

DBCC SHRINKFILE shrinks an .mdf file, either to a specific target size or to the smallest size possible. It moves pages from the end of the file to the first free pages at the front of the file, which means the pages end up in reverse of their original order.
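For reference, a typical call looks like this (the logical file name and the target size in MB are assumptions, not values from the post):

USE MyDatabase;
GO
-- Shrink the data file down to roughly 1 GB (the target size is specified in MB)
DBCC SHRINKFILE (N'MyDatabase_Data', 1024);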

The user can remove the fragmentation by rebuilding the index with DBCC DBREINDEX, or by using ALTER TABLE (or ALTER INDEX) with the REBUILD option. A rebuild script along the lines of the sketch below can be used:

PK_Consume_Space
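The original script was not reproduced in the post; a minimal rebuild sketch, assuming the index PK_Consume_Space lives on a hypothetical table dbo.Consume_Space, would look like this:

-- Rebuild the index; a fill factor below 100 leaves free space on each page for new rows
ALTER INDEX PK_Consume_Space ON dbo.Consume_Space
REBUILD WITH (FILLFACTOR = 90);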

This removes the fragmentation, after which the user can measure both the space used by the database and the remaining fragmentation.

Sometimes the user will notice that the size of the database has grown, for example from 18 to 42 megabytes. This is because the data is copied to new space in the file, so for a while there are two copies of the data on disk. If the rebuild is run with extra free space in the defragmented index (a lower fill factor), space is reserved for new rows to be inserted without page splits. This boosts insert performance and helps keep fragmentation from returning. The user can also note the fragmentation percentage, which signifies how well the physical order of rows within the file matches their logical order; good ordering lets the data be read more efficiently.

The following are effective ways to address the problem:

  • Keep extra free space in the file. If the database keeps growing or is very dynamic, the extra space is important to avoid file growth during production hours.
  • Another option is to rebuild the complete file. This requires CREATE INDEX with the DROP_EXISTING clause for all the indexes present in the file and can be difficult. The easier solution is to use ALTER INDEX REORGANIZE or DBCC INDEXDEFRAG, which removes the fragmentation, as sketched below.
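A minimal sketch of the reorganize option, reusing the same hypothetical index and table names as above:

-- REORGANIZE compacts and reorders leaf-level pages online, without a full rebuild
ALTER INDEX PK_Consume_Space ON dbo.Consume_Space REORGANIZE;

-- Deprecated equivalent, shown only because the post mentions it:
-- DBCC INDEXDEFRAG ('MyDatabase', 'dbo.Consume_Space', 'PK_Consume_Space');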

Conclusion

The discussion above covers some reasons for the issues seen when running DBCC SHRINKFILE on a SQL Server data file, along with solutions that help remove those issues.

15 May 2016

SQLPASS SUMMIT 2016 Discount Code

SQLPASS SUMMIT 2016 is around the corner, only a few months left for this super event. You have probably all heard about the SQLPASS SUMMIT, but let me just brief you about it. The event happens every year during October in different cities in the USA and runs for three days. Well, you might ask why you should bother about an event that runs for 3 or 4 days. I would say, my friend, you just need to go to the SQLPASS SUMMIT 2015 website and read some numbers; you will be surprised to see that thousands of people travel to attend this SUMMIT and to listen to renowned speakers, including MCMs and MVPs. Edwin Sarmiento ( T | B ), my colleague, SQL MVP, MCM and friend, is kind enough to write a guide for SQLPASS SUMMIT first-timers.

So, let me end the suspense here. Want to attend the SQLPASS SUMMIT 2016? Then I have something that will make you happy: a discount code worth $150, yes, you've read that right. While you are registering, provide the details below to avail the $150 discount.

Your Chapter: Surat User Group
Unique Discount Code: LC15KCF8
*The discount code cannot be combined with any other PASS Summit 2016 registration, and cannot be applied retroactively

I would also suggest that you:
  1. Don't forget to read the guide for SQLPASS SUMMIT first-timers by Edwin Sarmiento ( T | B )
  2. Use the unique discount code when you register for SQL PASS SUMMIT
  3. I am attending the SUMMIT this year; if you happen to attend too, let's meet. I will be on Twitter




Thank you
Hemantgiri S. Goswami

26 April 2016

How to Recover Data From Corrupted Log File In SQL Server ?

Introduction

SQL Server is a database platform designed for large-scale transaction processing, e-commerce applications, data mining, and so on. It is widely used in business environments for data analysis, data integration, and processing components.

Structured Query Language (commonly known as SQL) is a programming language used for managing data held in a relational database management system (RDBMS); it consists of a data definition language, a data manipulation language, and a data control language. A SQL Server database consists of three kinds of files:

  1. Primary Database File (MDF file)
  2. Secondary Database File (NDF file)
  3. Log file (LDF file)

Among these three, the log file maintains a record of all the transactions performed on the SQL Server database so that the information can later be used to recover the database. At least one log file must exist for each database, and more than one log file can be created for a single database. The file uses the .ldf extension.

How Does a SQL Server Log File Get Corrupted?

The log files of a SQL Server database may become corrupted when some action on the database is interrupted midway. In that case, the user first has to repair and recover the corrupted SQL Server log file and then continue the transaction.

Causes of Log File Corruption

  • Viruses or other malicious software: Many viruses can infect and damage the log files and make them inaccessible.
  • Terminating the system abnormally: If the system or application quits abnormally, files are prone to corruption or damage.
  • Input/output configuration: The I/O subsystem is a vital component of the database server, used to store the system and user databases. If its configuration is disturbed or changed, it may lead to corruption in the log files.
  • Storage size issues: One of the biggest reasons behind log file corruption is storage size. If the data exceeds the limit of the LDF file, corruption is likely to occur.

How to Repair Corrupt SQL Log File ?

We will deal with the corrupted log file by repairing it and recovering data from it using SQL Server commands, on a trial-and-error basis. The user may apply any one of the following workarounds; if none of the techniques resolves the problem, it means the log file is severely corrupted and can only be repaired using a third-party tool.

Solution 1: Using Backup of Log (LDF) File

If users take log backups regularly, they can easily restore their data from those backup files.

Solution 2: If Event Log Files are Corrupted

If event log files are corrupted, follow the following steps:

  1. Right-Click on the system folder.
  2. From the menu displayed, select Properties and then click on Clear Log.
  3. A warning box will ask whether you want to archive the system log files; click Yes to continue.
  4. Now save the file anywhere on your system.
  5. You will be back at the Properties window; click OK.
  6. Close and reopen the application window to see the changes.

Solution 3: Put Database Online & Offline

If the crash recovery function is enabled, bringing the database online triggers a transaction rollback. Follow these steps:

  1. Put the database into offline mode.
  2. Clear the active log by issuing a checkpoint so that the corrupted log records can be discarded.
  3. Switch the database back to online mode and try to access the log files.

Solution 4: Recover using DBCC CHECKDB

To repair a corrupt LDF file, run DBCC CHECKDB with the WITH TABLOCK option. It can help recover data when the LDF file has been damaged for reasons such as logical corruption.
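A minimal sketch of this option, assuming a database named dbname; TABLOCK makes CHECKDB take table locks instead of using an internal database snapshot:

DBCC CHECKDB (N'dbname') WITH TABLOCK;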

Solution 5: Recover using CHECKDB Command:

For rebuilding log files, another technique is to put the database into emergency mode and then execute the CHECKDB command. To change the database mode, execute the following command:

ALTER DATABASE dbname SET EMERGENCY, SINGLE_USER

Then, to repair the database using the CHECKDB command, execute the following:

DBCC CHECKDB('dbname', REPAIR_ALLOW_DATA_LOSS)
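Put together, the emergency-mode sequence would look roughly like this (the database name dbname is a placeholder; REPAIR_ALLOW_DATA_LOSS can discard data, so copy the database files first if at all possible):

-- Put the database into emergency mode and restrict access to a single user
ALTER DATABASE dbname SET EMERGENCY;
ALTER DATABASE dbname SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Repair; this may rebuild the log and can lose data
DBCC CHECKDB (N'dbname', REPAIR_ALLOW_DATA_LOSS) WITH NO_INFOMSGS;

-- Return the database to normal multi-user access once the repair completes
ALTER DATABASE dbname SET MULTI_USER;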

Bottom Lines

We end up with the fact that the manual procedures have limitations: they do not guarantee an exact solution for repairing a corrupted transaction log file in SQL Server, being essentially trial and error. Another drawback is that they are complex, and if something goes wrong the result may be complete database loss. Therefore, the most reliable and trouble-free method is to use a SQL Server transaction log recovery tool, which repairs and recovers data from corrupted log files while maintaining data integrity; the user can export LDF entries into the supported output formats and read them without depending on the server.

25 March 2016

Configure SQL Server Database Mirroring: Step By Step

Database mirroring in SQL Server is deployed as a method to increase the availability of a SQL Server database in the event of a disaster. It can only be implemented for databases that use the full recovery model and works on a per-database basis. In database mirroring, two copies of a single SQL Server database are stored on different, physically separated servers. While the principal server serves the database to clients, the mirror server acts as a standby that takes over the principal's role if a failure occurs.

In this write-up, we will discuss the procedure to configure SQL Server database mirroring.

Procedure to Configure Database Mirroring in SQL Server

The procedure of configuring SQL Server Database Mirroring is mainly divided into two sections:

I. Preparation of server instances to take part in Database Mirroring in SQL Server

For every database mirroring session, the following requirements need to be fulfilled:

  1. Separate server instances, residing on separate host systems, should host the primary server, the mirror server, and the witness server.
  2. All the server instances need a database mirroring endpoint. If you need to create one (see the endpoint sketch after this list), make sure that the endpoint is accessible to the other server instances as well.
  3. Two types of transport security exist for database mirroring:
    • Windows Authentication
    • Certificate-based Authentication
  4. The network access requirement depends on the form of authentication that is being used:
    • If Windows Authentication Is Used
    • In case the server instances run under different domain user accounts, each account requires a login in the master database of the other instances. If the login does not exist, it must be created.

    • If Certificates Are Used
    • To enable certificate authentication on a server instance, the administrator must configure all the server instances with certificates for both outbound and inbound connections. Note that outbound connections need to be configured before inbound connections.

  5. Make sure that the logins used by the database exist on the mirror server.
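If you need to create an endpoint yourself, a minimal sketch for Windows Authentication looks like this (the endpoint name Mirroring_Endpoint and port 5022 are assumptions; run it on each server instance):

-- Create a mirroring endpoint listening on TCP port 5022
CREATE ENDPOINT Mirroring_Endpoint
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (AUTHENTICATION = WINDOWS NEGOTIATE,
                            ENCRYPTION = REQUIRED ALGORITHM AES,
                            ROLE = PARTNER);
GO
-- Verify the endpoint state and role
SELECT name, state_desc, role_desc FROM sys.database_mirroring_endpoints;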

II. Establishing Database Mirroring in SQL Server

Pre-requisites:

  1. Create the mirror database by restoring all the backups mentioned below, using WITH NORECOVERY on every restore operation (see the restore sketch after this list):
    • Restore the most recent full database backup of the principal database. Make sure the principal database used the full recovery model at the time of the backup, and that the mirror database has the same name as the principal database.
    • If any differential backup was taken after the full backup, restore the most recent differential backup.
    • Restore all the log backups taken since the differential or full backup.
  2. Mirroring can be set up either using Transact-SQL or with the Database Mirroring Wizard in SQL Server Management Studio. Here, we are going to use SQL Server Management Studio; a Transact-SQL sketch is shown after the wizard steps.
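For reference, a minimal restore sketch for the mirror server, assuming a database named MirrorTestDB and hypothetical backup paths:

-- Restore the full backup, then every subsequent log backup, all WITH NORECOVERY
RESTORE DATABASE MirrorTestDB
    FROM DISK = N'\\BackupShare\MirrorTestDB_Full.bak'
    WITH NORECOVERY;

RESTORE LOG MirrorTestDB
    FROM DISK = N'\\BackupShare\MirrorTestDB_Log1.trn'
    WITH NORECOVERY;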

Procedure for Configuring SQL Server Database Mirroring Via SQL Server Management Studio

Follow the below-mentioned steps for establishing database mirroring in SQL Server:

  1. Open SQL Server Management Studio.
  2. Connect to the Primary server.
  3. Click on View option and select Object Explorer option from the drop-down menu.
  4. Click on the Server Name and expand the server tree.
  5. Click on Databases and expand it.
  6. Select the database that has to be mirrored.
  7. Right click on the database and select Tasks option.
  8. Click on Mirror option. Database Properties dialog box with Mirroring page will open.
  9. Click on Configure Security option to start the configuration.
  10. The Configure Security wizard will automatically create the database mirroring endpoint on each server instance and will enter the server network addresses.
  11. Make sure that the below-mentioned conditions exist.
    • You are connected to principal server instance.
    • Security is configured in a correct manner.
    • The TCP addresses of the principal and mirror servers are specified.
    • If the operating mode is High Safety with automatic failover, then make sure the TCP address of the witness server is also specified.
  12. Click on Start Mirroring to start the process.
  13. If you want to change the operating mode, you can do so and save the changes by clicking OK.
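If you prefer the Transact-SQL route mentioned in the prerequisites instead of the wizard, the partner assignment looks roughly like this (server names, database name, and port are assumptions; run the statement on the mirror first, then on the principal):

-- On the mirror server instance: point at the principal
ALTER DATABASE MirrorTestDB SET PARTNER = 'TCP://principalserver.domain.com:5022';

-- On the principal server instance: point at the mirror
ALTER DATABASE MirrorTestDB SET PARTNER = 'TCP://mirrorserver.domain.com:5022';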

Conclusion

With the help of the above-mentioned procedure, you can easily configure SQL Server Database Mirroring.

20 January 2016

SQL Server Community Activities under Surat User Group

I am very pleased to announce that I will be speaking about SQL Server Storage Structure, a Level 100 session, on 23rd Jan 2016 at 5 PM.


Well, this will be my first appearance as a speaker since 2012. As this is an on-demand session the audience will be limited to around 10~20 people, but I will be very happy to speak. There are some announcements to be made after the discussion at this UG meet, and I will write another post to make those announcements public.

The Venue will be:

Inkey Solutions
406, Empire State Building, Ring Road, Nr Udhna Darwaja

12 January 2016

Blog Roll - List of blog posts and script contributions

Greetings of the day!!


I am getting frequent emails asking why I do not write as often as I used to. I know, I have published very few blog posts since 2013; the reason was that some personal commitments kept me too busy to take time off and write. However, I tried to keep the momentum by posting a couple of blogs every year, but on another blog space, my employer's blog at http://www.pythian.com/blog/author/goswami/ . I have also contributed some scripts to the TechNet Script Gallery. Below is the list of the blog posts I made last year and the Script Gallery contributions.

I am determined to post at least a couple of entries each month now; I would appreciate your support as always!!


Links to the blog posts
  • Jan 5 2016: SQL Server 2016 – AlwaysOn Basic Availability Group (http://www.pythian.com/blog/alwayson-basic-availability-group-sql-server-2016/)
  • July 31 2015: SQL Server and OS Error 1117, Error 9001, Error 823 (http://www.pythian.com/blog/sql-server-and-os-error-1117-error-9001-error-823/)
  • July 9 2015: Reading System Logs on SQL Server (http://www.pythian.com/blog/reading-system-logs-on-sql-server/)
  • July 20 2015: Reading System Logs on SQL Server - Part 2 (http://www.pythian.com/blog/reading-system-logs-sql-server-part-2/)
  • Sep 30 2015: Import / Export Multiple SSIS Packages (http://www.pythian.com/blog/importexport-multiple-ssis-packages/)
  • July 28 2014: Unexpected Shutdown caused by ASR (http://www.pythian.com/blog/unexpected-shutdown-caused-by-asr/)
  • Jan 23 2014: Script to Collect Database Information Quickly (http://www.pythian.com/blog/script-to-collect-database-information-quickly/)


Links to the Technet script Gallery

  • Nov 22 2015: Script to Collect ALL Database Information with VLF Count (https://gallery.technet.microsoft.com/Script-to-Collect-ALL-82664699)
  • May 3 2013: Collect Cluster Information using TSQL (https://gallery.technet.microsoft.com/scriptcenter/COLLECT-CLUSTER-INFORMATION-9a75e4a7)
  • Mar 9 2013: Configure Auto Growth in Fixed MB (https://gallery.technet.microsoft.com/scriptcenter/Configure-AutoGrowth-in-f4f3d7d1)
  • Jun 26 2015: Script to Monitor Database Mirroring Health (https://gallery.technet.microsoft.com/scriptcenter/Script-to-monitor-database-0f35c5d7)
  • Jun 26 2015: Script to Monitor AlwaysOn Health (https://gallery.technet.microsoft.com/scriptcenter/TSQL-for-AlwaysOn-Health-6aae827d)

03 August 2015

SQL Server GEEKS Annual summit 2015


Last month, I requested my manager's approval to attend the SQL Server Geeks Annual Summit 2015, and now that I have heard back from him, I am really very excited that I will be attending.

You must be wondering what is so exciting about this Summit? Well, let me tell you, there are a number of reasons, at least for me, to be that excited.

  • In the past (even this year), when I had a chance I could not attend the PASS Summit. And I always used to wonder why something like this was not organized in INDIA. Well done Amit Bansal ( Blog | Twitter ) and team; you have done it, and, like other folks, I am thankful.
  • I will be able to meet friends whom I have never met in person despite knowing them for such a long time, and some I will be meeting again after several years. For example Satya Shyam K Jayanti ( Blog | Twitter ), Amit Bansal ( Blog | Twitter ), Prashant Kumar ( Blog | Twitter ) and probably Denny Cherry ( Blog | Twitter ), with whom I had a few email exchanges while I was authoring my book on SQL Server high availability for beginners.
  • I have always believed in learning; learning is a never-ending process, and if we want to make progress we have to keep learning. SQLServerGeeks' Annual Summit will provide that platform to learn something new and exciting, for instance BI, Cloud and Big Data. Learning these technologies from experts will be an advantage.

There are some personal reasons as well for me to attend this event, as I already mentioned – to meet a few friends in person, and to make a few more. I have known some of these friends since I was an aspiring DBA, through my time as an MVP, and while I was growing as a professional.

I have known Satya Shyam K Jayanti since I was an aspiring DBA, and I learned many things from him. Amit Bansal I have known since 2007, when I was an MVP; though we have not met in person, we have had many occasions to talk over phone, email and chat. Prashant Kumar is my colleague and friend; we have already met in person and we talk often, as he works in technical escalation, one level up from me. I will also meet Pinal Dave ( Blog | Twitter ) if he is in the city; he needs no introduction in the SQL Server community, and he is like a brother to me. There are others whom I have read and who would be fun and exciting to meet if time permits: Benjamin Nevarez, Sarabpreet Singh and Manohar Punna.

I would like to thank them all and wish them a belated Guru Purnima, as I have learned things from every one of them. I have always admired the dedication and zest that Pinal and Amit have shown for the community, and I hope someday I will be able to match that level of energy and dedication. Sorry if I missed naming a few; if I wrote about them all, this list and post would never end.

Lastly, I would like to thank Amit once again for organizing this event and providing a platform to learn and socialize with like-minded community people. I would also like to thank my manager John Eisbrener, our CEO Paul Vallee, and my company Pythian for sponsoring me so that I can enjoy the company of like-minded geeks and learn something new.

See you there at #SSGAS2015!!!!

09 November 2013

SSAS– Resolving target machine actively refused


I have always wanted to get hands-on with SSAS, SSIS and SSRS, as I feel short-handed in these areas. Yesterday I decided to get my hands dirty with one of them – SSAS. A few months back I had created a virtual lab on my 4-year-old Dell laptop with 3 VMs and a total of about 4 instances of SQL Server – 2 on the host machine and 2 on the guests, with SSAS installed on one of the named instances.

With a lot of excitement I tried connecting to SSAS – it was an unsuccessful attempt!! It showed me an error message which says: A connection cannot be made to redirector. Ensure that 'SQL Browser' service is running. (Microsoft.AnalysisServices.AdomdClient)


As we always do, I Googled, and it led me to the Connect item filed by Aaron Bertrand ( Blog | Twitter ). Reading through the post I came to know that if you try to connect to a named SSAS instance, there is a chance you will get this very error. I had a similar case: I had renamed my machine, hence this error showed up with the named instance.

Workaround: As suggested in the comments section, I granted NTFS permission on the folder program files (x86)\microsoft sql server\90\shared\asconfig to the SQL Browser service account, and I was sorted!!

Reference: Microsoft Connect ID 679792 

photo credit: JLaw45 via photopin cc

—Hemantgiri S. Goswami (http://www.sql-server-citation.com)

03 November 2013

SQL Server–Security Risk Assessment PMP Way

Recently I had a chance to speak with my friend Ashish Sharma, who is also my colleague. We were discussing project management topics (he recently passed the PMP and ACP exams). Our discussion ended at a very interesting topic – risk assessment.
Definition: According to the PMBOK, projects can have unforeseen events or activities that impact the project's progress, result or outcome in a negative or positive way. A risk can be assessed using two factors – impact and probability. Determining the quantitative or qualitative value of the risk related to a concrete situation and a recognized threat is known as risk assessment.
So, coming to the point, how do we do risk assessment for database systems, especially SQL Server?
I myself have never been in a situation where I created or used a risk assessment register for SQL Server. I wondered whether a readily available risk assessment register or document existed somewhere, so I decided to research it a little more; searching for this topic, and for related terms like SQL Server security risk assessment, I found absolutely nothing.
After spending some time researching, I did find a good sample risk assessment register document that we can use; you can download it from here.
Now that we have a risk register, how about threats? Some time back, in 2009, I wrote an article on SQL Server security; using that article I have tried to pen down some threats that I can think of at the moment:
Possible threats and risk mitigation:

Virus Attack: Install AV on the DB server, but do not forget to exclude the SQL Server files.

Unauthorised Logins: Always audit failed login attempts (C2, Windows, SQL or both) and analyze them regularly. Always rename the SA / admin account and use a strong password mechanism for all the servers. If possible, always try to use domain-authenticated accounts.

MiM Attack: To mitigate a man-in-the-middle attack, do the following:
  • Isolate the database server from the Application / IIS server
  • Configure SQL Server to use a non-default port
  • Use encryption / SSL
  • Use firewall inbound / outbound rules

Root Access: To avoid someone getting root access, do the following:
  • Disable / rename the built-in admin account
  • Grant least privilege to the service account
  • Use strong / random passwords

SQL Injection: Do the following:
  • Avoid using dynamic SQL when you can
  • Use parameterized queries / stored procedures
  • Always validate the value at the form level before it is passed to the database server
  • Refer to the items above

Known Configuration: Always try to customize the default configuration if you can, for example:
  • Run SQL Server on a non-standard port
  • Disable / rename the SA account
  • Disable xp_cmdshell and other extended procedures (a small sketch follows below)
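As an illustration of the last two items, a minimal hardening sketch (the new login name svc_dba_admin is an assumption):

-- Rename and disable the built-in sa login
ALTER LOGIN sa WITH NAME = [svc_dba_admin];
ALTER LOGIN [svc_dba_admin] DISABLE;

-- Make sure xp_cmdshell is turned off
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;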

References :
http://en.wikipedia.org/wiki/Risk_assessment
http://en.wikipedia.org/wiki/Project_risk_management
http://www.brighthubpm.com/risk-management/3247-creating-a-risk-register-a-free-excel-template/
photo credit: kenteegardin via photopin cc
-- Hemantgiri S. Goswami (http://www.sql-server-citation.com )

21 October 2013

MSDB– cleanup is necessary

Recently I was asked to work on a task to reduce the size of the MSDB database; it had grown to 20 gigs. Well, you might say, what is the big deal with that? 20 gigs is not considered big for a database. Correct, but for MSDB, yes it is.
So, the questions are: why has the size grown this much, and what adverse impact would it have on the performance of my system?
I would say there could be many reasons, like:
  • Usually we do not create user objects inside MSDB, but it is good to check
  • There are multiple large SSIS/DTS packages stored in MSDB; check with the development team whether they can be stored in the file system instead. For the list of tables involved, refer to the links for SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, and SQL Server 2012
  • There isn't any cleanup job configured
  • There are several hundred jobs running, e.g. LS aka log shipping
  • And so on….
The case:
The client has a server configured with LS for DR purposes. The log shipping is set to sync every 15 minutes for several hundred databases, which in turn inserts lots of data into history tables such as backup, restore and log_shipping_monitor_history – all of them had more than 75 lakh (7.5 million) records.
The issue for us was that MSDB was placed on the local drive, i.e. where the OS and binaries reside – no RAID. Also, the C drive was nearing its 30-gig capacity and quickly filling up. On top of this, the database was in the FULL recovery model. The reasons MSDB grew to 20 gigs are:
  1. It never had a cleanup job on it,
  2. Hundreds of databases keep inserting backup and restore history records
Adverse impact:
  1. Possibly, your backups will take longer than usual, as it takes time to write the backup and/or restore history
  2. You may see timeout errors when you try to dig out the reason for a backup job failure
What I did was create a maintenance plan for the cleanup, which calls sp_delete_backuphistory to clean up the backup and restore history tables; per the client's request I configured the job to remove all data older than 60 days (a sketch of the call is shown below).
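The cleanup call itself is essentially this (a minimal sketch; the 60-day cutoff matches the client's request):

-- Delete backup and restore history older than 60 days from msdb
DECLARE @oldest_date datetime = DATEADD(DAY, -60, GETDATE());
EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest_date;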
To complete the cleanup job successfully, MSDB and tempdb needed some space to grow, which was not possible in our case since we were left with only 10 gigs of space. Hence, I added an additional log file and data file for tempdb and msdb on another drive where we had ample space, and scheduled a LOG backup for MSDB to run every 10 minutes.
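For example, adding an extra log file for msdb on another drive would look like this (the file name, size and the E:\ path are assumptions):

-- Add a second log file for msdb on a drive with free space
ALTER DATABASE msdb
ADD LOG FILE
(
    NAME = msdb_log2,
    FILENAME = N'E:\SQLData\msdb_log2.ldf',
    SIZE = 512MB,
    FILEGROWTH = 256MB
);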
And, then, I’ve invoked the CleanUp job – it took about 3 hours to finish, but it did what it should.
I took a FULL backup of MSDB, changed the recovery model to simple and shrank the database – we were able to shrink MSDB successfully and bring it down to 6.5 gigs.
Constraints and possible options:
  1. We had a limited maintenance window to accomplish a task
  2. Another maintenance activity has to be performed once we are done
  3. Option: We could have scripted out the foreign keys and other constraints, dropped them, and deleted the records directly. I did not opt for this method because I personally had never done it before.
Takeaway from this post: do a health check, and schedule a cleanup task for MSDB to run on a regular basis.

-- Hemantgiri Goswami (http://www.sql-server-citation.com/)



photo credit: mccun934 via photopin cc

06 September 2013

Not so good news for MCM Aspirants

A few days back, late on Friday (30th August 2013), MS made an announcement that they are cancelling the MCM certification – not so good news for MCM aspirants. This announcement has upset many in the SQL community, including me.

I had my own reasons and beliefs for not attempting any certification exam until a few months back. In my 14+ years of career, not a single employer of mine has asked me to go and get certified. I can say I am lucky, but all I know is that I have worked hard – a lot. I have tried to learn a lot, from friends, from colleagues, from online web forums, Books Online, books and MS evangelists, and I am still learning. And that pays me well; I participate in community activities, speak, blog and have been able to put my thoughts together to publish a book on SQL Server clustering – this is how I am surviving as a professional.

To be honest, during my career I have interviewed a great many people, and many of them were certified but did not have the exposure and practical troubleshooting skills that were expected; apart from this, there are dumps available that make it easy to clear an exam. Please note, I am neither criticizing those who have passed nor saying that all who haven't passed aren't good professionals; I am just trying to explain why I never attempted one. With this kind of experience it never interested me to pass a certification.

And then MCM was introduced. I have heard a lot about the Oracle exams, which include a lab test as well, and MCM is a similar kind of exam. Brent Ozar ( Blog | Twitter ) blogged about what the MCM is in 2010. I followed all his posts about MCM and developed an interest in attending the MCM training followed by the exam. Even though the exam and training fees are way too high for me, I had made up my mind to use the money my employer provides each of us for training, plus some from my own pocket, to get trained and earn this title – but, sadly, it is gone now. Since MCITP is a prerequisite for MCM, I made up my mind, attempted it and failed. However, I was sure that with more reading I would clear it on the next attempt. But I am not sure whether I will be taking the MCITP now.

I will be the happiest person if this decision about the MCM exam is reversed or some relevant exam at par with it is introduced. In our country, it is said that "whatever has happened has happened for the good, and whatever is going to happen in the future will be for the good". I am hoping that this saying will hold true.

In the end, I would like to say: if you are a believer in and aspirant of this exam just like me, please take some time and vote on the MS Connect site for reconsideration.

Thank you

Hemantgiri S. Goswami (http://www.sql-server-citation.com)

07 August 2013

SSRS – Download all RDL files from Report Server in one go.

Introduction

People working on SSRS are well aware that Report Manager does not support downloading all the report files (.rdl files) in one go out of the box, and I am sure that many of you have come across this requirement some day or other. So today I will share a simple, handy script which will help you download all the required report files at once.

Implementation

SSRS uses SQL Server as its backend to store its details, and the Catalog table is used to store the report files in binary form. The script below simply pulls the report definition from the Catalog table and uses the BCP utility to export it as an .rdl file to a predefined path.

To call the BCP utility from T-SQL, we need to execute the xp_cmdshell command; it is disabled by default. So, first you need to execute the script below to enable it:

-- Allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
GO

-- Update the currently configured value for advanced options.
RECONFIGURE
GO

-- Enable xp_cmdshell
EXEC sp_configure 'xp_cmdshell', 1
GO

-- Update the currently configured value for xp_cmdshell
RECONFIGURE
GO

-- Disallow further advanced options to be changed.
EXEC sp_configure 'show advanced options', 0
GO

-- Update the currently configured value for advanced options.
RECONFIGURE
GO




Once that has executed successfully, the script below (with the required changes) can be run to download the files:





--Replace NULL with keywords of the ReportManager's Report Path,
--if reports from any specific path are to be downloaded
DECLARE @FilterReportPath AS VARCHAR(500) = NULL

--Replace NULL with the keyword matching the Report File Name,
--if any specific reports are to be downloaded
DECLARE @FilterReportName AS VARCHAR(500) = NULL

--Replace this path with the Server Location where you want the
--reports to be downloaded..
DECLARE @OutputPath AS VARCHAR(500) = 'D:\Reports\Download\'

--Used to prepare the dynamic query
DECLARE @TSQL AS NVARCHAR(MAX)

--Reset the OutputPath separator.
SET @OutputPath = REPLACE(@OutputPath,'\','/')

--Simple validation of OutputPath; this can be changed as per ones need.
IF LTRIM(RTRIM(ISNULL(@OutputPath,''))) = ''
BEGIN
  SELECT 'Invalid Output Path'
END
ELSE
BEGIN
   --Prepare the query for download.
   /*
   Please note the following points -
   1. The BCP command could be modified as per ones need. E.g. Providing UserName/Password, etc.
   2. Please update the SSRS Report Database name. Currently, it is set to default - [ReportServer]
   3. The BCP does not create missing Directories. So, additional logic could be implemented to handle that.
   4. SSRS stores the XML items (Report RDL and Data Source definitions) using the UTF-8 encoding.
      It just so happens that UTF-8 Unicode strings do not NEED to have a BOM and in fact ideally would not have one.
      However, you will see some report items in your SSRS that begin with a specific sequence of bytes (0xEFBBBF).
      That sequence is the UTF-8 Byte Order Mark. Its character representation is the following three characters, “”.
      While it is supported, it can cause problems with the conversion to XML, so it is removed.
   */
   SET @TSQL = STUFF((SELECT
                      ';EXEC master..xp_cmdshell ''bcp " ' +
                      ' SELECT ' +
                      ' CONVERT(VARCHAR(MAX), ' +
                      '       CASE ' +
                      '         WHEN LEFT(C.Content,3) = 0xEFBBBF THEN STUFF(C.Content,1,3,'''''''') '+
                      '         ELSE C.Content '+
                      '       END) ' +
                      ' FROM ' +
                      ' [ReportServer].[dbo].[Catalog] CL ' +
                      ' CROSS APPLY (SELECT CONVERT(VARBINARY(MAX),CL.Content) Content) C ' +
                      ' WHERE ' +
                      ' CL.ItemID = ''''' + CONVERT(VARCHAR(MAX), CL.ItemID) + ''''' " queryout "' + @OutputPath + '' + CL.Name + '.rdl" ' + '-T -c -x'''
                    FROM
                      [ReportServer].[dbo].[Catalog] CL
                    WHERE
                      CL.[Type] = 2 --Report
                      AND '/' + CL.[Path] + '/' LIKE COALESCE('%/%' + @FilterReportPath + '%/%', '/' + CL.[Path] + '/')
                      AND CL.Name LIKE COALESCE('%' + @FilterReportName + '%', CL.Name)
                    FOR XML PATH('')), 1,1,'')

  --SELECT @TSQL

  --Execute the Dynamic Query
  EXEC SP_EXECUTESQL @TSQL
END




Conclusion



Hope this helps and saves a lot of your valuable time.



Happy Reporting!