Delete old SQL backups using PowerShell in one go!

November 23, 2013 Leave a comment

Issue:

Sometimes the maintenance plans do not delete old backup files (based on their age), even though that is exactly what we set them up to do.

The cleanup maintenance task (subtask) may also skip files, depending on their creation date or age.

Manually deleting old .BAK (or any other backup) files gets tedious after a while.

Solution:

We can use PowerShell (PS) to achieve the same result in a few lines.

Setup: I am backing up the databases into a test folder, but the PS script can do the same job against any storage/filer server you use to store backups. Just make sure you pass the correct path.

Use the script below to back up all databases to a specific path:

USE master
GO

DECLARE @int varchar(15)
DECLARE @rand varchar(15)
DECLARE @path varchar(200)
DECLARE @cur sysname

-- database_id 2 is tempdb, which cannot be backed up
DECLARE curr CURSOR FOR
SELECT name FROM sys.databases WHERE database_id <> 2

OPEN curr
FETCH NEXT FROM curr INTO @cur

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @cur

    -- build a file name like DBNAME_20131123_123.BAK
    SET @int  = convert(varchar(20), getdate(), 112)
    SET @rand = substring(convert(varchar(10), rand()*1000), 1, 3)
    SET @path = 'D:\POWERSHELL\TEST\New Folder\' + @cur + '_' + @int + '_' + @rand + '.BAK'

    BACKUP DATABASE @cur TO DISK = @path

    FETCH NEXT FROM curr INTO @cur
END

CLOSE curr
DEALLOCATE curr
GO

The script above sends full backups of every database to D:\POWERSHELL\TEST\New Folder\ (choose any path that works for you).

Once the backups are in place (or once we know which backups are due for deletion based on their age), start PowerShell:

Start -> Run -> POWERSHELL.EXE (works out of the box on versions above Windows XP / Server 2003; for those two you need to download and install it first).

You do not have to change your current path to the folder you want to delete files from (for example, an external storage location accessed like \\DATADOMAINSERVER01\SQLBACKUP\DAILY_Backups\). You can run the PowerShell script from whatever your current path is.

In my case I am switching to the folder that holds the backups, because I want to show a few options.

The -Recurse option of DIR (an alias of Get-ChildItem) traverses all subfolders under the parent folder, so with it you can delete old files in an external storage path such as \\DATADOMAINSERVER01\SQLBACKUP\DAILY_Backups\.

Running DIR -Recurse against D:\POWERSHELL\TEST\ shows that it traverses the folder and a subfolder named New Folder (we excluded the .BAK files to keep the output small).

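For reference, a sketch of that listing command (using the demo path from above; excluding the .BAK files keeps the output small):

```powershell
# list everything under the test folder and its subfolders,
# excluding .BAK files so the output stays short
Get-ChildItem "D:\POWERSHELL\TEST" -Recurse -Exclude *.BAK
```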

In the screenshot demo I used AddHours to find backups created a few hours back, but in practice you will be looking for much older backups, so you will work with days via AddDays instead of hours.

Use AddDays(-30) to get a count of files and folders whose creation date is older than 30 days:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | measure


NOW: we only want to work with .BAK files, and we may want to exclude some files from the delete command, so we will use -Include *.BAK (or *.SAFE) and -Exclude (to exclude any file type, for example *.PS1).

If you use SQL Safe for backups, or any other backup utility, use the extension it produces.

Use the command below. Basically, we are using -lt (less-than comparison) to filter out files older than 30 days (in the screenshot example it is older than 2 hours).

The command below returns only the first 2 files of type *.BAK:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | select -First 2


Use the command below to get an exact count of the *.BAK files you will be deleting:

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | measure


If you are counting files older than 30 days on a large file server, the command above will take some time to complete.

---------- command to delete files older than a certain number of days ----------

********** take care of the points below **********

1> Mention the correct path you want to delete backup files from (subfolders are taken into account too).

2> Mention -Include *.BAK to make sure only backup files are deleted; also exclude any application or config file that might possibly live on that path.

3> Make sure to mention the correct age, i.e. the number of days: AddDays(-30) means you want to delete files a month old, AddDays(-60) two months old, etc.

**************************************************

get-childitem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 | Where-Object {$_.CreationTime -lt (Get-Date).AddDays(-30)} | remove-item -force

**************************************************


 

The command above deletes the backup files of type *.BAK (excluding files of type *.PS1) older than 30 days, under the path D:\POWERSHELL\TEST\New Folder and any subfolders it may have.

Try out this command and let me know your thoughts.
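Before running the delete for real, a cautious sketch is to append -WhatIf, which prints what would be removed without deleting anything (same demo path and filters as above):

```powershell
# dry run: -WhatIf shows which files *would* be removed, without deleting them
Get-ChildItem "D:\POWERSHELL\TEST\New Folder" -Recurse -Include *.BAK -Exclude *.PS1 |
    Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -WhatIf
```

Once the dry-run output looks right, swap -WhatIf for -Force to actually delete.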

ENJOY 🙂

 

Msg 3140, Level 16, State 5, Line 1 Could not adjust the space allocation for file 'DBfilename'

October 1, 2013 4 comments

We had a database with a few tables using image/text (LOB) data types, and after archiving some older data we wanted to reclaim the free space in the database.

A normal shrink command could not reclaim any free space, so I decided to go with the NOTRUNCATE and TRUNCATEONLY options, but the second step was failing with the error "Could not adjust the space allocation for file".

SCRIPT:

go
print 'start time'
print getdate()
go
DBCC SHRINKFILE (1, NOTRUNCATE)
go
print 'completed first step'
print getdate()
go
DBCC SHRINKFILE (1, TRUNCATEONLY)
print 'completed 2nd step'
print getdate()

Output:

(1 row(s) affected)
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
completed first step
Oct 1 2013 2:17AM
Msg 3140, Level 16, State 5, Line 1
Could not adjust the space allocation for file 'DBfilename'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
completed 2nd step
Oct 1 2013 2:17AM

Solution:

After the above error I tried to shrink the file with DBCC SHRINKFILE (1, targetsize), which had initially failed to release any free space. This time I was able to shrink the database to the required size. So the solution to an error such as this is simply to shrink the file again with the T-SQL DBCC SHRINKFILE (1, targetsize), or use the GUI to shrink the database.

It should complete very fast, since all the page movement has already been done by the DBCC SHRINKFILE (1, NOTRUNCATE) step.
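For reference, a minimal sketch of that final shrink (the file id 1 and the 1024 MB target are illustrative; adjust both to your database):

```sql
-- shrink file id 1 down to a 1024 MB target size (pick your own target)
DBCC SHRINKFILE (1, 1024);
```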

Use the script below to confirm the reduction in the overall database files:

select
    a.fileid,
    [FILE_SIZE_MB]  = convert(decimal(12,2), round(a.size/128.000, 2)),
    [SPACE_USED_MB] = convert(decimal(12,2), round(fileproperty(a.name,'SpaceUsed')/128.000, 2)),
    [FREE_SPACE_MB] = convert(decimal(12,2), round((a.size - fileproperty(a.name,'SpaceUsed'))/128.000, 2)),
    NAME     = left(a.name, 15),
    FILENAME = left(a.filename, 30)
from dbo.sysfiles a

Refer to the link below for the internals of the shrink operation:

http://blogs.msdn.com/b/psssql/archive/2008/03/28/how-it-works-sql-server-2005-dbcc-shrink-may-take-longer-than-sql-server-2000.aspx

ENJOY 🙂

Could not find database ID 2, name 'tempdb'. The database may be offline

September 14, 2013 Leave a comment


One of my SQL Servers had been logging errors about tempdb being unavailable during startup, and after a few seconds everything seemed to be working fine. I assumed it was related to the drive holding tempdb going offline, or not being accessible while SQL Server was coming online. Below is the error message:

ERROR:

2013-09-10 06:36:22.850 spid55 Could not find database ID 2, name 'tempdb'. The database may be offline. Wait a few minutes and try again.
2013-09-10 06:36:22.980 spid9s Starting up database 'tempdb'.

After some time SQL Server would recover normally, without any further errors or similar messages.

Solution:

I figured out that this is a known issue that occurs in SQL 2000 and 2005 (it could be a possibility in higher versions too).

Here is what causes it: applications like BizTalk or a web farm keep trying to connect to SQL Server every second while it is starting up, and they try to create objects in tempdb from the very start. As tempdb is being recreated and is not yet accessible, the messages above get logged in the SQL error log (which can lead DBAs to look for the reason from the drive perspective).

I used trace flag 3614 to cause SSNETLIB.dll (the SQL Server network library) to be loaded only after successful recovery of the system databases, and hence the messages above are no longer logged in the SQL error log.

refer : http://support.microsoft.com/kb/834798
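As a hedged sketch, one way to try the trace flag for a single restart (assuming a default instance named MSSQLSERVER; for a permanent change you would normally add -T3614 to the startup parameters via SQL Server Configuration Manager instead):

```powershell
# restart the default instance with the trace flag, for this start only
# (assumption: default instance; named instances use a different service name)
net stop MSSQLSERVER
net start MSSQLSERVER /T3614
```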

 

 

 

Load test your SQL Server CPU

August 24, 2013 Leave a comment

Hey there,

I found 2 great CPU load-testing tools that you can use to stress your SQL Server's CPU. Here is what I got from them (download from the link below):

http://www.primatelabs.com/geekbench/

GEEKBENCH 3 can be downloaded from the link above; below are some of the tests it runs against the CPU and memory. The results are very detailed and easy to consume.


Once you start the benchmark test, you will see different loads being put on the CPU and memory.


You will observe CPU and memory usage going through the roof, and may wish to figure out other ways to do the same.

The results are informative about which tests ran and what numbers our server resources have hit.


The second great tool we will discuss is available for download from the link below:

http://www.cpuid.com/softwares/cpu-z/versions-history.html

Once you have downloaded and installed it, you will get a screen similar to the one below.


The different tabs on that screen cover every aspect of the processor you would be interested in finding out about.

So enjoy, play around with these free tools, and spread the word.

Thanks for reading. 🙂

Bonus video: Understanding CPU Specifications, by DalePoston.

Great whitepapers on contention in SQL Server

August 24, 2013 1 comment

 

Hey, check out 2 great whitepapers on contention in SQL Server:

Diagnosing and Resolving Latch Contention on SQL Server

http://sqlcat.com/sqlcat/b/whitepapers/archive/2011/07/05/diagnosing-and-resolving-latch-contention-on-sql-server.aspx

Diagnosing and Resolving Spinlock Contention on SQL Server

http://sqlcat.com/sqlcat/b/whitepapers/archive/2011/07/05/diagnosing-and-resolving-spinlock-contention-on-sql-server.aspx

Enjoy 🙂

 

SQL SERVER – Difference Between Union vs. Union All – Optimal Performance Comparison

August 10, 2013 Leave a comment

Difference Between Union vs. Union All
Great Post by Pinal Dave ..
Enjoy 🙂

Categories: INDEXES

Getting started with execution plans – part 1

August 4, 2013 Leave a comment

Hi there,

Today we will see ways to retrieve the estimated execution plan through the different options available.

I am using the SQL 2008 R2 AdventureWorks database, which is available at the link below:

http://msftdbprodsamples.codeplex.com/releases/view/59211

(for other downloadable database options use THIS)

An execution plan can be either an estimated or an actual execution plan.

Estimated execution plan: this is what the optimizer thinks will be the best possible way to implement a particular query, i.e. which operators to use and in what sequence to access the tables, so the query completes in the fastest possible time, having the lowest cost.

COST -> CPU cost + IO cost

Actual execution plan: this is how the storage engine actually executed the query it got from the relational engine. The relational engine passes the estimated execution plan to the storage engine, which uses it as a reference to execute the query submitted by the user. The actual execution plan may differ from it because statistics are out of date, execution of some part of the query caused a recompilation, or the storage engine chose to execute the query in parallel mode.

Estimated execution plans are stored in the plan cache, from where they can be reused.

Different ways to retrieve execution plans:

A. When we retrieve the estimated execution plan, the query is never actually executed; only the estimated plan is produced by the optimizer.

Ways to get the estimated execution plan:

1> SET SHOWPLAN_ALL ON (gives a text-based output of the estimated execution plan, with quite a lot of information)

SET SHOWPLAN_ALL OFF (to switch it off)
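A minimal sketch of the round trip (note that each SET statement must go in its own batch, and the sample query is only illustrative; any SELECT against an AdventureWorks table works):

```sql
SET SHOWPLAN_ALL ON;
GO
-- this query is NOT executed; only its estimated plan rows come back
SELECT TOP (10) * FROM Person.Person;
GO
SET SHOWPLAN_ALL OFF;
GO
```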


2> SET SHOWPLAN_TEXT ON (gives a text-based estimated execution plan with less information compared to SHOWPLAN_ALL, but useful for tools like osql.exe which will readily consume the output)

SET SHOWPLAN_TEXT OFF


3> SET SHOWPLAN_XML ON (to get the XML-based execution plan, which is much more detailed)

SET SHOWPLAN_XML OFF

Once you click the ShowPlanXML link in the results tab, you will see the graphical plan, with more tooltips.


4> The fourth option to get the estimated execution plan is SSMS-based: click the button in the GUI, go to SSMS -> Query -> Display Estimated Execution Plan, or press Ctrl + L.


In the next post we will see ways to retrieve the actual execution plan; in some cases the actual plan may differ from the estimated execution plan.

Thanks for reading ……. ENJOY 🙂

Multi-threading is for computers, but humans are single-threaded and always will be

August 2, 2013 Leave a comment

Check out a great post by Thomas LaRock about why you should focus on a single task at a time to really get it done, and why we should switch off social technology to focus on our most critical projects.

"To do two things at once is to do neither." – Publilius Syrus

Read the whole post at the link below:

http://thomaslarock.com/2013/08/whats-the-biggest-mistake-you-make-each-day/

ENJOY:)

Using PowerShell to select from tables in a database

July 28, 2013 1 comment

I was looking for ways to connect to a SQL instance. Sarabpreet has covered quite a few interesting ways to do the same; you may check his article over here (a big thanks to Sarab for sharing it).

Below is what I tried on Windows 7 (SQL 2008 R2).

Once we are within the shell, it treats all SQL objects as files and folders, much like what we have while working with CMD.

We can use different ways to connect to SQL Server via the PowerShell tools:

1> Type powershell.exe in the Start -> Run prompt and hit Enter; this will bring up the PowerShell prompt.

2> Type SQLPS.EXE in the Start -> Run prompt and hit Enter.

3> Or right-click on any node in SSMS and click Start PowerShell; depending on where you right-clicked, you will be taken to the appropriate depth or hierarchy within the SQL instance.


For this example we will use option 3. Refer to the snapshot on the right: if you right-click on any object within the SQL instance, we are taken into the corresponding hierarchy level within SQL.

Once we are within the database, we use the DIR command to look for the list of folders and files within it (comparing the database structure to the OS file system).

Note: after a certain level, the command will not return any results or output.

Below is an example of the command run at database level; we can see all the objects that are available in SSMS for a database.


We will need to use a SQLPS cmdlet named Invoke-Sqlcmd to get any meaningful data out of SQL Server, once OS-style commands like dir (which works the same as ls and Get-ChildItem) hit their limits.

1> get-help invoke-sqlcmd -Full

2> get-help invoke-sqlcmd -Examples

3> get-help invoke-sqlcmd


Invoke-Sqlcmd has multiple parameters which help us define the query and connection options.

So once we are at the correct location (use cd or cd .. to move from one path to another), we select data from a table as below (try it on any table with some data in it):

invoke-sqlcmd "select top 10 * from dbo.T2 where id > 1000"


Below is an example of using Invoke-Sqlcmd to run another command to populate data, so you can run any T-SQL command in the SQLPS shell that you can run in SSMS.


So what is the use of all this, if we can run these commands in SSMS? Well, the power of PowerShell really comes across when we have a list of servers to manage and want to automate some admin tasks across all of them. Going forward I will share more examples on PowerShell.
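As a sketch of that multi-server idea (the server names here are hypothetical placeholders, and it assumes the SQLPS/SqlServer module is loaded so Invoke-Sqlcmd is available):

```powershell
# hypothetical list of instances; replace with your own servers
$servers = @("SQLNODE01", "SQLNODE02", "SQLNODE03")

foreach ($srv in $servers) {
    # run the same admin query against every instance in the list
    Invoke-Sqlcmd -ServerInstance $srv -Database master `
        -Query "SELECT @@SERVERNAME AS server_name, COUNT(*) AS db_count FROM sys.databases"
}
```

Swap the query for any T-SQL admin task, and one loop covers the whole estate.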

Enjoy:)

Check out a huge collection of books on Microsoft technology, all FREE!

July 23, 2013 Leave a comment