Monday, December 8, 2014

Monitor Database Growth and Usage by Willem Gossink, 2014/12/04

http://www.sqlservercentral.com/articles/Monitoring/118079/

As a DBA, you probably manage hundreds of databases, if not thousands. Each of these databases has at least two devices, and each of these devices may grow (or shrink) through autogrowth, scripting, or manual intervention. Even if we do not consider the creation or deletion of databases or the addition of devices, you may be facing a lot of changes in the storage solution that underlies your database files. In most cases, these changes tend to gradually fill up your disks.

Build a repository

If you want to manage this process of database disk usage (in other words, capacity management), you will need historical data. You will need a way to record information about your database files and store that data in a repository. A process like that keeps you informed of the changes taking place and allows you to do some basic trending.
That is exactly what the PowerShell script outlined below does for you: it pulls information about the size and usage of database devices from your database servers and stores that information in a history table. You can then report on the data using tools such as an Excel PivotChart or your own favorite graphing solution, and monitor trends. Or you can zoom in on a single database to see the relation between the size of a data file and its usage. An example is shown below, where the red, block-shaped graph represents the growth of a database’s physical data file (.mdf) over the past year. The blue, spiked graph shows the increase in data inside the physical file over the same period.
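The article builds this repository with PowerShell against SQL Server; as a rough, language-agnostic sketch of the same idea (the table and column names here are hypothetical, not the article's schema), a size-history table plus a basic trend calculation might look like this:

```python
import sqlite3

# Minimal sketch of a size-history repository, standing in for the
# SQL Server table that the article's PowerShell script populates.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE file_history (
        collected_at TEXT,    -- snapshot date
        db_name      TEXT,    -- database name
        file_name    TEXT,    -- physical file (.mdf / .ldf)
        size_mb      REAL,    -- allocated size of the file
        used_mb      REAL     -- space actually used inside the file
    )""")

# Three snapshots of one data file, as a scheduled collector might record them.
snapshots = [
    ("2014-01-01", "SalesDB", "SalesDB.mdf", 1024.0, 700.0),
    ("2014-06-01", "SalesDB", "SalesDB.mdf", 1024.0, 910.0),
    ("2014-12-01", "SalesDB", "SalesDB.mdf", 2048.0, 1150.0),
]
conn.executemany("INSERT INTO file_history VALUES (?,?,?,?,?)", snapshots)

# Basic trending: compare the newest snapshot with the oldest one.
first, last = conn.execute("""
    SELECT MIN(collected_at), MAX(collected_at) FROM file_history
    WHERE file_name = 'SalesDB.mdf'""").fetchone()
growth = conn.execute("""
    SELECT (SELECT used_mb FROM file_history WHERE collected_at = ?)
         - (SELECT used_mb FROM file_history WHERE collected_at = ?)""",
    (last, first)).fetchone()[0]
print(f"Used space grew by {growth:.0f} MB between {first} and {last}")
```

With history rows like these, the "data inside the file" versus "file size" comparison shown in the article's graph is just two series selected from the same table.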

Thursday, October 9, 2014

Use SSIS to send emails by Joe Millay, 2014/10/09

Large-scale web sites typically send thousands of emails a day. Often, the code generating these emails is scattered across separate applications, resulting in non-standardized, difficult-to-maintain code. The solution in this article uses the power of SQL Server and SSIS to send emails.
This solution is used by Community Health Network, Indianapolis, IN, http://www.ecommunity.com, which supports eight medical campuses, 70+ physician practices, an online retailer for durable medical equipment (http://www.homehealthmedical.com), and multiple outlier facilities. The site sends thousands of emails a day: patient reminders and confirmations, online retail shopping order confirmations, administrator notifications, etc. 
The SSIS package in this article has been in place for 5 years and, 3,000,000 emails later, it is still doing its magic. Instead of having multiple lines of code in multiple locations, we now have standardized email functionality. The solution consists of three basic blocks:
  • PART 1: A table where the email information is stored
  • PART 2: An integration services package that sends the email
  • PART 3: A SQL Server Agent job that runs the package
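The three parts above form a simple queue pattern: applications INSERT a row, and a scheduled job drains the table. A minimal sketch of that pattern (using sqlite and a stubbed delivery function; the column names are hypothetical, not the article's tbl_SendEmail definition):

```python
import sqlite3

# PART 1 equivalent: a table where email information is stored.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE send_email (
        id        INTEGER PRIMARY KEY,
        recipient TEXT,
        subject   TEXT,
        body      TEXT,
        sent      INTEGER DEFAULT 0   -- 0 = queued, 1 = delivered
    )""")

# Any application queues a message with a plain INSERT.
conn.execute(
    "INSERT INTO send_email (recipient, subject, body) VALUES (?,?,?)",
    ("patient@example.com", "Appointment reminder", "See you Tuesday."))

# PART 2/3 equivalent: the scheduled job picks up unsent rows, hands each
# to the mail component (smtplib in a real script), then flags it as sent.
def drain_queue(conn, deliver):
    rows = conn.execute(
        "SELECT id, recipient, subject, body FROM send_email "
        "WHERE sent = 0").fetchall()
    for row_id, recipient, subject, body in rows:
        deliver(recipient, subject, body)   # send one email
        conn.execute("UPDATE send_email SET sent = 1 WHERE id = ?", (row_id,))
    return len(rows)

sent_count = drain_queue(conn, deliver=lambda *args: None)  # stub delivery
print(f"Delivered {sent_count} queued email(s)")
```

The design benefit is the one the article names: every application writes to one table in one format, so the sending logic lives in exactly one place.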

PART 1: The Table (tbl_SendEmail) and Its Insert Procedure (usp_ins_sendEmail) ...

read on here: http://www.sqlservercentral.com/articles/Integration+Services+(SSIS)/71485/

Tuesday, October 7, 2014

Solving Kerberos Issues in SSRS When Running Beside IIS by Ian Massi, 2014/10/07

http://www.sqlservercentral.com/articles/Reporting+Services+(SSRS)/116001/


Introduction

When relying on Windows authentication to give your users access to SQL Server resources from Reporting Services, Active Directory will use either NTLM or Kerberos to resolve access rights.  If Reporting Services isn’t running on the same server as the resources you’re trying to access, then Kerberos is your only option to allow Windows credentials to be passed through.  If Kerberos isn’t pleased with your configuration, it will visit upon you the dreaded double-hop authentication issue.  You can resolve this issue by configuring Active Directory to appease it.  However, when Reporting Services is running on a server that’s also running IIS (Internet Information Services), things can become a bit trickier.  In this harsh environment, appeasement of Kerberos is not enough for us.  We will tame it.

Environment

In our environment we were running SQL Server Reporting Services 2005 (SSRS 2005) on a separate server that already had IIS installed.  SSRS 2005 requires IIS to be able to handle the web side of things and we didn’t want IIS on our SQL Server.  We also have SQL Server Analysis Services (SSAS) in our environment.  There are reports that use SSAS as a data source, which always requires Windows authentication.  So we had the web site for SSRS set up to run in IIS under an application pool that used a domain account while the Windows Service Identity for SSRS ran under NT Authority\NetworkService.

read on ... (http://www.sqlservercentral.com/articles/Reporting+Services+(SSRS)/116001/)

Monday, June 16, 2014

Running SQL Server Databases in the Amazon Cloud (Part 1)

By: Sadequl Hussain


http://www.mssqltips.com/sqlservertip/3251/running-sql-server-databases-in-the-amazon-cloud-part-1/

Tuesday, June 10, 2014

Connect to SQL Servers in another domain using Windows Authentication by Aaron Bertrand





http://www.mssqltips.com/sqlservertip/3250/connect-to-sql-servers-in-another-domain-using-windows-authentication/


Problem
You may find that you need to connect remotely to a SQL Server that is in another domain. Since I work from home, I face this challenge all the time - almost every SQL Server I connect to is in another domain (and usually behind a VPN, a stingy firewall, and a stingier network team). This is a problem when SQL authentication is not an option, because my Windows machine is in its own domain; Management Studio cannot directly override the local Windows credentials and pass the credentials for a remote domain user.
The easy answer is to simply establish a remote desktop session to the machine in question, and use Management Studio and other tools locally, but there are several challenges with that. Many corporations have policies in place that prevent tools like SSMS from being installed on the server, and some even resort to Server Core at least in part as a means to enforce this policy. They may not have a dedicated "jump" box to support remote desktop connections to the SQL Server machine, or they may have such a box, but without all of the tools you require.
Solutions
Read on Aaron's post....

Tuesday, May 6, 2014

Thursday, March 13, 2014

SFTP, encrypt or compress data files in SSIS using custom components By Stan Kulp, 2014/03/13

http://www.sqlservercentral.com/articles/Integration+Services+(SSIS)/106616/

http://www.sqlservercentral.com/Forums/Topic1550512-2674-1.aspx?Update=1




The following article is a walkthrough of the CodePlex open source SSIS extensions for SFTP, PGP encryption and Zip archiving.

Resources

Download the appropriate version of the SSIS extensions and the SFTP mini-server.

1. SSIS Extensions - SFTP Task, PGP Task, Zip Task

2. Core FTP Mini SFTP Server




Install and configure the SSIS extensions

Run the "Installer SSIS Extensions SQL 2008 R2.msi" or "Installer SSIS Extensions SQL 2012.msi" file downloaded above.
After the extensions have been installed, start "Microsoft Visual Studio" and create a new SSIS package. Then right-click the "Toolbox" panel and select "Choose Items" from the popup menu.
Select the "SSIS Control Flow Items" tab from the "Choose Toolbox Items" panel, add checkmarks to the "PGP Task," "SFTP Task" and "Zip Task" checkboxes, then click the "OK" button.
The PGP, SFTP and Zip tasks have been added to the toolbox and are ready for use in SSIS packages.

Create SQL Server demo data

Open SQL Server Management Studio and create a database named TestDB. Download the attached "sp500hst.zip" resource file and extract the "sp500hst.txt" file it contains to a folder on your computer. Run the following SQL script to create and populate a table named "TestDB.dbo.sp500hst."
Note: Be sure you change the code to match your source folder.
/*
     Change 'C:\source_directory\' to the folder containing the 'sp500hst.txt'
     demo file.

     If you get a 'file not found' error, it probably means you are referencing
     a database on a remote server. To run the script you need to copy the file
     to a network folder and reference it with the fully-qualified UNC path, i.e.
     '\\ComputerName\SharedFolder\sp500hst.txt'.
*/

USE TestDB

IF OBJECT_ID('dbo.sp500hst', 'U') IS NOT NULL
    DROP TABLE [dbo].[sp500hst]
GO
CREATE TABLE [dbo].[sp500hst] (
    [Date] [VARCHAR](10) NULL,
    [Ticker] [VARCHAR](10) NULL,
    [Open] [VARCHAR](10) NULL,
    [High] [VARCHAR](10) NULL,
    [Low] [VARCHAR](10) NULL,
    [Close] [VARCHAR](10) NULL,
    [Volume] [VARCHAR](10) NULL
) ON [PRIMARY]
GO
BULK INSERT [dbo].[sp500hst] FROM 'C:\source_directory\sp500hst.txt' WITH (FIELDTERMINATOR = ',')
GO
SELECT * FROM [dbo].[sp500hst]
Select from the TestDB.dbo.sp500hst table to confirm that it contains the test data.

Configure demo directories

Create file folders named "c:\testfiles\" and "c:\sftp\".
Go to http://ianpurton.com/online-pgp/ to generate a PGP public key.
Paste the public key into a text file named "pgp_public_key.txt" and save it to the "c:\testfiles\" folder.
Alternatively, you can download the attached "pgp_public_key.txt" resource file and save it to the "c:\testfiles\" folder.
At this point the "c:\testfiles\" directory looks like this...
...and the "c:\sftp\" directory looks like this.

Configure and start the mini SFTP server

Double-click the "msftpsrvr.exe" file downloaded above, configure the server with user "username", password "password", port "22" and root path "c:\sftp\", then click the "Start" button.

Configure the SSIS package

Download the attached "SFTP_PGP_ZIP.dtsx" resource file and open it in "Microsoft Visual Studio" to display the task components and the configuration variables passed from the "Create data file" script task to the SFTP, PGP and Zip task components.
Double-click on the "TestDB" connection manager and configure it to point to the location of the test data created above.

Execute the SSIS package

Click the "Start Debugging" button to execute the SSIS package.
The SFTP mini server displays log messages when the transfers occur.
After package execution is complete...
...the "c:\testfiles\" directory looks like this...
...and the "c:\sftp\" directory looks like this.
This is what the "c:\testfiles\market.csv" file looks like.

Script task code

The script task queries the sp500hst table and writes to a file named "market.csv."
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Data.SqlClient

<System.AddIn.AddIn("ScriptMain", Version:="1.0", Publisher:="", Description:="")> _
<System.CLSCompliantAttribute(False)> _
Partial Public Class ScriptMain
    Inherits Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase

    Enum ScriptResults
        Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success
        Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
    End Enum

    Public Sub Main()

        ' Read the output file name and directory from package variables,
        ' then derive the paths used by the Zip, PGP and SFTP tasks.
        Dim file_name As String = Dts.Variables("file_name").Value.ToString
        Dim file_directory As String = Dts.Variables("file_directory").Value.ToString
        Dim file_path As String = file_directory & file_name
        Dim zip_target_root As String = file_name & ".zip"
        Dim zip_target As String = file_directory & zip_target_root
        Dim pgp_target_root As String = file_name & ".pgp"
        Dim pgp_target As String = file_path & ".pgp"
        Dim pgp_public_key As String = file_directory & "pgp_public_key.txt"

        ' Publish the derived paths back to package variables so the
        ' PGP, Zip and SFTP task components can read them.
        Dts.Variables("file_path").Value = file_path
        Dts.Variables("zip_target_root").Value = zip_target_root
        Dts.Variables("zip_target").Value = zip_target
        Dts.Variables("pgp_target_root").Value = pgp_target_root
        Dts.Variables("pgp_target").Value = pgp_target
        Dts.Variables("pgp_public_key").Value = pgp_public_key

        ' Remove any output file left over from a previous run.
        Dim FileToDelete As String = file_path
        If System.IO.File.Exists(FileToDelete) Then
            System.IO.File.Delete(FileToDelete)
        End If

        ' Query the demo table and stream each row to the CSV output file.
        Dim mySqlStatement As String = "SELECT [Date],[Ticker],[Open],[High],[Low],[Close],[Volume] FROM [dbo].[sp500hst]"
        Dim myADONETConnection As SqlClient.SqlConnection = DirectCast(Dts.Connections("TestDB").AcquireConnection(Dts.Transaction), SqlClient.SqlConnection)
        Dim myCommand As New SqlClient.SqlCommand(mySqlStatement, myADONETConnection)
        Dim reader As SqlDataReader = myCommand.ExecuteReader(CommandBehavior.CloseConnection)

        Dim file As System.IO.StreamWriter
        file = My.Computer.FileSystem.OpenTextFileWriter(file_path, True)
        Dim sLine As String

        Do While reader.Read()

            sLine = reader("Date").ToString
            sLine += "," & reader("Ticker").ToString
            sLine += "," & reader("Open").ToString
            sLine += "," & reader("High").ToString
            sLine += "," & reader("Low").ToString
            sLine += "," & reader("Close").ToString
            sLine += "," & reader("Volume").ToString

            file.WriteLine(sLine)

        Loop

        file.Close()
        reader.Close()
        myADONETConnection.Close()

        Dts.TaskResult = ScriptResults.Success

    End Sub

End Class
It also sets the values of the parameters used by the PGP, Zip and SFTP custom task components.

Suppress spurious warnings

When the "DelayValidation" property of one of the custom task components is set to "False," the task component will display a warning icon.
Set the "DelayValidation" property to "True" for all the custom task components to suppress these spurious warnings.

Configuration of SFTP, PGP and Zip task components

Double-click each task component to bring up its task editor. The configuration parameters for each component are selected from drop-down lists in its task editor panel. This is how the PGP task component is configured.
This is how the Zip task component is configured.
This is how the SFTP task component that transfers the encrypted file is configured.
This is how the SFTP task component that transfers the zipped file is configured.

Resources:

sp500hst.zip | pgp_public_key.txt | SFTP_PGP_ZIP.dtsx

Monday, January 13, 2014

Monitor SQL Server Disk Space Using PowerShell




http://www.mssqltips.com/sqlservertip/3109/monitor-sql-server-disk-space-using-powershell/


Problem
I work in an environment where getting new software approved is often quite difficult. We have a disk space monitor in place, but it doesn't provide the type of monitoring my team was looking for. Monitoring disk space is critical for DBAs. In addition, we use mount points, which are not supported by our existing disk space monitor.
Solution
My solution is PowerShell centric. Many of you may wonder why a DBA would use PowerShell. I tend to use the following guidelines to determine whether I will use PowerShell or TSQL to perform a task.
  • TSQL - If the primary function of the task is set-based (Select, Insert, Update, Delete), I use TSQL
  • PowerShell - If the task I am trying to perform involves a significant amount of flow control, or needs to access objects outside of SQL Server, I tend to use PowerShell
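The article's solution is PowerShell (using WMI, which can see mount points as well as drive letters). As an illustrative analogue only, not the article's script, the same "report free space for a list of paths" idea can be sketched in Python, where each path may be a drive root or a mounted volume folder:

```python
import os
import shutil

def disk_report(paths):
    """Return total/free space stats for each existing path."""
    report = {}
    for path in paths:
        if not os.path.exists(path):
            continue  # skip paths that don't exist on this machine
        usage = shutil.disk_usage(path)  # (total, used, free) in bytes
        report[path] = {
            "total_gb": round(usage.total / 1024**3, 1),
            "free_gb": round(usage.free / 1024**3, 1),
            "pct_free": round(100 * usage.free / usage.total, 1),
        }
    return report

# Report on the filesystem root; a real monitor would loop over the
# volumes and mount points it discovers on each server.
for path, stats in disk_report([os.sep]).items():
    print(path, stats)
```

Because the check works on any path rather than only drive letters, it handles the mount-point case that motivated the article.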

Friday, January 10, 2014

Fuzzy-String Search: Find misspelled information with T-SQL by Thomas Keller, 2014/01/10 (first published: 2012/09/18)


http://www.sqlservercentral.com/articles/Fuzzy+Match/92822/

How many times has a user told you that they entered a contact into your CRM system, and now they cannot locate it by name? Then, after you help them locate it by some other means, it turns out that either the original entry or the search term was misspelled? Would you like your search to be able to suggest alternates when a key word is misspelled?

When "fuzzy" string matching is most helpful

Whether it's misheard, mistyped, poorly scanned, or copied and pasted from a bad source, misspelled information is a common irritant to the user. If a user can't find the record they're looking for quickly, they will probably enter it again. If it's your job to merge the duplicates, then it's to your advantage to provide your users with a "fuzzy" search option. It may take a few seconds longer, so don't make it your default search. Just give your users the option, and they will learn to try it when their first search doesn't find what they're looking for. If they start entering a new record anyway, you can also launch a background process right after the user fills in the key field(s), which can alert them to potential duplicates before they are ready to hit the save button.
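The usual engine behind such a fuzzy search is an edit-distance function; the article implements one in T-SQL. As a sketch of the core idea in Python, here is the classic Levenshtein distance: the minimum number of single-character inserts, deletes, and substitutions needed to turn one string into another. A fuzzy search keeps candidates whose distance to the search term falls under a small threshold.

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits turning a into b."""
    # prev[j] holds the distance between a[:i-1] and b[:j] (previous row).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]  # distance from a[:i] to the empty string
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete ca
                            curr[j - 1] + 1,      # insert cb
                            prev[j - 1] + cost))  # substitute ca -> cb
        prev = curr
    return prev[-1]

# A misspelled contact name is still within a small distance of the
# correctly spelled record, so it can be offered as a suggestion.
print(levenshtein("Jonsen", "Johnson"))
```

In a real search you would compute this distance between the user's term and each candidate key value, returning rows whose distance is, say, 2 or less; that is the "suggest alternates" behavior described above, traded against the extra seconds it costs.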