
Hope I don't upset anybody by asking too simple a question!

I have a requirement to export data from a SQL Server 2012 table to a CSV file. This needs to happen either every hour or, ideally if it is possible, whenever a record is created, updated, or deleted. The table contains a list of all the Sites we maintain. I need to export this CSV file to a particular location, as a third-party database has an API which monitors this location and imports CSV files from there.

The data to be extracted from SQL is:

Mxmservsite.siteid as Marker_ID, mxmservsite.name as Name, 'SITE' as Group, '3' as Status, 
'' as Notes, mxmservsite.zipcode as Post_Code, 'GB' as Country, '' as Latitude, 
'' as Longitude, '' as Delete
Where dataareaid='ansa'

Anyone have any clues how I can go about doing this? Sorry, I am a newbie with SQL and still learning the basics! I have searched for similar questions in the past but haven't found anything. I know there is a utility called BCP, but I'm not sure whether that would be the best way, and if it would be, how do I use it to run every hour, or whenever there is a record update/delete/insert?

Cheers

Naz
  • You can use SSIS, or write custom code and run it in a job – Lukasz Szozda Aug 24 '15 at 16:02
  • " if it is possible, whenever a new record is created or an existing record is updated/deleted." => write custom code, add trigger on table which you want to track – Lukasz Szozda Aug 24 '15 at 16:13
  • Everybody seems to think triggers are a bad idea? This is quite a low-activity table; maybe 3 or 4 record changes per day happen in it. Would that make it suitable for triggers? – Naz Aug 24 '15 at 16:22
  • As always, it depends. You can create a stored procedure that exports the data, then create a job that executes the stored procedure, and start that job from a trigger (see the sketch below). You can even track how many changes occur and do nothing if it's below a threshold. – Lukasz Szozda Aug 24 '15 at 16:27
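To illustrate the pattern Lukasz describes, here's a minimal T-SQL sketch, assuming a SQL Agent job named ExportSitesToCsv (a hypothetical name) that contains the actual export step. sp_start_job runs asynchronously, so the export doesn't hold up the user's transaction:

-- Minimal sketch: kick off a (hypothetical) SQL Agent job on any data change.
-- sp_start_job is asynchronous, and it raises an error if the job is already
-- running, so production code would wrap the call in TRY/CATCH.
CREATE TRIGGER trg_mxmservsite_export
ON dbo.mxmservsite
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    EXEC msdb.dbo.sp_start_job @job_name = N'ExportSitesToCsv';
END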

3 Answers


Here's some PowerShell that will do what you're after; just schedule it using the Windows Task Scheduler:

function Execute-SQLQuery {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true)]
        [string]$DbInstance
        ,
        [Parameter(Mandatory = $true)]
        [string]$DbCatalog
        ,
        [Parameter(Mandatory = $true)]
        [string]$Query
        ,
        [Parameter(Mandatory = $false)]
        [int]$CommandTimeoutSeconds = 30 #this is the SQL default
    )
    begin {
        write-verbose "Call to 'Execute-SQLQuery': BEGIN"
        $connectionString = ("Server={0};Database={1};Integrated Security=True;" -f $DbInstance,$DbCatalog)
        $connection = New-Object System.Data.SqlClient.SqlConnection
        $connection.ConnectionString = $connectionString
        $connection.Open()    
    }
    process {
        write-verbose "`n`n`n-----------------------------------------"
        write-verbose "Call to 'Execute-SQLQuery': PROCESS"
        write-verbose $query 
        write-verbose "-----------------------------------------`n`n`n"
        $command = $connection.CreateCommand()
        $command.CommandTimeout = $CommandTimeoutSeconds
        $command.CommandText = $query
        $result = $command.ExecuteReader()
        $table = new-object 'System.Data.DataTable'
        $table.Load($result)
        Write-Output $table
    }
    end {
        write-verbose "Call to 'Execute-SQLQuery': END"
        $connection.Close()
    }
}

Execute-SQLQuery -DbInstance 'myServer\InstanceName' -DbCatalog 'myDatabase' -Query @"
select Mxmservsite.siteid as Marker_ID
 , mxmservsite.name as Name
 , 'SITE' as [Group] --brackets needed: GROUP is a reserved word
 , '3' as Status
 , '' as Notes
 , mxmservsite.zipcode as Post_Code
 , 'GB' as Country
 , '' as Latitude
 , '' as Longitude
 , '' as [Delete] --brackets needed: DELETE is a reserved word
 From mxmservsite --this wasn't in your original code
 Where dataareaid='ansa'
 "@ | Export-CSV '.\MyOutputFile.csv' -NoType 

To have something triggered on any change is possible; i.e. you could create a trigger on the table, then use xp_cmdshell to execute a script or similar. That's likely to lead to performance problems, though (triggers are often a bad option if used without being fully understood), and xp_cmdshell opens you up to some security risks.

There are many other ways to achieve this; currently I have a thing for PowerShell as it gives you loads of flexibility with little overhead.

Another option may be to look into using linked servers to allow your source database to directly update the target without need for CSV.
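For illustration, a linked-server push might look something like this; the linked server, target database, and table names here are all hypothetical:

-- Hypothetical: push rows straight to the third party's database over a
-- linked server, skipping the CSV hand-off entirely.
INSERT INTO [ThirdPartyServer].[TargetDb].dbo.Sites (Marker_ID, Name, Post_Code, Country)
SELECT siteid, name, zipcode, 'GB'
FROM dbo.mxmservsite
WHERE dataareaid = 'ansa';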

JohnLBevan
  • This is great. What would you do if you were forced to use SQL Authentication, i.e. how would you avoid a hardcoded user/pass? Also, if you are running this on a server (which you would, so the machine stays running), doesn't enabling PowerShell for the service account also open up security risks? – Davos Mar 15 '17 at 03:35
  • You can add parameters to the start of the script so those arguments can be passed in at runtime. You can store credentials in an encrypted file: http://stackoverflow.com/a/6240319/361842. You should restrict permissions on the folder holding the script (and the encrypted password file, though that's less important) so that only the account running the script can amend it; that avoids someone adding code to output the password once it's decrypted. The password is (by default) encrypted/decrypted per user, so only those knowing that account's credentials would be able to decrypt it. – JohnLBevan Mar 15 '17 at 09:42

Another option: create a SQL Agent job that runs a bcp.exe command to do the export for you, at any interval you want (e.g. every hour). With bcp.exe, you can specify your file location, column/row terminators, and the filtering query.
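As a rough sketch of such a job, created via T-SQL (the job name, database, output path, and server are placeholders; the query loosely mirrors the one in the question):

USE msdb;
-- Create the job with a single CmdExec step that shells out to bcp.exe.
EXEC dbo.sp_add_job @job_name = N'ExportSitesToCsv';
EXEC dbo.sp_add_jobstep
    @job_name = N'ExportSitesToCsv',
    @step_name = N'bcp export',
    @subsystem = N'CmdExec',
    @command = N'bcp "SELECT siteid, name, zipcode FROM myDatabase.dbo.mxmservsite WHERE dataareaid = ''ansa''" queryout "\\aServerName\SharedDirectory\Sites.csv" -c -t, -T -S MyServer\InstanceName';
-- Schedule it daily, recurring every hour.
EXEC dbo.sp_add_jobschedule
    @job_name = N'ExportSitesToCsv',
    @name = N'Hourly',
    @freq_type = 4,             -- daily
    @freq_interval = 1,
    @freq_subday_type = 8,      -- units of hours
    @freq_subday_interval = 1;  -- every 1 hour
-- Target the local server so the job actually runs.
EXEC dbo.sp_add_jobserver @job_name = N'ExportSitesToCsv';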

If you want to export on every change, you can add an AFTER trigger as mentioned above and simply exec the SQL Agent job, which will run asynchronously. If you are concerned about performance, test it out to understand the impact.

If you like @John's PowerShell script, stick it in a SQL Agent job and schedule it, if only to keep all your SQL tasks centralized.

Greg

You'll need to specify the name of the server you are currently on. You can't target a drive letter or an administrative share like D$; you need a UNC path to a shared folder. The following works in SQL Server 2012.

-- Declare report variables
DECLARE @REPORT_DIR VARCHAR(4000)
DECLARE @REPORT_FILE VARCHAR(100)
DECLARE @DATETIME_STAMP VARCHAR(14)
DECLARE @Statement VARCHAR(4000)
DECLARE @Command VARCHAR(4000)

--SET variables for the Report File
SET @DATETIME_STAMP = (SELECT CONVERT(VARCHAR(10), GETDATE(), 112) + REPLACE(CONVERT(VARCHAR(8), GETDATE(),108),':','')) -- Date Time Stamp with YYYYMMDDHHMMSS
SET @REPORT_DIR = '\\aServerName\SharedDirectory\' -- Setting where to send the report. The Server name and a Shared name, not a drive letter
SET @REPORT_FILE = @REPORT_DIR + 'Tables_' + @DATETIME_STAMP + '.csv' -- the -t"," switch in the bcp command below is what makes this a comma-separated file

--Create the CSV file report with all of the data. The @Statement variable must be used to use variables in the xp_cmdshell command.
SET @Statement = '"SELECT * FROM sys.tables" queryout "'+@REPORT_FILE+'" -c -t"," -r"\n" -S"CurrentServerName\InstanceName" -T' -- -S names the server\instance; -T uses a trusted (Windows) connection
SET @Command = 'bcp '+@Statement+' '
EXEC master..xp_cmdshell @Command 
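Note that xp_cmdshell is disabled by default, so if you go this route you (or your DBA) would first need to enable it, bearing in mind the security caveats mentioned above:

-- xp_cmdshell is off by default; enabling it widens the attack surface,
-- so only do this with your DBA's blessing.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;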
Swerdna