Conditional Logic in PostDeployment.sql Script Using SQLCMD


UPDATE

I've now discovered that the if/else syntax above doesn't work for me, because some of my linked scripts require a GO statement. Essentially, :r just imports the scripts inline, so this becomes invalid syntax.

If you need a GO statement in the linked scripts (as I do), then there isn't any easy way around this. I ended up creating several post deployment scripts and then changing my project to overwrite the main post deployment script at build time depending on the build configuration. This is now doing what I need, but it seems like there should be an easier way!

For anyone needing the same thing, I found this post useful.

So in my project I have the following post deployment files:

  • Script.PostDeployment.sql (empty file which will be replaced)
  • Default.Script.PostDeployment.sql (links to scripts needed for standard data config)
  • Configuration1.Script.PostDeployment.sql (links to scripts needed for a specific data config)

I then added the following to the end of the project file (right-click the project to unload it, then right-click again to edit it):

  <Target Name="BeforeBuild">
    <Message Text="Copy files task running for configuration: $(Configuration)" Importance="high" />
    <Copy Condition=" '$(Configuration)' == 'Release' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
    <Copy Condition=" '$(Configuration)' == 'Debug' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
    <Copy Condition=" '$(Configuration)' == 'Configuration1' " SourceFiles="Scripts\Post-Deployment\Configuration1.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
  </Target>

Finally, you will need to set up matching build configurations in the solution.

For anyone trying other workarounds, I also tried the following without any luck:

  1. Creating a post-build event to copy the files, instead of having to hack the project file XML. I couldn't get this to work because I couldn't form the correct path to the post deployment script file. This Connect issue describes the problem.

  2. Using variables for the script path to pass to the :r command, but I came across several errors with this approach.

How can I conditionally include large scripts in my SSDT post deployment script?

I ended up using a mixture of our build tool (Jenkins) and SSDT to accomplish this. This is what I did:

  1. Added a build step to each environment-specific Jenkins job that writes to a text file. I either write a SQLCMD command that includes the import file or else I leave it blank depending on the build parameters the user chooses.
  2. Include the new text file in the Post Deployment script via :r.

That's it! I also use this same approach to choose which pre- and post-deploy scripts to include in the project based on the application version, except that I grab the version number from the code and write it to the file using a pre-build event in Visual Studio instead of in the build tool. (I also added the text file name to .gitignore so it doesn't get committed.)
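A minimal sketch of the approach above, with hypothetical file and script names (EnvironmentScripts.sql, Configuration1.DataSetup.sql); the build job writes either a single include line or nothing into the generated file, and the post deployment script pulls it in unconditionally:

```sql
-- Contents the Jenkins step writes into the generated file
-- (hypothetical name: EnvironmentScripts.sql) for a "Configuration1" build;
-- for other builds the file is left empty.
:r .\Configuration1.DataSetup.sql

-- In Script.PostDeployment.sql, always include the generated file:
:r .\EnvironmentScripts.sql
```

Because SQLCMD happily imports an empty file, the post deployment script itself never needs to change between environments.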

If condition in SQLCMD script in sql database project

There is a way to do this, but that info must first be prefaced with:

Don't (Danger, Will Robinson!) do this!!

Your object definitions really need to be the same across all environments. Meta-data (as in config data) can be different, but not object definitions. Else you are setting yourself (not to mention the next unlucky person to take this over) up for failure. It is unmaintainable. There are differences in performance, functionality, and security between using a Linked Server and not using one. In your current setup, any QA done against this code is invalid as it isn't even close to testing the same thing in each environment.

BUT, it is understandable that there are environment-specific differences that need to be dealt with. In which case, you should have a different Linked Server definition in each environment. Don't need the Linked Server in one or more environments? That's fine, just create a "loop back" Linked Server that points at the Instance that it is currently residing in.
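As a rough sketch of the "loop back" idea (the server name LoopBack and the use of localhost are assumptions, not from the original answer), you can register a Linked Server that points back at the local instance so the same four-part names resolve in every environment:

```sql
-- Register a Linked Server that points back at the local instance
-- (name and data source are hypothetical; adjust per environment).
EXEC master.dbo.sp_addlinkedserver
    @server = N'LoopBack',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'localhost';  -- or pass the value of @@SERVERNAME via a variable
```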


That said, you just need to bury definitions for objects requiring their own batch in an EXEC, such as the following:

IF ('$(Environment)' = 'Production')
BEGIN
    EXEC(N'ALTER VIEW [dbo].[vwTable]
           AS
           SELECT Col1, Col2
           FROM Production.database.dbo.Table1;');
END;
ELSE
BEGIN
    EXEC(N'ALTER VIEW [dbo].[vwTable]
           AS
           SELECT Col2, Col3, Col4
           FROM LinkedServer.Development.database.dbo.Table2;');
END;

Dealing with SQLCMD :r in a post-deployment script

Well, it turns out that simply specifying the relative path to the script file is enough. I was misled by Visual Studio's Validate SQL Syntax feature, which evaluates the path after :r using devenv.exe's location as the starting directory.

However, when Deploy is executed on the project, the :r command uses the script's directory as the starting directory, so the file's location can be given easily using a relative path.
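For example (file and folder names are hypothetical), with the included scripts sitting next to or below the post-deployment script, a plain relative path is all the :r command needs at deploy time:

```sql
-- Script.PostDeployment.sql; :r resolves these paths relative to
-- this script's own directory when Deploy runs, not relative to devenv.exe.
:r .\PopulateLookupTables.sql
:r .\SeedData\ReferenceData.sql
```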

How to pass a variable to SQL deploy on Azure DevOps?

Here is the complete solution, which I didn't find anywhere else:

  1. Declare a SQLCMD variable for the dacpac. Inside Visual Studio, open the project properties page, go to the SQLCMD Variables tab, and add the env variable.


  2. In the SQL script, write the variable inside a string so it won't break the build with a SQL syntax error. It is not a SQL variable; the string '$(env)' is replaced as a token before the script runs:

    if ('$(env)' = 'DEV') ...

    You can test it in Visual Studio by publishing the project.

  3. Finally, go to the deploy task definition and set up the variable like this:

    - task: SqlDacpacDeploymentOnMachineGroup@0
      displayName: Install database
      inputs:
        TaskType: 'dacpac'
        DacpacFile: '**/app-db.dacpac'
        TargetMethod: 'server'
        ServerName: '(localdb)\MSSQLLocalDB'
        DatabaseName: 'app-dev'
        AuthScheme: 'windowsAuthentication'
        AdditionalArguments: /v:env=DEV

SQLCMD include all scripts in folder

I got it working:

<ItemGroup>
  <MyFilesPath Include="$(ProjectDir)MyPath\*.sql" />
  <MyFiles Include="@(MyFilesPath->':r %22..\Scripts\%(filename)%(extension)%22%0D%0A', '')" />
</ItemGroup>

Then I include @(MyFiles) in my post deployment script file.
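If the transform above works as intended, @(MyFiles) expands to one quoted :r line per file found (the %22 and %0D%0A escapes are a double quote and a CRLF), so the post deployment script ends up containing something like this (file names hypothetical):

```sql
-- Result of expanding @(MyFiles) for two files in MyPath:
:r "..\Scripts\Script1.sql"
:r "..\Scripts\Script2.sql"
```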


